Google’s Depth API makes ARCore applications more realistic

Google's Depth API enhances AR applications.
Image: YouTube | Google Developers

Google is leading the way in augmented reality (AR) development with its ARCore platform, which lets developers create and integrate AR experiences like never before. Now the company is taking things a step further.

The company recently announced a feature that lets any ARCore-enabled smartphone sense depth, even those with only one camera. This allows for much more realistic AR rendering and gives developers a new tool to enhance the user experience.

Bridging the Reality Gap

Previously released AR integrations from Google are already very impressive. For one, the addition of AR in Google Search has allowed users to invite wild animals into their living rooms. The ARCore feature overlays a life-sized rendering of animals like tigers, sharks, and penguins atop a live feed from the device’s camera.

With the addition of depth sensing, the illusion has only gotten more realistic. A set of images from Google displays the effect in action with an AR cat hiding halfway behind a couch.

GIF: Google

Compared with the first image, where the cat appears to overlap the couch, the second image is more realistic. At a quick glance, it could easily be mistaken for a real photo.

A company blog post details exactly how the technology works. It says, “The ARCore Depth API allows developers to use our depth-from-motion algorithms to create a depth map using a single RGB camera. The depth map is created by taking multiple images from different angles and comparing them as you move your phone to estimate the distance to every pixel.”
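For developers, enabling this on a supported device is a small configuration step. The Kotlin sketch below shows roughly how an ARCore session might be configured for depth and how a per-frame depth map could be acquired. It is a hedged illustration based on the publicly documented ARCore API, not code from Google's announcement; the acquire method is named acquireDepthImage() in older ARCore releases.

```kotlin
import android.media.Image
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

// Enable depth if the device supports it; on single-camera phones the map
// comes from the depth-from-motion approach described in Google's post.
fun enableDepth(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}

// Grab the latest depth map for a frame. Each pixel holds a distance
// estimate in millimetres; null means no map is ready yet. The caller
// must close() the returned Image when finished with it.
fun latestDepthImage(frame: Frame): Image? =
    try {
        frame.acquireDepthImage16Bits() // acquireDepthImage() on older ARCore releases
    } catch (e: NotYetAvailableException) {
        null // depth needs a few frames of camera motion before it is available
    }
```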

Notably, the feature will be available on all Android smartphones that are equipped to handle Google’s ARCore. Of course, that includes flagship models from top makers like Samsung and OnePlus. It also includes Google’s in-house Pixel line and many mid-range models from other manufacturers. The number of devices capable of using the new Depth API easily surpasses 200. Users can toggle the feature, labeled “occlusion,” on and off as they wish.
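The occlusion effect itself boils down to a per-pixel comparison: if the depth map says a real surface is closer to the camera than the virtual object, the virtual pixel is hidden. The sketch below is purely illustrative; the function and parameter names are hypothetical rather than part of the ARCore API, and production apps would typically do this comparison in a fragment shader.

```kotlin
// Hypothetical per-pixel occlusion check, the idea behind the user-facing
// "occlusion" toggle: hide a virtual pixel whenever the real surface in the
// depth map is closer to the camera than the virtual object at that pixel.
fun isVirtualPixelOccluded(
    realDepthMm: Int,         // depth-map sample at this pixel, in millimetres
    virtualDepthMm: Int,      // rendered depth of the virtual object at this pixel
    occlusionEnabled: Boolean // the user toggle described above
): Boolean =
    // a real-depth value of 0 is treated as "no estimate", so the pixel stays visible
    occlusionEnabled && realDepthMm in 1 until virtualDepthMm
```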

Houzz Integration

Though making animals in your home appear more realistic is cool, there are more practical uses planned for the new depth-sensing capability. One of these is an integration with the Houzz app.

Anyone can use Houzz to visualize what their home would look like with new furniture or a new layout. The Depth API gives the app the power to render objects more realistically, providing a better picture of what a room will look like with that new end table.

GIF: Google

Surface Interaction Potential

Games like “Pokémon Go” have already proven that AR and video games are a perfect match. Google’s depth addition should integrate nicely with future mobile games thanks to its ability to understand real-world physics and surface interactions.

Developers can use the API to create AR worlds that respect the angles and curvature of the real one. Google demoed a game that allows users to step into a food fight and hide behind real-world objects for shelter. Food splatters realistically on top of various objects in the room at believable 3D depths.
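In practice, placing something like a splatter on a real surface comes down to a depth-aware hit test. The Kotlin sketch below is an illustrative guess at how that could look with ARCore's hit-test API; hits backed by DepthPoint trackables arrived in later ARCore releases once depth mode is enabled, and the helper function name here is hypothetical.

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.DepthPoint
import com.google.ar.core.Frame
import com.google.ar.core.Plane

// Sketch of placing a "splatter" where a thrown object lands: hit-test the
// screen point and anchor the effect to whatever real surface is hit.
// With depth enabled, ARCore can return hits backed by DepthPoint trackables,
// so splatters can stick to arbitrary geometry, not just detected flat planes.
fun anchorSplatterAt(frame: Frame, screenX: Float, screenY: Float): Anchor? {
    val hit = frame.hitTest(screenX, screenY).firstOrNull { result ->
        result.trackable is DepthPoint || result.trackable is Plane
    }
    return hit?.createAnchor() // render the splatter effect at this anchor's pose
}
```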

Because the feature works on devices with only a single camera, which describes most phones in use today, it should catch on quickly. If the lightning-fast adoption of Google’s AR Search feature is any indication, the Depth API will be a hit with users and developers alike.
