
ARCore Depth API: How it will fundamentally transform your AR experiences

At first, this seems like no big deal, but when you learn what it does, you'll see why it's pretty exciting.
Published on December 9, 2019


Today, Google is taking the wraps off its new ARCore Depth API. At first glance, this sounds highly technical and uninteresting. However, when you understand what it does, you’ll see how this will fundamentally change your augmented reality experiences.

You’ll also see how it will open up tons of new possibilities for AR in the worlds of productivity, shopping, and even gaming.

So what is the ARCore Depth API? Here’s Google’s official explanation:

Depth API allows developers to use our depth-from-motion algorithms to create a depth map using a single RGB camera. The depth map is created by taking multiple images from different angles and comparing them as you move your phone to estimate the distance to every pixel.
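If you like the math, the core idea resembles textbook stereo triangulation: the farther a point appears to shift between two views taken a known distance apart, the closer it must be. Here's a toy sketch of that formula, meant purely for intuition; it's a stand-in, not Google's actual depth-from-motion algorithm, which also has to estimate the phone's own motion and clean up noisy matches:

```kotlin
// Toy stand-in for depth-from-motion: classic two-view triangulation.
// Google's algorithm is far more involved (it has to estimate the camera's
// own motion and filter noisy matches), but the core geometry is the same.
fun depthFromDisparity(
    focalLengthPixels: Float, // camera focal length, in pixels
    baselineMeters: Float,    // how far the camera moved between the two views
    disparityPixels: Float    // how far the point shifted between the two images
): Float {
    require(disparityPixels > 0f) { "The point must shift between views" }
    // Similar triangles: depth = focal length * baseline / disparity.
    return focalLengthPixels * baselineMeters / disparityPixels
}
```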

Confused? It’s way easier to explain what it is by showing you what it does. Check out the GIFs below: on the left, you have an AR experience without the Depth API and, on the right, that same experience with it.

The ARCore Depth API allows the AR program to understand that the fluffy pillows in the room above are closer to you than the placement of the AR cat. Previously, ARCore wasn’t very good at determining this and would place the cat right on top of the pillows, creating a wholly unrealistic scene. With Depth API active, though, the cat’s body is behind the pillows and only the parts you would see in a real-world situation are visible.
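Google hasn't published the rendering code behind that effect, but the underlying per-pixel test is simple: only draw the virtual object where it's closer to you than the real world. Here's a minimal, hypothetical sketch (none of these names come from an announced API):

```kotlin
// Hypothetical sketch: none of these names come from an announced API.
// depthMillimeters is a real-world depth map, one 16-bit sample per pixel,
// stored row-major as width * height values.
fun isVirtualPixelVisible(
    depthMillimeters: ShortArray,
    width: Int,
    x: Int,
    y: Int,
    virtualDepthMillimeters: Int
): Boolean {
    // Read this pixel's depth as an unsigned 16-bit value, in millimeters.
    val realDepth = depthMillimeters[y * width + x].toInt() and 0xFFFF
    // Draw the virtual pixel only if nothing real sits in front of it.
    return virtualDepthMillimeters <= realDepth
}
```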

Google explains how this works in its blog post announcing the Depth API. It's pretty technical, and you can read the post to learn all about it, but the image below gives you a solid idea. The Depth API uses your camera movements to determine which objects in your view are closer or farther away, and then creates a depth map:

ARCore Depth API Depth Map

In the GIF, once the depth map is created, objects that are closer to you appear in red while objects that are far away appear in blue.
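That kind of visualization takes only a few lines: normalize each depth value, then blend from red (near) to blue (far). A hypothetical sketch, with illustrative near and far ranges:

```kotlin
// Map a depth reading to a red-to-blue color like the visualization above:
// near pixels come out red, far pixels blue. The near/far range is illustrative.
fun depthToColor(depthMeters: Float, nearMeters: Float = 0.5f, farMeters: Float = 8f): Int {
    // 0.0 at nearMeters, 1.0 at farMeters, clamped in between.
    val t = ((depthMeters - nearMeters) / (farMeters - nearMeters)).coerceIn(0f, 1f)
    val red = ((1f - t) * 255).toInt()
    val blue = (t * 255).toInt()
    // Pack as opaque ARGB with no green channel.
    return (0xFF shl 24) or (red shl 16) or blue
}
```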

With the ARCore Depth API, AR apps will be much more realistic. In AR-powered shopping apps, for example, you can place virtual household items in your living room or on your counter to get a sense of what they'd look like. This new feature will make those previews even more realistic, giving you more confidence in your purchase.

For gaming, a better sense of depth will allow you to do things such as hiding behind obstacles, accurately aiming projectiles, and getting a surprise when characters come out from behind structures. In the GIF at the top of this article, you can see an example of how this could work.

Related: Ten best augmented reality apps and AR apps for Android

The Depth API doesn't depend on specialized cameras or sensors, so it should work on pretty much any device that supports ARCore. However, devices with dedicated depth hardware, such as time-of-flight (ToF) sensors, will likely deliver a better, more accurate experience.
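For the developers reading along: in the ARCore SDK, opting into depth is a one-flag configuration change. The sketch below uses the Config.DepthMode setting the SDK exposes for this; since the API was brand new at the time of this announcement, treat the details as subject to change:

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Session

// Sketch of opting an ARCore session into depth. AUTOMATIC lets ARCore
// produce depth from motion, and from a ToF sensor where the hardware
// has one; unsupported devices simply keep the default configuration.
fun enableDepthIfSupported(session: Session) {
    val config = Config(session)
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}
```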

Google is hoping that developers will be excited to try out this new feature and integrate it into their AR-powered applications. It shouldn’t be too long before you start seeing better depth experiences in your current AR apps.
