
Meta's breakthrough could make manual VR room scanning obsolete

There's no word on whether Meta plans to bring the tech to the Quest 3.
Published on March 22, 2024

[Image: SceneScript (Meta)]
TL;DR
  • Meta has introduced a new way to digitally map a room, called SceneScript.
  • SceneScript is able to automatically identify objects and room features like walls, couches, and tables with the help of AI.
  • The resulting data can be used by LLMs to answer questions about the room.

Meta has debuted a new method for scanning a room called SceneScript. This new approach could make room scanning faster and easier on Quest headsets, and it could have other useful applications, too.

Before you use a VR headset, it’s recommended that you scan the room so that boundaries can be set up. This way you don’t have to worry about bumping into objects while your vision is obscured. Room scanning is also used for AR to accurately anchor and align 3D content to the indoor space you’re in. However, the way room scanning currently works can be a bit complex and user-unfriendly.

In a detailed Threads post, Meta Research Project Manager Edward Miller introduced SceneScript, which aims to solve this problem. According to Miller, the tech uses AI to automatically identify objects and room features such as walls, couches, tables, ceilings, and more. So instead of having to manually map out these things, the AI does it for you as you look around.

[Image: SceneScript 2 (Meta)]

Apple’s Vision Pro is also able to automatically map a room. However, the headset isn’t able to identify and label objects like SceneScript can.

In addition to doing the room mapping work for you, Miller says the data can be used with chatbots like Llama 2 to pose questions about the area. For example, the user could ask “How many pots of paint would it take to paint this room?” Llama 2 would presumably be able to answer these questions if you choose to share the room data with the LLM.
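
To make the idea concrete, here is a minimal, hypothetical Python sketch of how structured scene data could be packaged into a prompt for a chatbot. The scene format and the build_prompt helper are assumptions for illustration only; neither Meta's actual output format nor its API is described in the announcement.

    # Illustrative only: a tiny sketch of how structured room data from a
    # SceneScript-style scan might be turned into context for an LLM prompt.
    # The scene structure and build_prompt helper are assumptions for this
    # example, not Meta's actual output format or API.

    scene = [
        {"type": "wall",  "width_m": 4.0, "height_m": 2.5},
        {"type": "wall",  "width_m": 3.0, "height_m": 2.5},
        {"type": "couch", "width_m": 2.1, "depth_m": 0.9},
        {"type": "table", "width_m": 1.2, "depth_m": 0.8},
    ]

    def build_prompt(scene, question):
        # Serialize each detected object into a plain-text line the model can read.
        lines = []
        for obj in scene:
            attrs = ", ".join(f"{k}={v}" for k, v in obj.items() if k != "type")
            lines.append(f"- {obj['type']}: {attrs}")
        return (
            "Here is a description of a room:\n"
            + "\n".join(lines)
            + f"\n\nQuestion: {question}"
        )

    prompt = build_prompt(scene, "How many pots of paint would it take to paint this room?")
    print(prompt)  # This text would then be sent to a chatbot such as Llama 2.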

In a blog post, Meta says SceneScript could “unlock key use cases for both MR headsets and future AR glasses,” like providing step-by-step navigation for the visually impaired. The company also states the technology could “unlock the potential of next-generation digital assistants, providing them with the physical-world context necessary to answer complex spatial queries.”

Meta doesn’t say whether it will bring the technology to the Meta Quest 3 or the upcoming Meta Quest 3 Lite, but it would be a big upgrade for the headset’s Guardian feature.

