
Android XR apps will have camera permissions similar to Android phones

Google introduced Android XR, its upcoming extended reality (XR) operating system for virtual reality and mixed reality headsets, in December last year. Soon after, Samsung introduced its first Android XR headset prototype, called Project Moohan. Android XR will finally launch commercially in 2025 on a headset manufactured by Samsung and a pair of smart glasses developed by Google’s DeepMind subsidiary. Google hasn’t yet revealed many details about the platform. However, a developer has revealed that Android XR apps will have camera permissions near-identical to those on Android phones.

Android XR app developers can request permission to access the headset’s cameras

Android developer Antony Vitillo, aka Skarred Ghost, asked Google whether developers would get permission to access the cameras of Android XR headsets. A Google representative reportedly responded that, “similar to any Android app, a developer can use existing camera frames with user permission for XR”.

Furthermore, Android XR apps can request access to the “main world-facing camera system” and the “main selfie-camera system” that points at your face. According to the source, these map to what Google calls the “rear” and “front” cameras on an Android smartphone. Notably, Meta’s Quest 3 and Apple’s Vision Pro still don’t allow third-party app developers to access the cameras directly.
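On phones, camera access is gated by a manifest declaration plus a runtime grant from the user. If Android XR mirrors that model, as Google’s statement suggests, an XR app’s manifest would carry the same declaration. A minimal sketch, assuming Android XR reuses the standard Android permission (nothing XR-specific has been confirmed):

```xml
<!-- AndroidManifest.xml: declare the camera permission up front.
     On Android 6.0+ the user must still grant it at runtime, e.g. via
     ActivityCompat.requestPermissions(). -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
    <uses-permission android:name="android.permission.CAMERA" />
    <!-- Optional: declare the camera feature so app stores can filter devices. -->
    <uses-feature android:name="android.hardware.camera.any" android:required="false" />
</manifest>
```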

Android XR will give devs access to your living room feed

As per the source, Google will give developers access to your living-room feed through the Android XR headset’s cameras. Headsets like Samsung’s Project Moohan will be able to analyze your surroundings and adjust mixed-reality games and applications using that context, letting developers optimize their apps to provide the best mixed-reality experience.

According to the Android Developers website, devs can request access to “Scene Understanding”. This unlocks use cases such as “light estimation”, projecting the camera passthrough feed onto mesh surfaces, and performing ray casts against trackables in the environment.
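Google hasn’t published the Android XR ray-cast API itself, but ARCore’s existing phone-side API already exposes the same concepts (frames, trackables, hit results). A hedged sketch of what a ray cast against trackables looks like there, purely as an analogy for what Android XR may offer:

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.Plane

// Sketch using ARCore's existing phone API as an analogy; the actual
// Android XR API surface has not been published.
fun placeObjectAtTap(frame: Frame, tapXPx: Float, tapYPx: Float) {
    // hitTest casts a ray from the camera through the screen point and
    // returns every trackable (planes, feature points) the ray intersects.
    for (hit in frame.hitTest(tapXPx, tapYPx)) {
        val trackable = hit.trackable
        // Only accept hits that land inside a detected plane's polygon.
        if (trackable is Plane && trackable.isPoseInPolygon(hit.hitPose)) {
            val anchor = hit.createAnchor() // attach virtual content here
            // ... hand the anchor to your renderer ...
            break
        }
    }
}
```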

Devs can also request advanced tracking through the rear cameras of Android XR headsets for “hand joint poses and angular and linear velocities”. This creates a “mesh representation of the user’s hands”, which sounds like it’ll boost immersion and make hand-tracked VR games more enjoyable.

That said, like other VR and XR headsets, Android XR-based devices will offer “basic” hand tracking by default, covering gestures like “pinching, poking, aiming, and gripping”. Google will reveal more details about the upcoming XR platform in the coming weeks.