So Google is at it again: the company just dropped the second Developer Preview of the Android XR SDK, bringing new features and a round of improvements. What's in the mix? Wider support for immersive video, better adaptive UI layouts, and hand tracking in ARCore for Jetpack XR.
Announced at Google I/O, the release rolls out a long list of updates to the Android XR SDK, which Google says gives developers more standardized tools to either build dedicated XR apps or bring existing Android apps to headsets. Maybe this is the future, maybe it's just another phase; time will tell.
Now, let's talk video. The update adds support for 180° and 360° stereoscopic video playback using MV-HEVC, which sounds like a secret code but is in fact a widely adopted codec for high-quality 3D immersive video.
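Playback itself can presumably go through the usual Media3 path; here's a minimal sketch, assuming the headset's (or emulator's) platform decoder handles the MV-HEVC stream. The URI and function name are placeholders, not part of the SDK, and wiring the output to an XR surface is a separate step.

```kotlin
import android.content.Context
import androidx.media3.common.MediaItem
import androidx.media3.exoplayer.ExoPlayer

// Minimal sketch: hand an MV-HEVC asset to Media3's ExoPlayer and let the platform
// decoder deal with the stereoscopic layers. The URI is a placeholder; actual
// MV-HEVC playback depends on the device/emulator decoder and the surface you
// render the video onto.
fun playImmersiveVideo(context: Context): ExoPlayer {
    val player = ExoPlayer.Builder(context).build()
    player.setMediaItem(MediaItem.fromUri("https://example.com/immersive_180.mp4"))
    player.prepare()
    player.playWhenReady = true
    return player
}
```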
Jetpack Compose for XR is in the mix too: you can now build adaptive UI layouts across XR displays using tools like SubspaceModifier and SpatialExternalSurface. The broader aim is for Jetpack Compose to standardize UI design across phones, tablets, and immersive headsets.
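As a rough idea of what that looks like in practice, here's a minimal sketch based on the preview's Subspace / SpatialPanel composables and the SubspaceModifier chain; exact package paths and modifier names may shift between preview releases, and AppContent is just a stand-in for your existing Compose UI.

```kotlin
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
// Package names below reflect the developer preview and may change between releases.
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

// Sketch: host existing 2D Compose content inside a movable, resizable spatial panel.
@Composable
fun XrHome() {
    Subspace {
        SpatialPanel(
            SubspaceModifier
                .width(1280.dp)
                .height(800.dp)
                .resizable()
                .movable()
        ) {
            AppContent() // the same UI you'd show on a phone or tablet
        }
    }
}

@Composable
fun AppContent() { /* existing phone/tablet UI reused on the headset */ }
```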
Now for the fun part: hand tracking in ARCore for Jetpack XR. Each hand is tracked with 26 posed joints for gesture-based interactions, and developers can check out updated samples and benchmarks that show how to integrate it into apps. Handy stuff, quite literally.
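To give a feel for what gesture code over those joints looks like, here's a minimal, SDK-agnostic sketch of a pinch check. The per-frame joint poses come from the ARCore for Jetpack XR API, which isn't reproduced here, so the plain FloatArray positions are a stand-in for whatever pose type the SDK hands you.

```kotlin
import kotlin.math.sqrt

// Illustrative pinch detector, independent of the SDK's math types: feed it the
// world-space positions of the thumb-tip and index-tip joints (two of the 26 posed
// joints reported per hand) and it flags a pinch when the fingertips are closer
// than a small threshold.
fun isPinching(
    thumbTip: FloatArray,   // [x, y, z] in meters
    indexTip: FloatArray,   // [x, y, z] in meters
    thresholdMeters: Float = 0.02f,
): Boolean {
    val dx = thumbTip[0] - indexTip[0]
    val dy = thumbTip[1] - indexTip[1]
    val dz = thumbTip[2] - indexTip[2]
    return sqrt(dx * dx + dy * dy + dz * dz) < thresholdMeters
}
```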
Material Design for XR has also had a glow-up. Google claims it will help "large-screen enabled apps" fit into XR like a glove; whether that holds up in practice or is mostly marketing, we'll see.
Now, here's where it gets frustrating: most developers don't yet have access to official Android XR hardware. Samsung's Project Moohan headset and XREAL's Project Aura AR glasses are both expected later this year, so until then Google's Android XR Emulator is pretty much a must-have.
Here's a nice twist: Developer Preview 2's improved Android XR Emulator adds AMD GPU support, better stability, and tighter Android Studio integration, which should make XR app testing and day-to-day workflows smoother. In other words, your job just got a little less of a headache.
Unity is joining the club too, with Pre-Release version 2 of its Unity OpenXR support, which brings performance boosts such as Dynamic Refresh Rate and SpaceWarp support through Shader Graph. Its new Mixed Reality template also adds realistic hand mesh occlusion and persistent anchors. Translation: they're upping the game.
Android XR Samples for Unity are out as well, covering hand tracking, plane tracking, face tracking, and passthrough: a solid toolkit for developers looking to build out their Android XR apps. Can't say they're not thorough.
Even if Android XR didn't steal the Google I/O spotlight this year, Google isn't snoozing. It's bringing Android XR to more partner devices and prepping Android XR smart glasses with eyewear brands Warby Parker and Gentle Monster: glasses that might give off Ray-Ban vibes or even include onboard displays for basics like reading or navigation. Sounds kinda fancy, huh?
Want to dive deeper? The full details on the Android XR Developer Preview, including the tools and updates covered here, are available in Google's online developer documentation.