
Meta Orion AR Glasses Developer Hands-On: Viewing The Future


Alex Coulombe leads the NYC-based XR creative studio Agile Lens: Immersive Design. On Wednesday, after Meta’s keynote address, UploadVR’s Ian Hamilton was invited by Coulombe to observe his scheduled Orion demo.

While we would have preferred to bring you first-hand impressions directly from UploadVR, Meta tells us it may do future demos in New York. Until that happens, we’ve asked Coulombe to write for us, from his expert perspective, about his time in the glasses.


September 25 was a good day at Meta Connect.

Harkening back to the idyllic days of Oculus Connect, I had hallway run-ins with countless brilliant XR people, some of whom I’ve known for many years and others I’d only met in the metaverse before today. The keynote was thrilling, packed with live, real-time demos, including a stumble that Mark Zuckerberg played off with humility. Mark Rabkin showed a whole slide about Unreal Engine and announced camera access. Andrew Bosworth apologized for the fractured dev ecosystem and promised to make it much easier for us all to make great stuff for HorizonOS in the future.

There was also mention of a 10-year AR glasses project called Orion. The moment the keynote was over, I was whisked into a room with a large QR code on a table, a bunch of kitchen ingredients, and my very own Orion glasses to try for myself.

How did I get here?

A few short weeks ago, I was offered a late invite to Meta Connect and didn’t know if it would make sense to go. I already had my travel plans figured out for Unreal Fest Seattle, where my company was giving several talks, followed by a Star Wars convention in Orlando, where we’d be exhibiting our open source recreation of the Galactic Starcruiser. Did I really want to make room for Meta Connect as a regular attendee? I aired my indecision on social media to solicit the public’s thoughts on whether this trip could possibly be worth the added logistical hoops.

Andrew Bosworth, Meta’s CTO, chimed in: 

“We will have cake, I think.”

I like cake. And cake, it turned out, meant an Orion glasses demo. The demo I experienced was worth the trip. Full stop. 

I came out of Orion buzzing with the same level of excitement I had after trying the Tuscany demo on my Oculus Rift DK1 back in 2013. And just like with DK1, I wasn’t only feeling the excitement of the direct experience I’d just had; I was feeling the potential of a new type of product that’s only in its earliest days. As a developer, I can’t wait to build for this platform. As a consumer, I can’t wait to see what everyone else builds. Also: this felt like both echoes of every XR device I’d ever tried and something wholly and wonderfully new, where the whole was indeed greater than the sum of its parts.

A couple of weeks before receiving my Connect invite, I was ruminating on the increasing number of people I knew who simply would never use a headset that wasn’t extremely light, comfortable, environmentally responsive, graphically impressive, and wireless. The current Quest roadmap seemed hopeless for them.

I imagined a future where the device they wanted existed, offloading much of its compute, not unlike Air Link, over a wireless connection to a headless PC that ‘just worked’ at a level my grandmother could use. No SteamVR, no Oculus runtime, no complicated driver updates. A lot of people chimed in to say what I imagined was a pipe dream. And yet, here we are less than a month later, and I’ve now tried the first version of the device I was imagining. One that just may have the power to convert the never-wearers. We’re coming for you, Scott Galloway.

As I’ve explored the market over the last decade, I’ve experienced AR headsets, hand tracking, eye tracking, gesture detection, haptic armbands, wireless XR streaming, spatial persistence, conversational AI, video calls with avatars, and multiplayer games. What I hadn’t experienced before today was all of that rolled into a glasses form factor that was comfortable, lightweight, and cool to the touch, with a killer field of view, low latency, and a high framerate.

Below are some notes from my roughly 30-minute demo, in chronological order. Thomas Van Bouwel, creator of the upcoming game Laser Dance, took video of the entire experience on my phone, though of course you can’t see what I see. Meta’s Joshua To guided me through my future “day in the life” of wearing these glasses for both casual and more serious use.

I tried on several pairs of Orion glasses. I’m not sure how it was determined which pair I should use, but I believe there was more to it than simply how well each fit. The eye tracking setup was familiar: I looked at dots. I had to do this a couple of times because Meta said I needed to hold my head very still.

The wristband was a little, but not much, tighter than I would wear my watch. It wasn’t uncomfortable, and I was absolutely delighted the moment I realized it vibrated anytime a gesture was recognized. The band recognized a hand gesture much like flipping a coin, and it felt really good.

The wristband left a lasting impression. Photo by Ian Hamilton of Alex’s arm a few minutes after the demo.

I never actually touched the little wireless puck that handles most of the compute and streams it wirelessly to Orion. I assumed I’d be putting it in my pocket, but no, it stayed out in the middle of the room for most of my experience and I never interacted with it. So it maintains quite the ‘Air Link’ connection. I wonder if it got hot, which would have made it uncomfortable to keep in a pocket. I was told its battery, as well as that of the wristband, could last all day, while the glasses themselves were closer to two hours.

Once I got to the home menu of Orion I immediately thought of Magic Leap and HoloLens and XREAL and Tilt Five rather than Meta Quest or even the Apple Vision Pro. Why? Because we’re not seeing through cameras, aka “passthrough”. This is the real world, seen clearly with zero lag or distortion. And the digital content had luminosity to it and an excellent field of view that feathers out as it reaches its limits.

Like Apple Vision Pro, the primary modality of interaction is to look where you want as a kind of mouse cursor and then pinch gesture to select, as if you’re clicking a mouse. Unlike Apple Vision Pro, because of the wristband, I did this behind my back. That felt like a superpower.
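For fellow developers, the pattern will be familiar from Vision Pro: gaze supplies the cursor, a pinch supplies the click. Here is a minimal Unity-style C# sketch of that loop; the eyeGaze transform and WristbandInput are hypothetical stand-ins for whatever the real runtime exposes, not Orion’s actual API.

```csharp
using UnityEngine;

// Hypothetical stand-in for the band's gesture events (not a real Orion API).
public static class WristbandInput
{
    public static bool PinchBegan() => false; // wire up a real input source here
}

// A sketch of the gaze + pinch interaction model described above.
public class GazePinchSelector : MonoBehaviour
{
    [SerializeField] private Transform eyeGaze; // assumed tracked gaze pose

    private void Update()
    {
        // Gaze acts as the mouse cursor: raycast along where the eyes point.
        if (Physics.Raycast(eyeGaze.position, eyeGaze.forward, out RaycastHit hit))
        {
            // The pinch acts as the click. Because the band reads muscle
            // signals rather than camera images, the hand can be anywhere,
            // including behind your back.
            if (WristbandInput.PinchBegan())
            {
                hit.transform.SendMessage("OnSelect", SendMessageOptions.DontRequireReceiver);
            }
        }
    }
}
```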

The starting menu reminded me of the Apple Vision Pro home menu – a collection of simply arranged icons – but like a 1980s remaster of them with fewer colors and pixels. Like most things that remind me of the ‘80s, I found it charming.

I couldn’t resize any window, but I could move windows around. A window would occasionally end up facing down, an apparent glitch, and I was surprised there wasn’t something forcing the windows to point toward me. (I’ve written LookAt functions in C# and Blueprints so many times I could do them in my sleep; a sketch of what I mean is below.)
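This is a minimal Unity C# sketch of that kind of LookAt behavior, assuming a head transform as a stand-in for the viewer’s tracked head pose; it illustrates the fix I have in mind, not anything from Orion itself.

```csharp
using UnityEngine;

// Minimal billboard sketch: keep a floating window upright and facing the
// viewer. "head" is an assumed stand-in for the tracked head transform.
public class FaceViewer : MonoBehaviour
{
    [SerializeField] private Transform head;

    private void LateUpdate()
    {
        // Yaw-only: flatten the vertical component so the panel never
        // pitches over and ends up facing the floor.
        Vector3 away = transform.position - head.position;
        away.y = 0f;
        if (away.sqrMagnitude > 1e-4f)
        {
            // World-space UI in Unity faces along +Z, so pointing +Z away
            // from the viewer keeps the panel's front toward them.
            transform.rotation = Quaternion.LookRotation(away.normalized, Vector3.up);
        }
    }
}
```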

When I added a new window I liked how it seamlessly slotted in between the windows I had open and pushed them to the side.

I dictated a chat to “Chris Bacon” over Messenger, then he called me. The IRL video looked somewhat low resolution but had good framerate and he was moving the camera a lot.

I watched a Matrix game demo on YouTube, which looked good, though I became very aware of the age-old problem with proper AR headsets: you can’t render black. Anything that should be black becomes alpha = 0, which means invisible, for the non-developers out there. I don’t believe I’ve seen this solved outside of passthrough cameras, which give more control over how an environment is displayed and overlaid.
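To spell out why (illustrative only; real waveguide optics are more involved than this): a see-through additive display can only add light on top of the world in front of you, so a black pixel contributes nothing.

```csharp
using UnityEngine;

// Illustrative sketch, not Orion's optics: an additive see-through display
// can add light to the real world but never remove it.
public static class AdditiveDisplay
{
    public static Color Perceived(Color worldLight, Color renderedPixel)
    {
        // Pure black is (0, 0, 0): it adds no light at all, so "black"
        // content simply vanishes and you see the real world through it.
        return worldLight + renderedPixel;
    }
}
```

This is also why passthrough headsets can fake black: they control every pixel of the camera feed, so they can darken the world instead of only brightening it.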

I was asked to use a double thumb tap (you do have to do it quickly for it to register) to generate a Meta AI image of a group of people in a theater wearing AR glasses, an image I’ve actually conjured in many formats before, though not always using AI. As someone a little fatigued by Vision Pro’s reliance on the pointer-finger-to-thumb pinch, I appreciated the greater variety of gestures in use here: double thumb tap for Meta AI, middle finger to palm for bringing up the home menu, and the coin flip gesture for scrolling.

Meta’s representatives then directed me to use Meta AI by standing up and walking over to an example kitchen scenario, and then commanding Orion to “Give me a smoothie recipe”. I first made the mistake of saying “Can you give me a smoothie recipe?” which produced more of a traditional search. It then looked at my ingredients, labeled them with stably anchored text, and produced a recipe with pages I could follow. I did the recipe twice because, the first time, a pineapple on the table wasn’t identified. Still, I was given a very tasty looking recipe that included the other ingredients on the table that were correctly recognized. We changed it up with new ingredients the second time and all were correctly identified, labeled, and spatially anchored. I wanted to try moving the ingredients around but that would require object tracking, and that wasn’t part of the demo here.


I’ve cooked while wearing Meta Quest 3 exactly once. I’ve cooked while wearing Vision Pro exactly three times. Those devices are problematic around the kitchen for a litany of reasons, and Orion is the first device of its kind I can actually imagine using comfortably throughout a cooking process.

When I moved my head around, the image didn’t seem to blur or ghost much. While scrolling Instagram and liking photos, I seamlessly read comments the entire time. I was also able to use either the coin flip gesture to flick through images or a pinch and swipe to scroll in a more hand-tracking-focused manner.

I next went through another Messenger demo with a flat Codec Avatar I recognized as Jason Saragih, but who identified themselves as Josh. Meta later clarified this was Josh controlling Jason’s avatar, and that this kind of switching is standard for Meta’s demonstrations. I did get to experience talking to a Codec Avatar in VR at SIGGRAPH in 2022 and loved it, particularly how spatial it felt. I enjoy having “spacetime” calls with the Vision Pro and Personas and am using them more and more. I imagine the Orion Codec Avatar demo wasn’t spatial because it’s just too heavy to render, but it did look very good in a flat format and passed the uncanny valley test. I asked him to puff his cheeks (barely anything) and to stick his tongue out (that doesn’t work yet), so Personas are still ahead on that front, for whatever that might be worth to you. As someone trying to push the limits of human performance fidelity for live virtual theater, it’s worth a lot to me. I hope to see this space race (arms race? face race?) continue at full throttle.

The entire time I was using the wristband I kept thinking ‘this feels like what I always wanted the 2015 Myo armband to be.’ That feeling peaked when I played a space game appropriately dubbed “Stargazer” that felt just like one I had made with the Myo. I basically played the same experience on the glasses, minus the hammer.

Many Apple Vision Pro apps have depth but sit in a frame, and on Orion this game was ‘windowed’ in a similar way. Unlike Apple Vision Pro, I could ‘step into the game’ and feel quite immersed by it. I didn’t even encounter one of those pesky clipping planes that drove me crazy when developing for Magic Leap.

It was after this game, while trying to launch the next app from the home menu, that the experience froze. Because my crashed content was locked in space and didn’t follow my gaze, there was zero discomfort and I never felt a need to take the glasses off. I was then handed a second pair of glasses. We didn’t set up eye tracking again, so I controlled my cursor entirely via hand tracking from then on. It was fine, and I appreciated the opportunity to explore this other input contingency.

The next game was multiplayer and was called PONG. I love how much of this demo had a throwback quality to it, and it was both fun and intuitive to use my hand to guide my paddle. I appreciated that it took place in a big cube, and that I was seeing the ball not from an overhead view, as in the original Pong, but from the paddle’s angle. As such, this was a game that could only be played properly in a fully spatialized context. Like ping pong or tennis, though, I wanted more topspin!

And that was it. Besides topspin, did I have any complaints? 

AR Glasses In The Future

Sure, the resolution could be higher. Maybe it would be nice to remove all remnants of chromatic aberration, though this was really only visible at the edges. Maybe there could be more active object tracking and a litany of other features seen in other XR devices. I did no typing or touching of virtual objects, for example.

But none of that bothered me. Especially for a dev kit, I’ll gladly take, for example, a lower resolution that holds more than 72 frames per second with low latency over a much less stable, higher-resolution variant. As a theatremaker with lots of practice crafting XR demos, I appreciated how well these 30 minutes walked through the vision of what Orion is now and where it’s headed. I know all of this will improve in time, and I’ll be patient.

Well, let me be more specific. I’ll be patient for the consumer release. I’m champing at the bit to have one of these as a dev kit, or as a device we can have dev days with. Or even a simulator, which is exactly how I was able to get so excited about, and hit the ground running with, both Magic Leap and Apple Vision Pro. Grant programs wouldn’t hurt either!

Please, Meta: if you truly want to make the lives of developers easier, give us developer kits and simulators early and often. In return, we offer to produce stellar content for your ecosystem, much more of which will be ready for your consumer launch dates, so no one feels an hour after unboxing that they’ve seen everything.

Oh, and speaking of Alien vs. Predator: you might have noticed I’m making a lot of comparisons to the Apple Vision Pro, a device that will almost certainly always cost more, across all its generations, than whatever Meta ends up charging for Orion. And yet these two companies are clearly already engaged in a glorious battle, aping each other’s best features, then throwing down a gauntlet with a new feature that challenges the other to either ape them or surpass them. This is healthy. This is good. It is likely stressful for employees of both Meta and Apple, but they are well compensated, and the results are a win for XR devs and consumers alike.

Meta has been in sore need of this level of competition for a long time. I fondly remember receiving a dev kit from Google in 2018 that essentially turned the Lenovo Mirage Solo into an Oculus Quest 1 over six months before the actual release of the Oculus Quest 1. It was wonderful to play with, not only as a dev, but as someone looking forward to a war of one-upmanship between Google and Facebook. Unfortunately, Google killed Daydream and never released a consumer version of that dev kit, just as Facebook doubled down on standalone and began readying for the transition to Meta. So Meta entered the consumer standalone 6DoF headset market largely uncontested. The lack of real competition made them complacent and slower to innovate than those of us trying to survive the VR winter would have liked.

Now Apple, along with a flurry of XR activity from other companies (including Google, rising from the dead), is forcing a reckoning for Meta. Will another company claim Meta’s crown as the go-to for consumer XR / spatial computing? It’s a much-needed, and hopefully never-ending, race.

I can’t wait to watch that race. 

The cake wasn’t a lie, then, and in a few years we’re going to get to eat it too.


Special thanks to Andrew Bosworth, Stephanie Young, Joshua To, Colman Bryant, and Thomas Van Bouwel. Coulombe wants you to go see A Christmas Carol VR, coming to Meta Quest (and maybe Apple Vision Pro) this December.
