At a hotel in New York, I stood on a terrace on a warm September day, leaning over to pet an adorable creature that wasn’t there. The virtual Peridot buddy jumped around and followed me, much like other virtual pets I’ve tried on phones and even on headsets such as the Meta Quest 3. This time, I saw it projected through see-through lenses that auto-dimmed like sunglasses so the virtual experience stayed clearly visible in bright daylight. All I wore was a pair of chunky but fully standalone glasses: Snap’s new AR Spectacles. Are these the future of something I might wear someday? If so, how long until that happens?
I saw plenty of viral videos earlier this year of people wearing Apple’s Vision Pro headset outside: skiing, skating, in the park, wherever. The Vision Pro and Quest 3 headsets aren’t made for everyday outdoor use, but Snap’s AR Spectacles are. Announced at the company’s developer conference in Los Angeles, they’re the latest iteration of the Spectacles line Snap has made for years. Snap first created standalone AR glasses back in 2021; I tried those out in my backyard during the pandemic. The new Spectacles are larger, but they’re also more powerful: they have hand gestures like the Vision Pro and Quest 3, plus a whole Snap OS that can run a browser, launch apps and connect with nearby phones.
These Snap glasses aren’t even made for everyday people. They’re developer hardware offered on subscription, at $99 a month as part of Snap’s developer kit. That’s a sign Snap knows neither the world nor consumer technology is ready for AR glasses yet; the company is getting a foot in the door a little ahead of its competitors.
Companies like Meta, and likely Google and Apple, are trying to get to smaller AR glasses that overcome the bulk of mixed-reality VR goggles. I’ve seen attempts at glasses like these before, but they usually have to tether to phones, computers or some sort of external processor to work while keeping the glasses small.
Snap’s AR Spectacles put all the processing and battery right on the glasses; nothing else is needed. That means they can be worn without any extra cables, but the Spectacles in their current form are chunky and far odder than any glasses I’d normally put on my face. The frames are thick, and the lenses still have rainbow-like patches in the middle where waveguides reflect virtual images beamed in from side-mounted LCOS miniprojectors.
The battery life is also extremely limited: at around 45 minutes, these are far from all-day devices. Then again, like the 2021 version of Snap’s AR glasses, these are designed for developers. According to Snap CTO and co-founder Bobby Murphy, they’re exploratory devices meant to see whether Snap’s existing AR lenses on the Snapchat phone app can make a useful leap into glasses form.
This is the same approach Snap took back in 2021, but with a boost in processing power. Onboard are twin Qualcomm processors (I wasn’t told which ones) that delivered some pretty crisp, if sometimes stuttery, graphics. The glasses have only a 46-degree field of view, which is OK for AR glasses but much narrower than VR headsets. It felt like I was watching mixed reality through a tall, narrow window about the size of a large phone screen.
One big difference between my 2021 demo and now: prescription lens inserts. I had to wear contacts last time, but snap-on lenses similar to what Meta and Apple already offer will work with these Spectacles. Unfortunately, Snap didn’t have my prescription ready for the demo, so my AR morning was a bit fuzzy, though still clear enough to see the experiences.
Hand tracking, but no eye tracking
Snap’s Spectacles use external cameras to track the world and add hand tracking like the Apple Vision Pro and Meta Quest headsets. I was able to tap virtual buttons, pet virtual creatures, paint in the air or pinch things at a distance. Snap OS has a dashboard of apps that floats in the air, plus virtual buttons that appear over my hand when I flip it over.
The Spectacles don’t have eye tracking, so selecting objects takes a little more effort than on Apple’s glance-to-select Vision Pro. Phones can connect with the glasses, too, offering another way to interact.
An extension for phones
One thing that’s different about Snap’s approach compared with Meta’s or even Apple’s right now is how interconnected it is with phones. I tried several demos where I used a nearby phone to control AR experiences on the glasses. I used a phone like a remote to fly an AR helicopter around the room, with on-screen buttons for controls. I also held a phone like a golf club and swung to tee off on an AR golf course that looked half-teleported into the hotel room I was in.
Snap’s Snapchat phone app will manage the glasses and also let anyone else connect and view the AR experiences I’m seeing. According to Snap, the longer-term goal is to have phones running Snapchat interact with these glasses.
“We definitely think there’s a lot of open space to continue to explore that connection between phones and Spectacles,” Murphy tells me.
In the short term, the glasses are made to work only with other AR Spectacles wearers, and probably with good reason: phone operating systems don’t play well with AR glasses yet. Neither Apple nor Google has made moves to make headsets and glasses feel truly integrated, and until they do, other companies face a compatibility bottleneck unless they use dedicated phone apps or, like Xreal, build their own custom phone-like hardware. Even then, the connectivity isn’t ideal.
Snap already has a deep range of AR tools on its Snapchat phone app, from world-scanning collaborative experiences to location-specific AR. Snap’s AR chops have improved since 2021, and these new glasses could take advantage of that.
Working with groups
One thing I tried was collaborative painting. I used my fingers to draw in the air while one of Snap’s team, wearing another pair of Spectacles, drew alongside me. The glasses can recognize another nearby wearer and share an experience, or even collaborate, using the four onboard cameras, to scan a room into a mesh for mixed reality.
Connecting in the hotel room wasn’t always instant, but that glasses-to-glasses collaboration is a big part of the pitch here. Snap aims to bring multiplayer group experiences to outdoor areas, maybe even museums or art shows, to test how well the glasses work for live immersive activations. A Lego brick-building experience was simple, but it showed the possibilities if a bunch of people could make a site-specific sculpture together.
These glasses, made to be worn outdoors, have auto-dimming lenses similar to the tech on the Magic Leap 2, although that headset requires a large external clip-on processor.
Snap’s early partners include Niantic, the company best known for Pokémon Go, which has also been exploring a future of outdoor AR glasses for years. Another partner, ILM Immersive, has already made Star Wars and Marvel experiences for VR and mixed reality headsets.
Are these a doorway for more AI wearables, and AI AR?
Snap is also partnering with OpenAI, a move that hardly seems surprising for 2024. This spring, a wave of camera-equipped AI wearables, including Meta’s Ray-Bans and the Humane AI Pin, tried to show the promise of “multimodal” generative AI that can use camera and audio inputs together for assistance or for generative creative apps. Mixed reality headsets like the Meta Quest 3 and Apple Vision Pro haven’t tapped into camera-enabled AI yet because camera permissions have been more locked down for developers.
Snap is opening up its camera access more: OpenAI hook-ins can use the glasses’ cameras and microphones for generative AI lens apps similar to what’s available now in Snapchat’s phone app. (Snap is also enabling camera access for regular AR apps, but those apps are blocked from going online; camera access combined with online access is reserved for OpenAI connections.)
“Where people may want to use external services, we’ll exercise more caution and control and work potentially with developers or third-party service providers to build those kinds of safeguards,” Murphy says. The Snapchat phone app already has generative AI tools that use the camera, and we may finally see some of these make the move to glasses, too.
I tried generating some 3D emoji using Snap’s generative AI, as well as navigating an educational solar system lens app that took spoken requests. Responses during my demos were laggy, and the glasses didn’t always understand what I was saying (though that could have been the hotel room’s connection, or something else). Snap’s opening of camera access to AI-infused AR feels a step ahead of where Meta and Apple are currently. That could change soon, though: Meta is expected to show off new AI and AR updates imminently, while Apple could bring Apple Intelligence features to the Vision Pro next year.