Preview: Meta’s Orion AR Glasses Could Be the Future of Computing
This is really cool tech, and seemingly far ahead of anything else in development.
Meta’s recent Connect event was packed with announcements — a new VR headset, new AI tools, and updates to their smart glasses — but it led up to a product they won’t even be selling: Orion. That, to my understanding, was not the plan. Zuckerberg had hoped the first generation of these glasses would launch at this event, after a decade of development, but production difficulties meant they couldn’t be manufactured at scale, and certainly not at an acceptable price.
But they’re just too cool not to show off, and I would expect a consumer version to be announced at Connect 2026.
So, what are they? They are a pair of smart glasses, but unlike Meta’s Ray-Ban collaboration, the Ray-Ban Stories, these have displays and a far more complex computing system. Orion is the utopian halfway point between the screenless Ray-Ban Stories and the Quest VR headsets. They are the size of ordinary glasses and weigh only 98 grams, but digital information is projected onto their lenses, letting you see video content, messages, and other information floating in the air around you.
To be clear, you’re not looking at a screen. “Mixed reality” modes have been on VR headsets for a while, where the headset records the world outside and plays a live feed to internal screens with added digital elements. Orion does the opposite. You are looking through glass lenses, as with traditional glasses, but tiny projectors around the rim of the lenses project digital elements into your field of vision.
It is hard to know for sure how far ahead this technology is — after all, Meta is showcasing a prototype they can’t sell, and most tech companies don’t do that — but the consensus is that Orion is ahead of anything else being worked on.
Then, there’s how you interact with them. On the Ray-Ban Stories, you can press a button or use voice commands, but for Orion, that’s not enough. You need to move objects, click on windows, zoom in, and perform all the usual functions you’d want from a desktop display, but without a keyboard or mouse. While Orion does have voice and hand-gesture control, similar to a VR headset, it also comes with a neural interface in the form of a strap worn around your wrist.
To clarify, this doesn’t mean it can read your mind. Instead, it picks up the nerve signals that command muscle movements, such as swiping or clicking motions, allowing it to recognize even very small gestures. This intuitive solution could easily become the standard if the sensors are eventually miniaturized to fit into something like an Oura Ring. Another sensible decision is offloading most of the computational work to a puck that fits in your pocket. I’m somewhat surprised they haven’t offloaded it to your phone instead, but there is likely a technical reason.
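To make the idea concrete: a wrist interface like this ultimately boils down to turning short windows of muscle-activity readings into discrete gestures. Here is a deliberately toy sketch of that classification step — every threshold, channel name, and gesture label below is invented for illustration; Meta has not published Orion’s actual signal pipeline.

```python
# Toy sketch of EMG-style gesture classification (all values hypothetical).
# A "window" is a short burst of samples, one (thumb, index) amplitude pair each.

def classify_gesture(window):
    """Map a window of simulated wrist-sensor readings to a gesture label."""
    thumb = sum(abs(t) for t, _ in window) / len(window)   # mean thumb activity
    index = sum(abs(i) for _, i in window) / len(window)   # mean index activity
    if thumb < 0.1 and index < 0.1:
        return "rest"    # no meaningful muscle activity detected
    if index > thumb:
        return "click"   # index-finger motion dominates
    return "swipe"       # thumb motion dominates

print(classify_gesture([(0.02, 0.03)] * 10))  # rest
print(classify_gesture([(0.05, 0.80)] * 10))  # click
print(classify_gesture([(0.90, 0.20)] * 10))  # swipe
```

A real system would use trained models over many electrode channels rather than hand-set thresholds, but the shape of the problem — continuous nerve-signal streams in, a small vocabulary of gestures out — is the same.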
Once again, though, these aren’t consumer products. Meta has built about 1,000 development units. Before going to market, Meta will need to make them lighter and cheaper, and increase battery life, which currently sits at around two hours. That’s impressive for something this complex, but if it’s to be a smartphone replacement, it would need all-day usability. Even in its current form, however, Orion is the closest anyone has come to developing the next generation of computing, and I’m excited to try one of their demo units soon.