Category: Technology Review
Tags: AR Glasses, Augmented Reality, Meta, Orion, Wearable Technology
Entities: Instagram, Mark Zuckerberg, Meta, Meta AI, Meta Reality Labs, Orion, Quest
00:00
Wow. So, I just got back from trying Orion, which are Meta's first true AR glasses, and let me tell you, they're good.
I mean, they are really, really good. That's an understatement.
So, I
00:15
hope you are ready to dive deeper down this rabbit hole with me and explore these next level wearables together. It's going to be amazing.
Trust me. Ever since I saw Meta teasing they were building AR glasses in 2019, all the way up to Mark Zuckerberg recently revealing
00:32
an actual pair on stage, I've been closely following the company's progress in making it a reality, or should I say augmented reality. Jokes aside, this week I finally had the opportunity to see it for myself at Meta Reality Labs,
00:48
and man, they did not disappoint. Enough hype, let's get into it, shall we?
For context, Meta's Orion glasses are prototypes which are not for sale. Even if you could purchase a pair, they would cost you a small fortune and be
01:04
extremely difficult to produce at scale. Right now, the goal is to simplify the technology and eventually bring the price down to that of a laptop or high-end smartphone.
Meta is using Orion as a benchmark, a proof of concept for what's possible, and they'll keep
01:20
refining it until it's ready for early adopters. It's still very much a dev kit, but it gives us a clear look at where Meta is headed.
Honestly, other tech giants should be paying close attention. The company is on a roll, and I don't think it will be long before we
01:35
see some of Orion's components appear in other Meta products available for consumers. Anyways, these AR glasses can do loads of fun stuff such as watching videos on a giant virtual screen, chatting with friends who appear as
01:50
avatars, playing holographic games, or seeing helpful info pop up over objects you're looking at with the help of AI. It has features that you and I would want for everyday wear.
And I was lucky enough to try some of those features for real. So,
02:06
while I'm making my way to Meta Reality Labs, let me give you a short breakdown of the full Orion set. First off, a few cool things to point out about the glasses are that they have approximately a 70° field of view, enabled through the use of microLEDs and swapping glass
02:22
optics for silicon carbide. It has a custom eye-tracking system, built-in speakers, and batteries.
The frame is made of a magnesium alloy known to be durable, heat-resistant, and lightweight. Plus, on the outside, it has seven cameras to keep everything tracked in your space.
What stands out
02:39
the most is that Orion uses tiny projectors hidden in the arms that beam light into special lens layers called waveguides. These lenses have super small 3D structures etched into them that bend light to create holograms you can see in the real world.
All of this
02:56
is controlled by custom-made chips and sensors designed by Meta. Next to the glasses, there's Orion's neural wristband, codenamed Ceres, a smartwatch-style device that reads tiny electrical signals from your wrist muscles using EMG, which stands for
03:12
electromyography. This lets you control the glasses with subtle gestures like pinching your fingers.
It features a processor that interprets signals, translating them into controls, plus a haptic engine that provides tactile feedback. It has EMG
03:28
sensors spread throughout the band that capture each signal, and a soft elastic material with a magnet that automatically snaps into place. And last, but definitely not least, there's the wireless compute puck, the heart of the Orion bundle.
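Before getting to the puck, here's a rough illustration of the EMG idea just described. All names, channel mappings, and thresholds below are my own invention, not Meta's actual pipeline: a toy classifier windows each channel, measures its signal energy, and maps active channels to gestures.

```python
import math

# Toy sketch: classify a pinch gesture from multi-channel EMG samples by
# comparing per-channel RMS energy against a threshold. Real pipelines
# use trained models over many sensors, not fixed thresholds.

def rms(samples):
    """Root-mean-square amplitude of one channel's sample window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def classify_gesture(window, threshold=0.5):
    """window: list of per-channel sample lists from the band's sensors.
    Returns a gesture label based on which channels fire."""
    active = [i for i, ch in enumerate(window) if rms(ch) > threshold]
    if not active:
        return "rest"
    # Assumed mapping: channel 0 ~ thumb-index pinch, channel 1 ~ thumb-middle
    if 0 in active:
        return "pinch_index"
    if 1 in active:
        return "pinch_middle"
    return "unknown"

# Quiet signal on all channels -> no gesture detected
print(classify_gesture([[0.01, -0.02, 0.01], [0.0, 0.01, -0.01]]))  # rest
# Strong burst on channel 0 -> index-finger pinch
print(classify_gesture([[0.9, -0.8, 0.85], [0.02, 0.01, -0.02]]))   # pinch_index
```

A real system would learn per-user signal patterns rather than threshold them, but the flow is the same: sample, featurize, classify, dispatch.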
It's roughly the size of a
03:45
modest battery pack and compact enough to clip on or slip into your pockets. The Orion Puck handles all the heavy lifting.
We're talking rendering visuals, processing AI tasks, spatial tracking, and much more. It communicates with the glasses over a local high-speed
04:01
wireless connection and has to stay within about 12 feet of the glasses for optimal performance. What's even more interesting is that it seems like it can be utilized as a controller as well, but I'll get to that soon.
The Compute Puck appears to be capable of running all
04:18
day, but the glasses themselves only last around two to three hours on a charge, and that's one of the biggest challenges, balancing what today's battery tech can actually support while at the same time offering the kinds of features users expect. So, I got to play
04:33
around with Orion for a solid 30 minutes. And having pretty much tried everything on the market announced and well unannounced, I honestly didn't expect to be as impressed as I was.
Actually, wearing them was quite comfy. They weigh about 100 g, which is
04:50
slightly heavier than a typical pair of glasses, but not noticeably so once on your head. It's pretty crazy how we're getting closer to that sweet spot Michael Abrash talked about many years ago.
The reason is that in my opinion, in order to be successful, AR glasses
05:05
have to be socially acceptable, weigh no more than about 70 g, and dissipate no more than roughly 500 milliwatts of heat on your head. I don't remember them getting hot either.
Instead of fans, they passively dissipate heat through their magnesium frame and efficient internal components.
05:21
Of course, when you look at the pictures, you can clearly see the frames are quite thick, but for what these prototypes are capable of, I think they're already impressively small. And as the tech matures, they'll only get slimmer.
Now, visually, Orion blew me
05:36
away. The 70° field of view didn't feel limiting at all.
If anything, it was wide enough to make content feel present and immersive. As someone who started out with devices like the HoloLens and Magic Leap, I really appreciate how each jump in field of view makes a
05:52
meaningful difference. This is the best field of view I've experienced in AR glasses so far.
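To put that 70° number in perspective, here's my own back-of-the-envelope calculation (simple trigonometry, not official specs) of how large a virtual screen can fully fit in view at a given distance:

```python
import math

# A horizontal FOV of 2*a lets a flat screen of width 2*d*tan(a) fit
# fully in view at distance d. Illustrative math, not Meta's numbers.

def max_screen_width(fov_degrees, distance_m):
    """Widest flat screen (meters) that fits in the FOV at a distance."""
    half_angle = math.radians(fov_degrees / 2)
    return 2 * distance_m * math.tan(half_angle)

# At 2 m, a 70-degree FOV fits a screen about 2.8 m wide --
# roughly a 126-inch diagonal at 16:9.
print(round(max_screen_width(70, 2.0), 2))  # 2.8
# Compare a ~30-degree FOV typical of early AR headsets: only ~1.07 m.
print(round(max_screen_width(30, 2.0), 2))  # 1.07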
It felt fresh and it was a reminder that we're almost at an acceptable level for consumers. Don't get me wrong, you're still looking through a window inside the glasses, but
06:09
unlike other wearables, you don't have to lean back extremely far to fit things in frame. The image fidelity, on the other hand, did the job, but it could definitely be better.
The holographic windows and objects had a slight translucent ghostlike quality which gave
06:25
the visuals a somewhat soft washed out appearance. The image was technically clear enough to read and glance through content, but I still had to work a little to focus on it.
I think it's a good starting point and I'm looking forward to seeing how this will be refined in the future. Now, the first
06:42
thing I got to do with these glasses was open Instagram and scroll through its feed. You open it with a thumb-to-middle-finger pinch.
And by simply looking at a video and doing a quick thumb flick, I could swipe to the next piece of content. To like a video, I just looked
06:58
at it and hearted it with a pinch gesture. Your eyes act like a cursor, and your hands become the click and scroll.
Once you're used to it, it feels really natural. The subtle haptic feedback of the neural wristband is the final puzzle piece here.
Having it instantly confirm each
07:15
gesture is so satisfying. Overall, it seems like a great input device.
Unlike traditional hand tracking, you don't have to worry about keeping your hands in view of the cameras when you confirm certain things. It simply reads your muscle signals and translates them into
07:31
controls. That alone feels like a next level leap.
I'd say it was about 95% accurate. Not flawless, but despite that, it was still one of the most intuitive input devices I've used in a while.
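The gaze-plus-gesture model can be sketched in a few lines. Everything here is illustrative, not Meta's API: the gaze point hit-tests the UI targets, and the wristband's gesture event decides what happens to whatever is under your eyes.

```python
from dataclasses import dataclass

# Toy sketch of "eyes as cursor, pinch as click": a hit test maps the
# gaze point to a UI target, and an EMG gesture event triggers that
# target's action. All names and gestures are assumptions.

@dataclass
class Target:
    name: str
    x: float
    y: float
    radius: float

def gaze_hit_test(gaze, targets):
    """Return the first target whose circle contains the gaze point."""
    gx, gy = gaze
    for t in targets:
        if (gx - t.x) ** 2 + (gy - t.y) ** 2 <= t.radius ** 2:
            return t
    return None

def dispatch(gaze, gesture, targets):
    """Combine gaze target + wristband gesture into a UI action."""
    target = gaze_hit_test(gaze, targets)
    if target is None or gesture == "rest":
        return None
    if gesture == "pinch":          # thumb-index pinch = select/like
        return f"select:{target.name}"
    if gesture == "thumb_flick":    # quick flick = scroll to next item
        return "scroll_next"
    return None

ui = [Target("like_button", 0.8, 0.9, 0.05), Target("video", 0.5, 0.5, 0.4)]
print(dispatch((0.8, 0.9), "pinch", ui))        # select:like_button
print(dispatch((0.5, 0.5), "thumb_flick", ui))  # scroll_next
```

The key design win this models is that the hands never need to be in camera view: the gesture channel comes from the wristband, so only the eyes do the pointing.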
In general, I think this combined with hand and eye tracking has loads of
07:48
potential, and I can't wait to see where it will take us. Maybe turning the neural wristband into a full-blown smartwatch could be the next step.
That said, it was possible for me to grab the entire Instagram app and move it to my left or right. In this case, it did use
08:05
traditional hand tracking, and I found it responsive enough for casual use. I put the spatial tracking to the test, and the window stayed smoothly anchored as I moved around it, something that shouldn't be taken for granted, especially with a form factor like this.
08:21
They then proceeded to demonstrate a more complex multi-window setup. On my left, I had a navigator.
In the middle, a video call. And on my right, I now had Meta's Messenger.
Performance-wise, it did all of this without dropping frames
08:37
or suffering from serious latency. So, that alone was impressive to see.
The quality of the call was surprisingly good, and I could turn up the audio to a decent level. And the Meta employee on the other end seemed to have no problem hearing me.
I could see them, but I
08:52
don't think they could see me in some kind of avatar form. It does make you wonder how they will have people create one in the future.
Could it be that you just hold up the
09:08
glasses right in front of you and then do a quick face scan by moving them around? We don't know, but I guess time will tell.
Those Codec Avatars we've seen Meta work on could be a great addition to the glasses, allowing you to use them not only for video calls, but
09:25
for social apps, gaming, collaboration, and productivity across augmented environments. After all that, they wanted to show me Meta AI.
I got to stand in front of a table full of smoothie ingredients, things like oats, bananas, and seeds. And by simply gazing
09:42
at the table and using a quick voice command, Meta AI recognized the items and overlaid floating labels above each one. Almost instantly, it generated a step-by-step smoothie recipe which appeared right in front of me.
Boom.
09:57
Besides that, I wanted to challenge it by doing a bunch of unexpected things because I had already seen them demo the recipe table dozens of times. Anyways, there was a kitchen in the room and as I walked up to the sink, I noticed it had a logo and I got curious about the
10:14
sink's brand. Uh, super random.
Yeah. At first, it wasn't able to give me an answer.
It took a few attempts to recognize what I was looking at, but once I moved closer to the sink, it seemed to work better. Besides that, I let Meta AI describe the room I was in,
10:30
which it handled just fine. And I looked down at my shoes, having it rate them.
In classic AI fashion, it gave me a nice little boost in self-confidence. It is incredible how naturally Meta's AI fits into the experience.
No menus, no setup,
10:48
just look and ask. AI and small form-factor glasses are a match made in heaven.
And what you just saw here is only the beginning. One of the last things I got to try was a multiplayer demo, an interactive game of Pong that showcased
11:03
Meta's co-location capabilities. There was a QR code placed on the table, and by scanning it, a holographic version of the game appeared right in front of me.
The other player did the same, and because the QR code served as a shared anchor point, the game space was
11:20
perfectly synced for both of us. I couldn't help but think about the holochess scene in Star Wars.
But at least I wasn't playing against a Wookiee. As I briefly teased at the start of this video, the puck also seems capable of acting as a six-DoF motion
11:36
controller. While still experimental, Meta seems to have run multiple tests, and I saw they even imagined the puck as a floating holographic display during a video call.
How much more Star Wars does it get? So, beyond the game, the co-location tech stood out as one of the
11:53
most promising elements. Meta's been exploring this with Quest for years, and what I experienced felt like a real step forward, closer to something that could become easy to use and also socially acceptable.
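The shared-anchor trick behind that Pong demo is simple to illustrate. In this sketch (my own simplified 2D version; real systems solve full 6-DoF poses), each device tracks the QR code's pose in its own local frame and expresses game objects relative to the code, so both players resolve them to the same physical spot.

```python
import math

# Toy sketch of QR-anchor co-location: game objects live in the QR
# code's coordinate frame; each headset converts them into its own
# local frame using the anchor pose it tracked. 2D (x, y, heading)
# for simplicity.

def anchor_to_local(anchor_pose, point):
    """anchor_pose: (x, y, theta) of the QR code in this device's frame.
    point: (x, y) in anchor coordinates. Returns local coordinates."""
    ax, ay, th = anchor_pose
    px, py = point
    return (ax + px * math.cos(th) - py * math.sin(th),
            ay + px * math.sin(th) + py * math.cos(th))

# The ball sits 0.2 m in front of the QR code in the shared game space.
ball = (0.0, 0.2)

# Player A tracked the code at (1, 2) with no rotation;
# Player B tracked the same code at (-3, 0), rotated 90 degrees.
a = anchor_to_local((1.0, 2.0, 0.0), ball)
b = anchor_to_local((-3.0, 0.0, math.pi / 2), ball)
print(a)  # (1.0, 2.2): the ball in A's frame
print(b)  # approx (-3.2, 0.0): the same physical spot in B's frame
```

Because only the anchor pose differs between devices, neither player needs to know the other's position: agreeing on the QR code is enough to keep the whole game space in sync.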
My 30 minutes with Orion
12:08
made me realize how natural eye contact feels when interacting with someone who isn't wearing the glasses. You can just have a normal conversation without the tech being a barrier.
So, to wrap things up, what I got to experience here is nothing short of a breakthrough. The
12:26
people behind this have some real magic on their hands, and to date, it's easily the most impressive thing I've experienced in the tech space. Seeing Orion tie into Meta's broader ecosystem, we're talking WhatsApp, Instagram, and Meta AI, could be a real game-changer for the
12:43
company. For me personally, Orion is another reminder that AR glasses aren't a far-off dream anymore.
They're right around the corner. Any tech giant working on their own wearables should be paying close attention to what Meta Reality Labs is working on here because
13:00
this might just be the blueprint for how the public wants to wear and use AR. If you want to learn more about Orion, I'll make sure to leave some links in the description that tell you more about the research Reality Labs is doing.
And yes, I want to thank Meta so much for the
13:16
opportunity to try their first ever AR glasses. Very, very cool.
Roads? Where we're going, we don't need roads. Until next time and bye-bye for now.
See you in the future. Oh yeah.