Hands-on with the Meta Ray-Ban Display glasses

Mark Zuckerberg, CEO of Meta Platforms Inc., wears a pair of Meta Ray-Ban Display AI glasses during the Meta Connect event in Menlo Park, California, on Wednesday, September 17, 2025.
David Paul Morris | Bloomberg | Getty Images
When it comes to the new $799 Meta Ray-Ban Display glasses, it's the device's drab, gray wristband that really dazzles.
I was able to try the new generation of smart glasses that social media company Meta announced at its annual Connect event on Wednesday. These are the first glasses Meta sells to consumers with a built-in display, and they mark an important step for the company as it works toward CEO Mark Zuckerberg's vision of smart glasses one day surpassing smartphones as the primary way people access and process information.
Still, the display in the new glasses is fairly simple. Last year, Meta showed off Orion, a prototype pair of glasses that can overlay 3D visuals onto the physical world. Those glasses were thick, required a separate computing puck and were built only for demo purposes.
The Meta Ray-Ban Display, by contrast, will be sold to the public in the U.S. starting September 30.
Although the new glasses contain only a small digital display in the right lens, that screen enables distinctive visual features, such as reading messages, viewing photo previews and following live captions while chatting with someone.
Controlling the device requires wearing a wristband with EMG sensors that detect the electrical signals produced by a person's body, letting you operate the glasses through hand gestures. Putting it on felt like strapping on a watch, except for the small electric tingle I felt when it activated. It wasn't as jolting as the static shock you get pulling clothes from the dryer, but it was noticeable.
Wearing the glasses themselves was less startling, at least until I put them on and saw the small screen hovering just below and to the right of my line of sight. The display is like a miniature smartphone screen, but it's translucent so it doesn't block out real-world objects.
Although it's a high-resolution display, the icons weren't always easy to make out against my real-world view, and text sometimes appeared slightly blurry. These visuals aren't meant to envelop your head in crystal-clear imagery; they exist so you can activate the glasses' camera or take simple actions like browsing songs in Spotify. More utility than spectacle.
Meta Ray-Ban Display AI glasses on exhibit at Meta headquarters in Menlo Park, California, on Tuesday, September 16, 2025.
David Paul Morris | Bloomberg | Getty Images
I had the most fun navigating the screen with hand gestures to browse apps. By clenching my fist and sliding my thumb along the surface of my pointer finger, I was able to scroll through apps as if using a touchpad.
At first, I struggled to open the camera app by pinching my index finger and thumb together; when the app failed to activate, I would pinch twice, mimicking the double-click of a computer mouse. But for all my years of using a mouse, I learned that I have subpar pinching skills, lacking the right cadence and timing to open the app consistently.
It must have looked a bit strange, and funny, to the people in front of me as I repeatedly pinched my fingers in the air to interact with the screen. I felt as if I were reenacting the infamous comedy sketch: "I'm crushing your head, crushing your head!"
When the camera app finally turned on, the screen showed what I was looking at, giving me a preview of how my photos and videos would appear. It was like having a picture-in-picture feature, as on a TV, but for my own personal view.
From time to time I experienced a kind of cognitive mismatch, as my eyes couldn't always decide what to focus on because the screen sits just outside the center of my field of view. If you've ever taken a vision test that involves spotting squiggly lines appearing in your periphery, you have a sense of what I felt.
In addition to pinch gestures, the Meta Ray-Ban Display glasses can be controlled with the Meta AI voice assistant, just like the company's earlier smart glasses.
When I glanced at some of the pictures decorating the halls of the demo room, I asked Meta AI to explain what I was looking at. Presumably, Meta AI would have told me that I was looking at various paintings from the Bauhaus art movement, but it never activated correctly before I was ushered along to another part of the digital assistant demo.
I could see the Meta Ray-Ban Display's live captions being useful in noisy situations: with dance music from the Connect event thumping in the background, the glasses successfully picked up the voice of the demo tour guide. When he said, "Let's go to the next room," I watched his words appear on the screen like closed captions on a TV show.
But in the end, I kept coming back to the wristband, especially when I listened to music through the glasses via Spotify. By rotating my thumb and index finger as if turning an invisible stereo knob, I was able to adjust the volume, an unexpectedly delightful experience.
What really seared itself into my brain about the new Meta Ray-Ban Display glasses was how much cutting-edge technology is packed into that neural wristband. And while the device's high price may put off consumers, the glasses are novel enough to attract developers looking for a new computing platform on which to build apps.
WATCH: Meta's chief product officer on why glasses will be the next important wearable technology.