The new Meta Ray-Ban Display glasses first impressed me when, shortly after learning the hand gesture for adjusting the audio volume, I caught myself turning it down without a second thought.
The selling point of the glasses is that they’re the first in the partnership between Meta and Ray-Ban to include an internal display digitally overlaid on the wearer’s field of view, bringing the companies a step closer to their dream of full augmented-reality glasses. But how the wearer interacts with that display feels like the biggest innovation.
The glasses pair with a wristband that measures electrical signals in the wearer’s wrist via a technique called electromyography to translate hand gestures into actions. (Meta calls it the Neural Band.) You tap your thumb and index finger together to select an option, for instance. Or, to raise or lower the audio volume, pinch those fingers together and twist your wrist as if adjusting a dial. The motion felt intuitive enough that, after being shown how to do it, I didn’t need to think about it to perform it again later.
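To make that concrete, here is a minimal sketch of how a wristband might turn short windows of EMG readings into discrete gestures. Everything in it (the window length, the features, the gesture centroids) is an illustrative assumption, not Meta's actual Neural Band pipeline, which would rely on models trained on far richer data.

```python
# Toy sketch of EMG gesture recognition: reduce a short window of muscle
# activity to a few classic surface-EMG features, then match it to the
# nearest known gesture. All numbers here are made up for illustration.
import numpy as np

WINDOW_SAMPLES = 200  # e.g. 200 ms of signal at an assumed 1 kHz sample rate

def features(window: np.ndarray) -> np.ndarray:
    """Mean absolute value, root-mean-square amplitude and zero-crossing count."""
    mav = np.mean(np.abs(window))
    rms = np.sqrt(np.mean(window ** 2))
    zero_crossings = np.sum(np.diff(np.sign(window)) != 0)
    return np.array([mav, rms, zero_crossings])

def classify(window: np.ndarray) -> str:
    """Nearest-centroid lookup over hypothetical per-gesture feature centroids.
    A real system would train per-user models on labelled recordings."""
    centroids = {
        "pinch": np.array([0.40, 0.55, 30.0]),  # thumb-to-index tap: select
        "twist": np.array([0.70, 0.90, 80.0]),  # pinch-and-rotate: volume dial
        "rest":  np.array([0.05, 0.08, 10.0]),  # no intentional gesture
    }
    feats = features(window)
    return min(centroids, key=lambda g: np.linalg.norm(feats - centroids[g]))

# Simulate one window of noisy muscle activity and classify it.
rng = np.random.default_rng(0)
window = 0.6 * np.sin(np.linspace(0, 40, WINDOW_SAMPLES)) + 0.1 * rng.standard_normal(WINDOW_SAMPLES)
print(classify(window))  # -> "pinch" for this synthetic signal
```

The appeal of the approach is that the band reads intent from muscle activity directly, so gestures can stay small and subtle rather than requiring the wearer to wave their hands in front of a camera.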
“A lot of these gestures, we’re trying to do things that are relatable, like turning a dial,” Kenan Jia, device product manager for Meta Ray-Ban Display, said at a preview for press earlier this month, during which I was able to test the glasses. “So how do we use these things that you’re actually accustomed to already, these hand movements that you do in the physical world, in this kind of new form factor, in this new interface?”
During the 30-minute demo, the band worked well. There were a few times it didn’t respond correctly, but it was difficult to tell if that was because I didn’t perform the gesture properly. Ideally the band would be able to correct for minor mistakes. Still, the gestures were easy to pick up, and it wasn’t hard to imagine getting used to controlling a device that way.
That’s certainly Meta’s hope. The company officially unveiled the new Meta Ray-Ban Display at its annual Connect conference on Wednesday, after the product had been widely rumoured for weeks. The display that gives the glasses their name appears just off centre in the right lens; it’s created by a tiny projector on the inside of the frame that casts light onto the lens, which redirects it to the wearer’s eye. It’s small enough that a user can still see most of the surrounding world, and it can be toggled on and off. Meta also made sure the display isn’t easily visible from the outside of the lens, allowing you to use it privately.
The companies are treating the glasses not just as the next generation of their existing smart-glasses line but as an entirely new product, as they try to solidify their lead in the race to capture what’s shaping up to be a lucrative market.
Over the past decade, tech companies such as Google and Snap struggled to turn such devices into mainstream consumer products, largely because they packed in technology at the expense of style, producing ungainly results that looked more like gadgets than regular glasses. Meta and Ray-Ban addressed the problem with their first smart glasses, introduced in 2021, sacrificing capabilities like augmented reality to maintain the appearance of standard sunglasses, except for inconspicuous cameras on either end of the frame.
The next generation, which had a better camera and audio and added AI features such as image recognition, became a surprise success and kicked off a rush by competitors such as Apple and Google — in partnership with eyewear brands like Warby Parker and Gentle Monster — to develop their own. Some of those glasses are expected to hit the market next year and could now include displays themselves. Amazon, which had already begun developing glasses with a display for its drivers, is reportedly planning a consumer version.
For wearers of the Meta Ray-Ban glasses, the display lets them, for the first time, do things like read text messages, watch video or see a viewfinder that shows what the picture they’re capturing with their glasses will look like. It also makes use of the AI that Meta and Ray-Ban have built into the glasses, providing capabilities such as turn-by-turn navigation, live transcription during a conversation and information about objects the user sees. If the wearer wants to identify the style of a painting, for instance, they can ask Meta’s AI; the glasses will snap a picture, analyse it and provide an answer.
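That last example amounts to a simple capture-analyse-answer loop. The sketch below is a minimal illustration of that flow under stated assumptions, not a real Meta interface: the Camera and Assistant classes, their methods and the canned answer are all hypothetical stand-ins.

```python
# Illustrative sketch of the capture-analyse-answer flow described above.
# The device interfaces here (Camera, Assistant) are hypothetical stand-ins,
# not a real Meta SDK.
from dataclasses import dataclass


@dataclass
class Frame:
    jpeg_bytes: bytes  # one still image from the glasses' camera


class Camera:
    def snap(self) -> Frame:
        # On a real device this would grab the current viewfinder frame.
        return Frame(jpeg_bytes=b"\xff\xd8...")


class Assistant:
    def ask_about_image(self, frame: Frame, question: str) -> str:
        # Placeholder for a call to a multimodal model that takes an image
        # plus a text prompt and returns a text answer.
        return "This looks like an Impressionist painting."


def handle_query(question: str, camera: Camera, assistant: Assistant) -> str:
    """Snap a picture, analyse it, and return an answer for the display."""
    frame = camera.snap()
    return assistant.ask_about_image(frame, question)


print(handle_query("What style is this painting?", Camera(), Assistant()))
```

The notable design choice in such a flow is that the heavy lifting happens in the model, not on the frames: the glasses only capture and display, which helps keep the hardware light.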
In size and appearance, the Meta Ray-Ban Display is larger and bulkier than the companies’ previous AI glasses, but not overly so. It may not suit all face shapes — it looked good on others but not so much on me — yet it reads as a pair of glasses rather than a computer. Even so, it shows the challenge ahead as companies work to integrate more technology into frames.
“One thing that we always want to be sure of, as soon as we start a new product, is that we’re always thinking of the product as a [pair of] glasses first, so we are not willing to accept any kind of compromise on the design, but also on the comfort,” said Francesco Bet, product manager at Ray-Ban owner EssilorLuxottica, at the preview.
Keeping the weight down is a top concern, for example. To that end, the display glasses use lighter components, like titanium hinges, that aren’t present in the rest of the portfolio, according to Bet.
Meta and Ray-Ban don’t expect everyone to rush out and buy a pair. The aim is to target different customer segments with different products. At Connect, the companies also announced the next generation of their smart glasses without a display, with a better camera and battery life but still aimed at more casual consumers, as well as the new Oakley Meta Vanguard, a visor-style model from Oakley — Ray-Ban’s stablemate under EssilorLuxottica — designed for athletes.
The main audience for the first generation of the display glasses is early adopters. To start, they’ll be released in limited quantities, with the first launch taking place in the US on September 30 before they become available in the EU and UK early next year. They come in two sizes and two colours, sand and black, and at $799, they won’t come cheap. Over time, Meta and Ray-Ban plan to scale up production of their display glasses and introduce more styles, as they did with their previous glasses.
This suite of products gives Meta and Ray-Ban a lead in the field, but competition is expected to pick up, particularly given how much is at stake. In a note to clients on Tuesday, HSBC analysts estimated that the market for smart glasses will reach about $6 billion in 2026, rise to $36 billion by 2030 and hit $151 billion in 2040, according to their base case, which “assumes an adoption pattern in between smartphones and smartwatches in 10 years’ time.” In their most optimistic scenario, the market could soar to $241 billion in 2040.
That would be a dream for the tech giants that have long tried to make smart glasses a must-have product — and it remains far from guaranteed. But as the technology improves, the components shrink and the glasses do more while still looking like ordinary glasses, shoppers have more incentive to pick up a pair.