October 7, 2025


There's one thing people want to know when they see my first-generation Ray-Ban smart glasses, and it has nothing to do with AI, cameras, or the surprisingly big sound from their off-ear speakers. They want to know the thing that's probably on your mind right now as you read this: is there a screen in them? The answer? Sadly, no… at least so far.

At Meta Connect 2025, Meta finally unveiled its Ray-Ban Display smart glasses, which, as you may have gathered from the name, have a screen in them. On the surface, that doesn't sound like much; we have screens everywhere, all the time. Too many, in fact. But having used them ahead of the unveiling, I regret to inform you that you will most likely want another screen in your life, whether you know it or not. But first, you probably want to know exactly what goes on this screen I keep talking about.

The answer? Apps, of course. The display, which is actually in color and not monochrome as earlier reports suggested, acts as a heads-up display (HUD) for things like notifications, navigation, and even photos and videos. For the full specs of the display, you can read my hands-on companion piece here. For the moment, though, I want to focus on what looking at this screen is actually like. The answer? A little shocking at first.

Meta Ray-Ban Display smart glasses
© James Pero / Gizmodo

While the Ray-Ban Display, which weighs 69 g (about 10 grams more than the first-generation screenless glasses), does its best not to shove a screen in front of your face, it is still very much there, hovering like a real-life Clippy, ready to distract you with a notification at any moment. And whatever your feelings on whether smart glasses having a screen is a good thing, the display is the reason you'd spend $800 on a pair. Once your eyes adjust to the screen (it took me about a minute), you can get cracking. This is where the Meta Neural Band comes into play.

The Neural Band is Meta's sEMG wristband, a piece of technology it has been showing off for years, now shrunk down to the size of a Whoop fitness band. It reads electrical signals in your wrist to register pinches, swipes, taps, and wrist turns for the glasses. I worried at first that the wristband might be clunky or too conspicuous on my body, but I can report that this is not the case; it's about as lightweight as it could be. The smart glasses were also light and comfortable on my face, though they are noticeably thicker than the first-generation Ray-Bans.

Meta Ray-Ban Display
© James Pero / Gizmodo

More important than being light and subtle, it's very responsive. Once the Neural Band was snug on my wrist (it was a little loose at first, but better after I adjusted it), using it to navigate the UI was fairly intuitive. An index-finger-and-thumb pinch is the equivalent of "select," a middle-finger-and-thumb pinch is "back," and to scroll, you make a fist and then slide your thumb across it, as if it were a mouse made of flesh and bone. It's a bit Vision Pro and a bit Quest 3, but without the hand tracking those require. I won't lie to you: it feels a little like magic when it works fluidly.
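
To make the control scheme concrete, here's a minimal sketch, in Python, of how a gesture vocabulary like the Neural Band's might map onto UI actions. To be clear, this is a hypothetical illustration based on the gestures described above, not Meta's actual API (which isn't public); the Gesture names, dispatch function, and ui interface are all my own inventions.

```python
# Hypothetical sketch of Neural Band-style gesture dispatch.
# Not Meta's API; all names here are illustrative assumptions.

from enum import Enum, auto

class Gesture(Enum):
    INDEX_PINCH = auto()   # index finger + thumb: "select"
    MIDDLE_PINCH = auto()  # middle finger + thumb: "back"
    THUMB_SWIPE = auto()   # thumb sliding across a closed fist: scroll
    WRIST_TURN = auto()    # rotating the wrist like a dial: e.g., camera zoom

def dispatch(gesture: Gesture, ui) -> None:
    """Route a recognized sEMG gesture to the matching UI action."""
    if gesture is Gesture.INDEX_PINCH:
        ui.select()
    elif gesture is Gesture.MIDDLE_PINCH:
        ui.back()
    elif gesture is Gesture.THUMB_SWIPE:
        ui.scroll()
    elif gesture is Gesture.WRIST_TURN:
        ui.adjust_dial()

if __name__ == "__main__":
    # Stub UI so the sketch runs standalone.
    class StubUI:
        def select(self): print("select")
        def back(self): print("back")
        def scroll(self): print("scroll")
        def adjust_dial(self): print("zoom")

    dispatch(Gesture.INDEX_PINCH, StubUI())  # prints "select"
```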

Personally, I still ran into some variability with inputs (you may have to attempt a gesture once or twice before it registers), but I'd say it works well most of the time, at least far better than you'd expect from a literal mind-reading device. I suspect the experience will only get smoother over time, and better still once you actually train yourself to navigate the UI properly. Not to mention the applications down the road! Meta already plans to launch a handwriting feature, though it won't be available at launch. I got a first-hand look… sort of. I couldn't use the writing feature myself, but I watched a Meta rep use it, and it seemed to work, though I have no way of knowing how well until I try it for myself.

Meta Ray-Ban Display
© James Pero / Gizmodo

But enough about the controls; let's move on to what you actually do with them. I was able to briefly try everything Meta's Ray-Ban Display has to offer, and that includes the range of phone-adjacent features. One of my favorites is taking photos in POV mode, which puts a window on the glasses' display showing exactly what you're taking a photo of, right in the viewfinder. Finally, no more guess-and-check when you take photos. Another "wow" moment here is the ability to pinch your fingers and rotate your wrist (as if turning a dial) to zoom. It's a subtle thing, but you feel like a wizard when you can control a camera just by waving your hand.

Another notable feature is navigation, which pulls up a map on the glasses' display to show you where you're going. Obviously, I was limited in how well I could test this feature since I couldn't walk around with the glasses during my demo, but the map was fairly legible and bright enough to use outside (I tested this thing in sunlight, and the 5,000 nits of brightness was sufficient). Meta leaves it up to you whether to use navigation while you're in a vehicle or on a bike, but the glasses will warn you about the dangers of looking at a screen if they detect that you're moving quickly. It's hard to say how distracting a HUD would be while cycling, and that's something I plan to test fully.

Meta Neural Band
© James Pero / Gizmodo

Another interesting feature you might actually use is video calling, which pulls up video of the person you're calling in the bottom-right corner. The interesting part is that the call is POV for the person on the other end, so they can see what you're looking at. It's not something I'd do in every situation, since the person you call usually wants to see you and not just what you're looking at, but I can confirm that it works, at least.

Speaking of working, there's also a live transcription feature that can listen to your surroundings and superimpose what the other person is saying on the smart glasses' display. I had two thoughts while using this feature. The first is that it could be a game changer for accessibility: if your hearing is impaired, being able to actually see a live transcription could be extremely useful. Second, a feature like this could be ideal for translation, which Meta has, in fact, already thought of. I didn't get the chance to use the smart glasses to translate another language, but the potential is there.

One problem I foresee here, however, is that the smart glasses could pick up other conversations happening nearby. Meta thought of that too and says the Ray-Ban Display's microphones actually beamform to focus only on whoever you're looking at, and I got the chance to test this. While a Meta rep spoke to me in the room, others carried on their own conversations at a fairly normal volume. The results? A little mixed. While the transcription focused mostly on the person I was looking at, it still picked up stray words here and there. That seems like an inevitability in noisy scenarios, but who knows? Maybe beamforming and AI can fill in the gaps.
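
For the curious, beamforming is the classic trick behind this kind of microphone focus. Below is a toy delay-and-sum beamformer in Python showing the general idea: delay each mic's signal so that sound arriving from one chosen direction lines up and reinforces itself, while off-axis chatter partially cancels. This is purely illustrative; the sample rate, function names, and conventions are assumptions, and Meta's actual processing is surely far more sophisticated.

```python
# Toy delay-and-sum beamformer; an illustration of the general
# technique, not Meta's implementation.

import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
SAMPLE_RATE = 16_000     # Hz, assumed

def delay_and_sum(signals: np.ndarray, mic_positions: np.ndarray,
                  direction: np.ndarray) -> np.ndarray:
    """Align and average mic signals so sound from `direction` adds
    coherently while off-axis sound partially cancels.

    signals: (n_mics, n_samples) array of recorded audio
    mic_positions: (n_mics, 3) coordinates in meters
    direction: unit vector pointing from the array toward the talker
    """
    n_mics, n_samples = signals.shape
    out = np.zeros(n_samples)
    for m in range(n_mics):
        # Mics closer to the talker hear the wavefront first;
        # delay them (in whole samples) so all channels line up.
        delay = int(round((mic_positions[m] @ direction)
                          / SPEED_OF_SOUND * SAMPLE_RATE))
        # np.roll wraps samples around the ends, which is fine for a toy.
        out += np.roll(signals[m], delay)
    return out / n_mics
```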

Meta Ray-Ban Display
© Meta

If you're looking for a killer feature in Meta's Ray-Ban Display smart glasses, I'm not sure there necessarily is one, but one thing I do know is that pairing the glasses with the Neural Band just might be a game changer. Navigating a UI on smart glasses has been a constant problem in this space, and so far I haven't seen anyone really solve it. But based on my first demos, I'd say Meta's "brain link" wristband could well be the breakthrough we've been waiting for, at least until hand or eye tracking at this scale becomes possible.

I'll know more about how it all works once I get the chance to use the Meta Ray-Ban Display on my own, but for now, I'd say Meta is still clearly the frontrunner in the smart glasses race, and its lead just got a whole lot bigger.


