October 7, 2025

Can smart glasses save AI gadgets from a grim fate?



It's safe to say that AI gadgets are having a rough time at the moment. Humane and its AI Pin are, at this point, fully in retreat, the company dissolved and sold off to HP. Humane's counterpart, Rabbit, recently shipped a major software update for its R1 device, but that's about the only positive thing I can say about the orange gadget.

Worse, the would-be saviors of the AI gadget category, Sam Altman and Jony Ive, who announced in May their plans to build the next major device through a new company called io, also seem to be floundering. A new Financial Times report this week suggests that Altman and Ive are struggling to make their gadget (described as "palm-sized") useful, private, or even fully powered by the cloud. Not great.

That's a lot of turmoil, but despite all these misfortunes, there is one device with "AI" in the name that doesn't seem to be on the decline, and it doesn't live in the palm of your hand or on your shirt; it lives on your face. I'm talking about smart glasses, specifically Meta's AI glasses. Smart glasses are having a moment right now, and Meta is at the center of it, thanks in large part to the Ray-Ban brand. It may seem odd to group Meta's smart glasses with devices like the R1 or the AI Pin, but Meta would disagree; AI is at the center of its smart glasses pitch, especially the screenless versions.

Rabbit R1 AI gadget. © Raymond Wong / Gizmodo

If you're not familiar with the Ray-Ban Meta AI glasses, the main thing to know is that "Hey Meta" is the key to making them feel smart. There's a built-in voice assistant you can use to play music, take photos and videos, call people over WhatsApp or Instagram, and do typical voice assistant stuff like checking the weather or your battery life. That's only where "Hey Meta" begins, though. Beyond acting as a voice assistant, the Ray-Ban Meta AI glasses also do computer vision.

Thanks to the cameras and microphones on the AI glasses, you can (in theory) use Meta AI to do plenty of things an ordinary voice assistant can't, like translate text and speech, give you context on a work of art or a product in a store, or describe the things you're looking at, which is a pretty handy capability from an accessibility standpoint. What separates Meta's AI glasses from Humane's AI Pin or the Rabbit R1 is that smart glasses are a form factor people actually want to use, if sales are any indication. Meta has sold 2 million pairs since the AI glasses' release in 2023, which isn't iPhone-level, but is certainly not a bad start for a relatively new device category. Compare those figures to the 10,000 AI Pins Humane sold, and the success looks even more promising.

Don't get me wrong, people buy smart glasses for lots of reasons, almost none of them tied to AI, but that doesn't mean people won't use the AI, maybe even more than they'd use similar tools on a phone.

While companies like Google have leaned heavily on AI features like Gemini on Pixel devices, adoption has been lukewarm; the features are new and awareness is still low. There's also more friction in using computer vision on a phone specifically, because you have to take out your phone, navigate to a feature, and then do your thing. With smart glasses? Not as many barriers. Computer vision is a more natural part of how you use them, particularly because of the emphasis on voice commands rather than a touch interface. With smart glasses, your device is always out, and it's always pointed at the thing you care about. It's the form factor (and in some ways the constraints of that form factor) that gives smart glasses the edge over pins and card-sized computers, and even over the omnipresent glass slab that is your phone.

Ray-Ban Meta Gen 2. © Raymond Wong / Gizmodo

Now, whether those computer vision-focused features are sticky is another question entirely. Having used both the Ray-Ban Meta Gen 1 and Gen 2 extensively, I can tell you right now that the more advanced commands on smart glasses are hit or miss at best, even when you genuinely want to use them. I'll give you an example.

I'm at the beach with my mom, hunting for shark teeth in the surf, and I come across a shell that looks an awful lot like a tooth. It looks like a shark tooth, but how can you really be sure? With the Ray-Ban Meta glasses already on my face, I look at the shell (or tooth) in my hand and ask, "Hey Meta, how can I tell if this is a shark tooth?" The answer? Yes, you're holding a shark tooth. Great! The only thing is that every other black shell I picked up that was not a tooth was also a tooth, according to the glasses. Not so great.

On the one hand, my request was a challenging one; on the other, it's exactly the sort of query these glasses should shine at. But satisfying answer or not, the fact that I even bothered to use Meta AI says a lot, and that's more than most AI devices can claim. Getting people familiar with AI features is a major part of the battle for AI-centric devices, and training, or God forbid, retraining, people to use a device is hard. For Meta and its increasingly crowded lineup of smart glasses, there's still a long way to go before Meta AI becomes truly useful, but in the pantheon of AI gadget failures, smart glasses might actually be the exception.

