The Best Use for Smart Glasses Might Have Nothing to Do With Entertainment

Smart glasses might be relatively new as a category, but I’ve already seen them pitched in a lot of ways. There are smart glasses for gaming, smart glasses for music, smart glasses for sports, smart glasses for productivity; the list goes on. As interesting and sometimes useful as some of those applications can be, one category feels like it may be truly groundbreaking, and it has more to do with regular glasses than smart ones. I’m talking about accessibility.

At CES 2026, I noticed quite a few smart glasses trends, but one of the most interesting was that smart glasses seemed to be leaning into their potential as aids for people with low vision or hearing loss. Companies like eSight, for example, are pitching a new crop of smart glasses as a way to help people with central vision loss.

The FDA-registered eSight Go glasses are designed to process images in real time and then shift “visual information” to the periphery of their dual OLED display, where people with central vision loss can still see. On top of that, the display can be optimized in other ways for the wearer, offering up to 24x magnification, image stabilization, better contrast, and even color adjustments for tasks like reading. I don’t have a visual impairment, so it’s difficult for me to say how well smart glasses like the eSight Go work, but the idea feels as game-changing as any other smart glasses feature I’ve seen to date, and the company isn’t alone in its pursuit.

Other entrants, like Cearvol and its Lyra smart glasses, are gearing their specs toward hearing. At CES, Cearvol showed off what it’s calling NeuroFlow AI 2.0 Technology, which, despite its cringey name, could be useful for people who are hard of hearing. According to Cearvol, the hearing tech, which attaches to a pair of smart glasses, uses microphones and a neural network to “analyze acoustic environments in real time,” enhancing speech while reducing background noise. Importantly, it can also reduce the sound of your own voice to make conversations sound more natural.

Again, I don’t have a hearing impairment, so I’m not the market for these glasses, but one can see the potential. Hearing aids obviously already exist, but not everyone likes wearing something in their ear. If you’re the type of person who already wears glasses most (or all) of the time, this could be a viable alternative to a hearing aid: a form factor that’s more comfortable and leaves you with one less thing to keep track of.

Those products are focused squarely on accessibility, but there’s potential for more general-use smart glasses, too. Meta’s Ray-Ban AI glasses and the Meta Ray-Ban Display, for example, are also leaning into accessibility, recently launching features like “conversation focus,” which uses their built-in microphones to amplify speech.

Then there’s computer vision. As interesting as it might be to look at a menu in a different language and have your smart glasses translate it into your native tongue in real time, that’s not what jumps out at me when I think about groundbreaking features. Computer vision is vision, after all, and the ability to combine AI and cameras to tell people what’s in their environment without them actually having to see it feels like it could be genuinely helpful.
Accessibility, of course, isn’t as broadly appealing as smart glasses that promise to make you more efficient or smarter or make your life easier, so it’s no wonder the use case tends to take a backseat. But even though it’s rarely pitched up front as a game-changer, accessibility might just wind up being the most useful thing you can do with a pair of smart glasses, even if being more useful than Meta AI is a relatively low bar.