
    Can Meta Glasses Guide the Blind?

    A recording artist born with impaired vision used the Ray-Ban AI glasses to help navigate during a cross-country trip. Here’s what she found.

    Lachi wearing Ray-Ban Meta smart glasses. Photo: Lachi Music

    Lachi is a recording artist, record producer, author, and disability culture advocate who is legally blind as a result of a congenital eye condition called coloboma. As part of a planned expansion of CR’s product accessibility coverage, we asked Lachi to evaluate Meta AI glasses as a navigational tool while she traveled the country last year. This is not a formal product evaluation, but CR did buy the glasses at retail just as a consumer would.

    As a touring artist constantly on the go, I’m always exploring new tools for navigating the visual world independently. So when Consumer Reports asked me to test and reflect on the Ray-Ban Meta AI glasses, I was intrigued. I’m what the blind world calls a "high partial"—a blind individual with a pinch of usable vision. Since my sight can’t be corrected with a prescription, I rely on adaptive tools like magnification, screen readers, and my rhinestoned Glam Canes as I outpace my sighted friends.

    These sleek, stylish Ray-Ban frames promised to extend my independence, offering capabilities like photo recognition, real-time object descriptions, and voice interaction. So I brought ’em along for my recent travels from my home base in New York City to Los Angeles and Mississippi to see if they would offer practical support in my day-to-day. What I found was a mixed bag of impressive features, marked limitations—particularly when viewed through the lens of accessibility and data equity—and great potential.

    First Impressions of the Ray-Ban Meta AI Glasses

    Let’s start with the unboxing—often a real hassle for blind folk. The packaging was cleanly designed and, once I got it open, well put together, but that tiny pull tab was no joke—I had to search for it like it owed me money! A simple design switch-up there, like making that pull tab larger or more tactile, could go a long way for a product being marketed to the blind community. Once I found the tab and pulled it, the glasses emerged. These babies are light and sit comfortably.

    Once they were charged and connected via Bluetooth to the Meta AI app, I took ’em out for a spin. Off the bat, I was pleasantly surprised by how well the visual assistant feature captured and described the scene. “Hey Meta, what am I looking at?” became a go-to phrase, prompting the glasses to snap a picture and give audio feedback right to my ears. It recognized a car dashboard, gave a shout-out to the infotainment center, and even correctly guessed the vehicle was a luxury model. Is it because they knew Lachi only travels in style? Who’s to say?

    Ray-Ban Meta smart glasses.

    Photo: Consumer Reports

    Speaking of Privacy

    Speaking of it knowing way too much, the process to get the glasses up and running requested access to my call history, photo gallery, music, contacts, and more—which quickly turned my flags from rose-colored to red. In a time when concerns for data privacy are right up there with the cost of eggs, the level of access required to use the darn things effectively made me pause. While I’m no stranger to sharing my life online—hello Instagram—I’m like, why does a pair of shades need my entire photo history and call logs?

    Here’s a weird little incident. I asked about the weather, and it gave me the forecast for Los Angeles—even though I hadn’t enabled location settings. So how did it know where I was? Possibly Wi-Fi triangulation or some other digital breadcrumb. But then, when I asked it, “Where exactly am I?” it told me to turn on location settings. Babe, you’re giving me mixed signals. Literally! [Editorial note: Meta did not respond when CR offered it the opportunity to comment on this and Lachi’s other assessments.]

    Interestingly, the glasses could describe people’s clothing, hair color, facial hair, accessories, and even the brands of people’s shoes and phones. But ask about someone’s gender or race, and you get either the silent treatment or some version of “Not today, my friend.” I get the intent: privacy, safety, bias mitigation. But as a blind person, I felt the absence of those visual identifiers. If I’m trying to find a friend in a crowd or understand an image, those details are more than just curiosities—they’re part of painting a fuller picture of the world around me. I do like that the shades are integrated with the Be My Eyes app, an online platform where sighted volunteers can be piped into blind folks’ smartphones to help them navigate or read something on their screen or from their camera. The app’s "Call a Volunteer" option can be activated from the settings for those who’ve relied on or prefer that service.

    I enjoyed that the glasses allowed me to go toe-to-toe with my mortal enemy—street signs. They were able to tell me the street corner I was on while on a walk, and could even catch signs I was passing while riding in a car service so I knew how far along we were. However, when I asked them to read a license plate, they couldn’t do it—even though they could read a street sign at the same distance. I wondered if Meta is making deliberate choices about what info it will or won’t deliver. If I witnessed a crime, how would I describe the suspect and their getaway license plate? Perhaps I could just take a picture, or do like any true red-blooded New Yorker and "play blind."

    Conversations and Context With Meta Glasses

    Another checkmark in the plus column: when I asked the glasses to suggest a response to a text thread, they offered not one, not two, but a whole batch of cute, casual replies—from playful banter like “Haha, same here!” to more engaging prompts like “Ask how she’s doing.” The range of responses felt personal and intuitive. For someone like me who’s often on the go, juggling music, advocacy, and constant communication, that kind of conversational assistance is a good’n.

    I caught a bad case of the tech hiccups when I asked the glasses to help me change the voice setting. The AI got a little sassy, like a slightly annoyed IT guy. “Go to settings,” it repeated. “Choose language and voice,” it repeated (subtext: “like … obviously”). What it didn’t tell me was how to actually find those buried settings. After toiling through a maze of submenus within submenus, I finally found the setting myself. The AI had no interest in walking me there—just insisting it was “under settings.” Typical tech support vibes.

    One last thing about voice—more specifically, my voice. There’s no voice recognition with the glasses. This may be a missed opportunity. If the glasses could recognize my voice and only respond to me, it would be a step closer to real independence, and a deterrent for theft.

    Independence vs. Interdependence

    Navigating airports offered a different layer of insight. At the New Orleans airport, I asked the glasses to help me find my gate. When I looked at a departure board showing multiple flights to New York, the AI guessed one and—lucky for this traveler—it happened to be correct. What it should have done is tell me there were three flights, ask a question to determine which of the three was relevant, or make clear the answer was ambiguous, saying something like “there are three flights, one of which leaves at such and such time from such and such gate.” But the pure confidence it displayed when taking a 1-in-3 chance at a correct answer underscores that the glasses can “see,” but may not fully understand context quite yet. And airports are busy, hectic places to stand around asking multiple questions in a row just to orient yourself. I’d love a future where I could simply say, “Hey Meta, I’m at Louis Armstrong Airport—navigate me to the correct gate for flight number 123.”

    Navigating the different airports with the glasses made me reflect on what independence actually looks like. Yes, the glasses can be super helpful. But I still needed my Glam Cane. Still had to rely on my own spatial awareness as a high partial. Absolutely needed Wi-Fi—not always available in airports, subways, rural areas. And I needed a good chunk of time to navigate the public space. Honestly, I often find quicker success roping fellow humans into my quest—airline staff, fellow friendly travelers. But here’s the thing: I don’t view that human interaction as a failure of independence—instead it is an extension of it. I’ve always believed in the power of interdependence. To me, interdependence is independence. I use my charm, my wit, my smile. I strike up conversations and build little human bridges that help me get from Uber to gate faster than any gadget could. And sometimes I leave with a new friend or a couple extra Instagram follows.

    Final Thoughts on the Ray-Ban Meta AI Glasses

    All in all, a positive experience. The Ray-Ban Meta smart glasses aren’t perfect—but they are promising. For blind users like me, they offer a fresh and stylish go at leveling the playing field. They can describe objects, suggest texts, and even recognize street signs and dashboards, and with deep integrations between one’s phone and the shades, they can become a pretty formidable AI assistant. But they also rely heavily on internet access, require significant personal data (ah, the price we pay for convenience!), and raise questions regarding full visual equity when excluding details like race, gender, and even eye color from their descriptions.

    Ultimately, the glasses didn’t replace a sighted companion—and maybe they weren’t meant to. But they did serve as a helpful tool, a conversation starter, and a glimpse into a future where wearable AI could truly transform how we experience the world. The challenge will be building that future with accessibility, privacy, and nuance at its core right from the start.

    So, would I recommend the Ray-Ban Meta glasses? For blind and low-vision users, they’re a fashionable, functional, smart, and exciting peek into what’s possible—but they’ve got a ways to go for more fully integrated independence. They represent a step forward. A new way to engage with the world.

    But until the tech can keep up with the full picture? I’ll keep rocking both—smart glasses in one hand, and my Glam Cane, my community, and my problem-solving skills in the other.


    Lachi

    Lachi is an award-winning recording artist and author, music executive, disability inclusion advocate, host of the PBS series “Renegades,” and Recording Academy (Grammys) National Trustee. Born legally blind, Lachi created the organization RAMPD (Recording Artists and Music Professionals with Disabilities), collaborating on disability-inclusive solutions with the likes of Netflix, Live Nation, and more, while bringing career opportunities to music creatives with disabilities.