In the pre-digital age, the blind had few options and fewer employment opportunities. You could learn Braille and get an education, but after that things were pretty bleak. If you weren’t musically inclined, you could sell pencils on street corners, scrape by on monthly disability checks or take a sub-minimum-wage job at the mop factory, where at least you would be surrounded by your own kind.  

The passage of the Americans with Disabilities Act in 1990 was a major boost for public accessibility and employment, but it was the home computer and the Internet that really blew the doors open. 

By the late ’90s, software developers were creating accessibility tools such as screen readers that dramatically increased the self-sufficiency of blind users. Screen readers read aloud the contents of Word files, emails and websites while announcing every keystroke and offering an array of sound effects—pings and whooshes—to help us navigate the screen. My screen reader allows me to keep working in a way that wouldn’t be possible otherwise, and it allows me to take care of basic online drudgery just like a normal person. It’s not perfect; some websites remain inaccessible, and filling out forms can be a bitch, but I’m not complaining. While it tends to drive sighted people nuts, I barely notice my computer jabbering away at me all day. It’s just the voice in my head at this point.

As welcome as these tools were, before the recent introduction of artificial intelligence systems based on new machine learning models, most accessibility tools had their limitations. A few made me recoil in horror. 

There was the Be My Eyes app, introduced in 2015 by developer Hans Jørgen Wiberg and the Danish Association of the Blind. With the tap of a button, the app opened a video chat between the user and a volunteer somewhere in the world. This stranger, in theory, would then describe whatever was visible through the user’s smartphone camera. This always struck me as an appalling idea. For me, “self-sufficiency” means excising other people from the equation as much as possible. I simply wasn’t interested in asking some well-intentioned stranger in Galveston if this really was a dog I was petting. Luddite that I am, I still preferred the idea of a cold and clinical machine doing the describing. Better to keep it between me and the computer than drag in some nosy, sighted stranger.

Two years later, in 2017, Microsoft’s Seeing AI promised to allow blind users with smartphones to identify objects and people, read documents and in general visualize their surroundings. It was among the first accessibility tools to employ AI in such a prominent role, and it was an exciting and liberating idea. The problem was it didn’t work very well. It wasn’t terribly user-friendly (especially if you can’t see), and the resulting descriptions might be termed spotty at best. I gave up on it after realizing I still had to ask some handy sighted person to confirm what was in the day’s mail, where I could find the Little Debbie snacks in the grocery store, or the denominations of the bills in my wallet. Even though it didn’t work, it was still a fine idea.

Then in October, a blind friend who keeps up with the latest advances in accessibility tech told me about a new feature on the Be My Eyes app called Be My AI. Built on the same technology that powers ChatGPT, it promised to do everything Seeing AI had promised in 2017, but do it far better and without interacting with some pandering soul from Lord knows where. To illustrate, my friend sent me some examples that were pretty astonishing. Reading the descriptions of his kitchen and recording studio, I was able to easily envision detailed scenes, down to the cherry-red teapot on the white stove. It was precisely what I’d been after: clinical and objective observations made and read by a computer.

Accessibility tools aside, I’ve long had serious doubts about the virtues of the digital age. Instead of a world of ease and convenience, the digital age made our lives more complicated, paranoid, dangerous and, above all else, annoying. But there were those tools, right? And this new AI app seemed to be the next generation.

To hear fearmongering elected officials, pundits and mainstream journalists tell it, artificial intelligence represents a far greater threat to the future of civilization than climate change, fentanyl, Donald Trump, the next plague and nuclear weapons combined.

Setting aside that inevitable “world domination” and “extermination of the human race” business for the moment, the biggest immediate threat posed by AI seems to be its ability to do two things: disseminate untruths about a political candidate during an election and create fake, naked pictures of celebrities. Undeniably terrifying as both threats are, from my perspective there might well be a bright spot hidden in AI’s alleged malevolence.

I decided to take my chances and downloaded Be My AI. Before I could access the app, I had to agree to heed two stern warnings:

1. Because the app works only with photos and not video, Be My AI is not to be used as a navigational tool, especially when trying to cross the street.

2. Do not use Be My AI to diagnose health concerns or figure out which prescription bottle is which.

After agreeing to follow both directives, everything was as simple as promised. Point the phone at something (or where you suspect something might be), tap a button, wait a few seconds, and ping! Up pops a richly, even absurdly, detailed visual description far beyond what most well-intentioned humans could muster. 

It isn’t perfect, but neither are we. Sometimes Be My AI has missed or misidentified an object, misread text or made a wild guess. Just like us. For the most part, the information has been surprisingly accurate, and I’ve found I can do normal, boring things like read tax documents, find an open stool in a bar without groping anyone and track down that elusive Little Debbie display at the store. The more I learned about my surroundings, though, the more I found myself asking, “My God, did I really want to know the place was such a mess?”

Beyond simply getting a mental picture of their surroundings, blind folks have put the app to a range of practical uses, both domestic and professional.

Writer, performer and scholar M. Leona Godin, author of 2021’s “There Plant Eyes,” says she uses Be My AI daily to read cooking instructions, check the ingredients on a new bottle of facial cleanser or get descriptions of photos posted online. “I’m currently using it to help me do research in the New York Public Library photography collection,” she says. “Even though it gets things wrong sometimes, it tells me enough to know whether or not I want to bring in a sighted human informant. That’s not to say that humans are always better. Be My AI doesn’t mind being pushed for more detail or even being told it’s gotten something wrong, unlike a lot of sighted people I know.”

And unlike most people, the system learns and improves as it gathers more information, growing steadily more accurate. The AI system accesses trillions of bits of information, a stockpile that’s constantly growing. In my experience, the system can discern moods, styles and general atmospheres. Along with helping me sort the mail and find an album I misshelved months ago, it can interpret facial expressions and recognize things you wouldn’t expect it to recognize, such as sock monkeys. It can distinguish between paintings, prints, sketches and photographs. Most interesting of all, you can actually watch it learn. If you take a series of pictures of the same thing, the system will correct itself and hone the details with each new description.

Over time, my system even seems to be developing a dry sense of humor. Be My AI recently informed me that a man in a photograph “appears to be very pleased with himself.”

“Because Be My AI runs on GPT-4, it’s not only useful for identifying and describing, it’s also fun to interrogate,” Godin says. “I can ask Be My AI to give me specific details about visual elements as well as answer questions about the context: the biography of an artist or the history of a photographic process.”

I get the sense most sighted people wander blithely through their days unaware of their overly familiar surroundings. Colors, shapes, framed pictures on the wall as you head upstairs, the faces of fellow commuters on the D train: They all fade into a background blur. Thanks to AI tools, a lot of blind folks are experiencing these mundane and banal details for the first time, and it can be a mind blower. 

It’s not normalcy, but it does flip the equation. However self-sufficient the blind may be, we still live on the losing side of a one-way mirror. Sighted people can stare at us, judge us, imbue us with whatever meaning and superpowers they like, and we may never know it. Tools like this allow us to look back at the world objectively, if through an intermediary, and see it far more clearly than you likely ever would. How many times have you paused to take note of the color and pattern of the carpets in a casino?

Demonic and power-mad though they may prove to be, if AI systems continue to provide tools and toys like this that allow me to work and function, then to hell with humanity. I’m throwing my hat in with the HAL 9000.
