Microsoft on Wednesday released Seeing AI, an iPhone app that attempts to analyze its surroundings and describe them audibly for people with impaired vision.
Using neural network technology, the app can not only read text aloud but recognize people and currency, scan product barcodes, and offer a simplified description of an entire scene or imported image. For barcodes and text recognition, audio cues guide users toward getting a solid lock. Some basic functions work without an internet connection.
When analyzing people, the app will not only try to name them if possible, but also share details like estimated age, distance from the camera, and emotional state.
Microsoft has been working on the app since Sept. 2016, and first demonstrated a prototype this March.
Seeing AI is a free download, and requires at least an iPhone 5c running iOS 10 or later. While it's currently available in the U.S., Canada, India, Hong Kong, New Zealand, and Singapore, the only language supported so far is English.
5 Comments
Seems like a great app, but I cannot get anything to work without it crashing. Then again, I'm running the iOS 11 public beta on a 7+, so maybe it's me.
While it works on mine (7+, iOS 10), the app is a bit clunky. It seems to be very much at the beta stage. There are multiple steps involved with each type of visual (text, product, person). In the case of "product," for example, it can only recognize barcodes.
Nice start though! Kudos to Microsoft for thinking up something like this. (Maybe there are others in this space who do a better job, I don't know).
What a great concept.
I would love to test this, but I just can't bring myself to put Microsoft software on my phone. It's a great idea, and I know it's a "different" Microsoft now, but still. Can't. Do it.
My experience with the app is abysmal, to say the least. Great concept, but unpolished execution.