Last year, Google announced a new app called Lookout, designed to help visually impaired people. The application uses artificial intelligence to identify objects through your phone's camera. It can also read text on signs and labels, scan barcodes and identify currency. This week, Google announced that Lookout is finally available for download – but only on Pixel devices in the United States.
Since announcing the app last year, Google says it has "tested and improved the quality" of its results. The company warns that, as with any new technology, Lookout's results won't always be "100% perfect", but it will solicit feedback from early users.
To use Lookout, Google recommends that users wear their Pixel device on a lanyard around the neck or place it in the front pocket of a shirt or jacket. That way, the phone's camera has an unobstructed view of the world and can identify objects and text "in situations where people might usually have to ask for help".
It's unclear when Lookout will come to non-Google hardware, but the company says it hopes to bring it to "more devices, countries and platforms".
Of course, this isn't the first time a big tech company has applied artificial intelligence to helping the visually impaired. In 2017, Microsoft launched an app with very similar features called Seeing AI. This week, the Redmond-based company announced an update to Seeing AI that lets users feel the shape of objects on their phone's screen through haptic feedback.