Reflections on Apple's rapidly evolving AI




WWDC 2018 introduced significant improvements across Apple's platforms. Here are some ideas of what to expect.

Apple Maps (soon)

We recently learned that Apple's fleet of Maps vehicles uses LIDAR (Light Detection and Ranging) sensors to capture data at the places they visit.

An easy way to understand what this technology does is to look at iPhone X, which projects thousands of tiny infrared dots onto a user's face.

This is not a LIDAR system in itself, but the technology works in a similar way, capturing many data points about a place. That data helps the AI build an image of the place, one that should, in theory, allow onboard systems to tell the difference between a streetlight and a pedestrian on a dark night.

My best guess is that in the future you will never get lost: point your iPhone camera anywhere and Maps will compare the image with its own data to tell you where you are.

Magic Sudoku

Imagine an AI that can take an image, understand it, and then calculate the appropriate response to what it sees.

This is a few steps beyond a simple data-driven query/response intelligence model, as it requires a deeper cognitive intelligence capable of responding correctly to challenges.

Magic Sudoku is an app that solves Sudoku puzzles for you: point your camera at an unsolved Sudoku puzzle and it will recognize what it is looking at and give you the correct answers in seconds.

This is a direct application of how AI and machine learning can augment your reality through the tools you already use. After all, if an app can solve Sudoku, what other problems can it solve? At what point does artificial intelligence begin to solve real-world problems, such as determining what is wrong with mechanical devices? We know that some large companies are already developing virtual reality-based maintenance manuals and consulting systems.
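The "calculate the appropriate response" half of an app like this is classical search rather than machine learning: once computer vision has read the digits off the grid, a backtracking solver fills in the blanks. A minimal sketch in Swift (my own illustration, not Magic Sudoku's actual code; the board is 9x9 with 0 marking an empty cell):

```swift
// Returns true if placing value v at (row r, column c) breaks no Sudoku rule.
func isValid(_ b: [[Int]], _ r: Int, _ c: Int, _ v: Int) -> Bool {
    for i in 0..<9 {
        if b[r][i] == v || b[i][c] == v { return false }  // row / column clash
    }
    let br = (r / 3) * 3, bc = (c / 3) * 3                // top-left of 3x3 box
    for i in br..<(br + 3) {
        for j in bc..<(bc + 3) where b[i][j] == v { return false }
    }
    return true
}

// Classic backtracking: find an empty cell, try each digit, recurse.
func solve(_ b: inout [[Int]]) -> Bool {
    for r in 0..<9 {
        for c in 0..<9 where b[r][c] == 0 {
            for v in 1...9 where isValid(b, r, c, v) {
                b[r][c] = v
                if solve(&b) { return true }
                b[r][c] = 0                               // undo and try next digit
            }
            return false                                  // no digit fits here
        }
    }
    return true                                           // no empty cells: solved
}
```

On a typical puzzle this completes in well under a second, which is why the camera-to-answer loop can feel instantaneous.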

PupCam

Snapchat for dogs? I think so, and built in a few weeks in 2017 by developer Octavian Costache using TensorFlow, Core ML (with a little help), Swift, ARKit, SpriteKit, and a lot of hard work. PupCam is available in the App Store.

It's not going to change the world by putting silly faces on pictures of animals, but it shows how Apple provides developers with the tools they need to build reasonably complex artificial intelligence projects, tools that should enable much deeper solutions to real-world problems over time.

"Just two years ago, for similar technology (on human faces), Snapchat had to buy a company for $150 million. Now, iOS ships landmark detection for free."

PupCam illustrates Apple's rapid progress toward creating and delivering pre-trained machine learning models that developers can use to build real solutions. From those models to tools for creating and managing objects and content in AR, Apple is doing everything it can to build playgrounds that accelerate AI and AR development.

Polyword

Siri in iOS 12 will translate between languages. Duolingo remains very effective at teaching them, but if you want to learn a new language, then you should really think about installing Polyword on your iPhone.

When you do, you can point your camera at an object and the app will tell you what that object is called in any of thirty different languages.
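The step that follows image classification in an app like this is a simple lookup: the vision model emits a label, and a table maps it to the user's chosen language. A toy sketch (the labels, language codes, and translations here are illustrative, not Polyword's actual data):

```swift
// Hypothetical translation table keyed by classifier label, then language code.
let translations: [String: [String: String]] = [
    "chair": ["fr": "chaise", "es": "silla", "de": "Stuhl"],
    "dog":   ["fr": "chien",  "es": "perro", "de": "Hund"],
]

// Look up what the recognized object is called in the requested language.
func translate(label: String, to language: String) -> String? {
    translations[label]?[language]
}
```

In a shipping app the table would be far larger and likely generated from a translation service rather than hard-coded, but the shape of the problem is the same.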

This is an excellent (and frictionless) illustration of the power of computer vision and machine learning. It is the same kind of combination behind the street-sign recognition and translation applications that will inevitably one day ship as standard in vehicle mapping and control systems. Your vehicle will know how fast it should be traveling even if you do not. Siri will already know what temperature you want your hotel thermostat set to.

Measure, Magic Plan, IKEA Place

Coming soon in iOS 12, Apple's Measure app will let you accurately measure the distance between two points. It is a clear sign that the company is looking for ways to represent real-world objects in virtual space.
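The arithmetic behind an AR tape measure is straightforward: ARKit resolves each tap into a 3D position in world space (in metres), and the measurement is the Euclidean distance between the two points. A minimal sketch, with `Point3` standing in for ARKit's vector types:

```swift
// Stand-in for a 3D world coordinate returned by an AR hit test.
struct Point3 {
    let x, y, z: Double
}

// Euclidean distance between two points in world space, in metres.
func distance(_ a: Point3, _ b: Point3) -> Double {
    let dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z
    return (dx * dx + dy * dy + dz * dz).squareRoot()
}
```

The hard part, of course, is not this formula but the plane detection and tracking that make the two world coordinates accurate in the first place.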

Applications like Magic Plan and IKEA Place show how such precision can be used constructively: Magic Plan lets you create floor plans, while IKEA Place lets you preview furniture in your room. Combine all three and you can meet real-world interior design challenges: planning a space, placing objects and, perhaps the biggest problem of all, making sure you can actually get larger furniture into your house.

Replacing bureaucracy

Another example of artificial intelligence combined with computer vision: Fyle lets you capture images of business receipts using your iPhone's camera, then uses optical character recognition to extract the important data. The app will also create PDF expense reports, track vehicle mileage, flag duplicate bills, and more.
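Once OCR has turned a receipt photo into lines of text, "extracting the important data" becomes a parsing problem. A minimal sketch of one such step, finding the total on a receipt (my own illustration, not Fyle's actual logic; a real parser would handle currencies, locales, and many more layouts):

```swift
// Scan OCR output line by line for a "TOTAL" line and read its trailing amount.
func extractTotal(from receiptText: String) -> Double? {
    for line in receiptText.split(separator: "\n") {
        guard line.uppercased().contains("TOTAL") else { continue }
        // Take the last token on the line that parses as a number.
        let amounts = line.split(separator: " ").compactMap { Double(String($0)) }
        return amounts.last
    }
    return nil
}
```

Chaining steps like this, recognize the text, parse the fields, file the expense report, is exactly the kind of routine paperwork pipeline the next section argues will be automated.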

It is not too difficult to see how automated accounting systems could reliably handle such tasks. This raises a question: as machines become more fluent with words and numbers, and better able to understand and execute complex tasks, how many traditionally manual jobs will be automated? Automation is not limited to Industry 4.0.

One more thing

What I am saying is not a love letter to technology, just a simple attempt to explain how fast AI development is accelerating.

Carlos Guestrin, Apple's senior director of machine learning and artificial intelligence (AI), recently spoke at the CloudTech summit. His fascinating presentation underscored how these technologies are likely to change the status quo, which is why Apple and others have to think so deeply about the ethics of what is created and how it is used.

Google+? If you use social media and are a Google+ user, why not join AppleHolic's Kool Aid Corner community and get involved in the conversation as we pursue the spirit of the New Model Apple?

Got a story? Please drop me a line via Twitter and let me know. I'd like you to follow me on Twitter so I can tell you about new articles I publish and reports I find.
