Apple is using artificial intelligence to carve out new features

From the Watch to apps on your iPhone and iPad, Apple is leaning harder on AI as it steps onto the battlefield


Apple Watch.

Artificial intelligence is the buzzword in almost all things tech these days, and Apple is getting its say in the matter. It was Apple that started the virtual digital assistant craze with Siri on the iPhone, many years ago. But while Siri improved only incrementally, it was overtaken by rivals such as Google's Assistant, and it faces competition from Amazon's Alexa and Microsoft's Cortana too. That isn't to say any one of these is the definite winner just yet (far from it), but the battle lines have been redrawn.

Apple is now using machine learning more than ever before, from Apple Watch faces to face recognition in Photos. Developers will get a new set of APIs (application programming interfaces) to integrate these capabilities into their own apps. It really is game on now.

Apple says as many as 375 million iOS devices use Siri every month, globally. Once iOS 11 rolls out and you install it on your iPhone or iPad, Siri will leverage deep learning techniques to add new features and constantly learn your preferences. In fact, whatever it learns on one device will be synced to all other Apple devices you own, so the same experience is available across all of them. Siri will also offer multiple suggestions in response to your queries, to better suit the personality sketch it draws up about you over time.

It is not just phones and tablets. Siri will bring the same AI capabilities to the HomePod smart speaker. The ability to talk to you and respond, to curate an Apple Music playlist for you, and to modify sound output using positional awareness all come from different machine learning and AI techniques.

All iPhones and iPads running iOS 11 will also get a refreshed Photos app, with machine learning to identify events, memories and more. For most consumers, sorting photos by time, date and even events will be the most-used AI feature: who would mind the magic of a friend's birthday party pictures being sorted automatically, without the Photos app being given any context or data by the user? AI will also let users apply new visual tricks to the Live Photos they capture, including a long-exposure blur effect and a bounce effect clip.

iOS 11 will integrate AI more deeply across multiple apps and facets of the software than ever before. It is not about any one app, but about smarts spread across the apps users access regularly.