How Artificial Intelligence has Evolved in iOS and Where it May be Going

With the recent news that Apple is building a chip dedicated to artificial intelligence tasks on its mobile devices, it’s interesting to look back at where it all began for AI at the company — 10 years ago this June.

We probably won’t know more about the “Neural Engine” chip until the iPhone 8 comes out later this year, but some speculate it could help with speech recognition, facial recognition, augmented reality, and overall performance.

Until we see what is to come, let’s look at how we got here.

iPhone (2007) / iPhone 3G (2008), 3GS (2009)

The original iPhone was released on June 29, 2007. It introduced a slew of innovations, including a touch screen, visual voicemail, and fast Internet access. One of the first examples of AI could be seen in the touch keyboard. It used complex algorithms and machine learning to predict text and correct mistakes, making it much easier and more efficient to send messages and type emails.

The iPhone 3GS also added voice control, a very early, rudimentary precursor to Siri.

iPhone 4 (2010), 4S (2011)

Speaking of Siri, the AI-powered virtual assistant became integrated into iOS in 2011. While it wasn’t the first of its kind, Siri took virtual assistant standards to another level, leveraging machine learning to improve four key areas of functionality.

  • Speech recognition (to understand when you talk to it)
  • Natural language understanding (to grasp what you’re saying)
  • Execution (to fulfill a query or request)
  • Response (to talk back to you)
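As a toy illustration, those four stages can be chained into a single pipeline. Everything here — the function names, the tiny intent table, the canned answers — is hypothetical and vastly simplified, not Apple’s actual Siri architecture.

```python
def recognize_speech(audio: str) -> str:
    # Stage 1, speech recognition. Stand-in: assume the audio
    # has already been transcribed to text.
    return audio.lower().strip()

def understand(text: str) -> str:
    # Stage 2, natural language understanding: map free text
    # to an intent using made-up keyword rules.
    if "weather" in text:
        return "get_weather"
    if "timer" in text:
        return "set_timer"
    return "unknown"

def execute(intent: str) -> str:
    # Stage 3, execution: fulfill the request.
    handlers = {"get_weather": "It's sunny today", "set_timer": "Timer set"}
    return handlers.get(intent, "Sorry, I didn't catch that")

def respond(result: str) -> str:
    # Stage 4, response: "speak" (here, just return) the answer.
    return result

def assistant(audio: str) -> str:
    return respond(execute(understand(recognize_speech(audio))))

print(assistant("What's the weather today?"))  # → It's sunny today
```

The point of the sketch is the separation of concerns: each stage can be improved (or swapped for a learned model) without touching the others, which is what lets assistants like Siri upgrade one layer at a time.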

iPhone 5 (2012) / 5C, 5S (2013)

With the launch of a new generation of iPhone, updates to Siri followed. The software was improved to include the ability to make restaurant reservations, launch apps, dictate Facebook or Twitter updates, retrieve movie reviews, and detail sports statistics.

For Messages and email, speech-to-text was introduced, along with a more refined predictive and suggestive keyboard. A multi-language spell-checker piggybacked on advances in machine learning technology to recognize regional accents of different languages.

Apple developed its own Maps app and added turn-by-turn spoken navigation, 3D views in some cities, and real-time traffic status. With Siri integrated into Maps, users could audibly ask for directions and get guidance while driving.

The Passbook app used AI to offer context-aware features such as notifications for relevant coupons when in the immediate vicinity of stores. Passbook could alert users whenever it thought they might need one of their passes — notifications could be location-based, time-based, or both, depending upon the type of items stored in the app.
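The location-based side of that logic can be sketched in a few lines: compute the distance between the user and the store, and fire a notification inside some radius. The pass structure, coordinates, and 200-meter radius below are made up for illustration; the real feature runs through iOS location services and geofencing.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two coordinates, in kilometers.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def should_notify(pass_item, user_lat, user_lon, radius_km=0.2):
    # Location-based rule: alert when the user is within radius_km
    # of the store attached to the pass.
    store_lat, store_lon = pass_item["store_location"]
    return haversine_km(user_lat, user_lon, store_lat, store_lon) <= radius_km

coffee_coupon = {"name": "Coffee coupon", "store_location": (37.3318, -122.0312)}
print(should_notify(coffee_coupon, 37.3320, -122.0310))  # near the store → True
print(should_notify(coffee_coupon, 40.0, -100.0))        # far away → False
```

A time-based pass (say, a boarding pass) would add a similar rule comparing the current time against the pass’s event time; the “location-based, time-based, or both” behavior is just a combination of such checks.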

iPhone 6, 6 Plus (2014)

Siri got its first major overhaul in mid-2014 when its voice recognition was shifted to a neural network. This AI model, designed to loosely simulate the human brain, is a combination of software and algorithms that improves over time by learning from experience. Since the upgrade, the system has leveraged deep learning techniques, including deep neural networks (DNNs), convolutional neural networks, long short-term memory units, gated recurrent units, and n-grams.
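The “improves over time by learning from experience” idea can be shown with the simplest trainable model there is: a perceptron learning the logical OR function by nudging its weights after every mistake. Siri’s real stack uses deep neural networks that are enormously more complex, but they share this same train-on-errors loop.

```python
def train_perceptron(samples, epochs=10, lr=0.1):
    # A single neuron: weighted sum of inputs plus a bias,
    # thresholded at zero. Weights start at zero and are
    # corrected a little after each wrong prediction.
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred          # learn from each mistake
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Truth table for logical OR: the model's "experience".
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in data])  # → [0, 1, 1, 1]
```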

“The error rate [was] cut by a factor of two in all the languages, more than a factor of two in many cases,” said Alex Acero, head of the Apple speech team during the Siri upgrades.

“That’s mostly due to deep learning and the way we have optimized it  —  not just the algorithm itself but in the context of the whole end-to-end product,” he added.

A new keyboard called QuickType was also revealed. Algorithms and machine learning analyze users’ writing to predict what might be typed next based on the context of the sentence and a user’s unique typing style.
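A minimal sketch of that kind of personalized next-word prediction is a bigram model built from a user’s past messages: count which word the user most often types after each word, then suggest the top candidates. The training text and function names here are invented, and Apple’s actual models are far richer (neural networks plus the n-gram techniques mentioned above).

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    # Count, for each word, what the user typed right after it.
    words = text.lower().split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def suggest(model, prev_word, k=3):
    # The k words the user most often typed after prev_word.
    return [w for w, _ in model[prev_word.lower()].most_common(k)]

history = "see you soon . see you tomorrow . talk to you soon"
model = train_bigrams(history)
print(suggest(model, "you"))  # → ['soon', 'tomorrow']
print(suggest(model, "see"))  # → ['you']
```

Because the counts come from this user’s own history, two people with the same keyboard get different suggestions — which is the “unique typing style” part of the feature.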

In 2015, the voice command “Hey, Siri” was introduced. It allows users to activate Siri without pushing any buttons. The feature uses machine learning to let the iPhone keep an ear out for the command without draining the battery.
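The battery trick behind always-on wake phrases is a two-stage design: a very cheap check runs constantly, and the expensive recognizer only wakes up when the cheap stage fires. The scores and functions below are simulated stand-ins, not Apple’s implementation, which runs a small neural network on a low-power coprocessor before waking the main processor.

```python
def cheap_stage(frame_energy):
    # Always running: a near-free check that the audio is loud
    # enough to possibly contain speech. (Threshold is made up.)
    return frame_energy > 0.3

def expensive_stage(transcript):
    # Only invoked on promising frames: full phrase matching.
    # Stand-in for a real speech recognizer.
    return "hey siri" in transcript.lower()

def wake_word_detected(frame_energy, transcript):
    # Short-circuit evaluation: expensive_stage is skipped
    # entirely for quiet frames, which is what saves battery.
    return cheap_stage(frame_energy) and expensive_stage(transcript)

print(wake_word_detected(0.1, "hey siri what time is it"))  # quiet room → False
print(wake_word_detected(0.8, "hey siri what time is it"))  # → True
```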

iPhone 7 (2016)

Apple made artificial intelligence a cornerstone of iPhone 7 and iOS 10. Advances in AI and machine learning allowed its native apps to improve.

Maps automatically identifies traffic jams and accidents to reroute users (and can help users find their car in a parking lot). Siri can now interact with third-party apps to order pizza, hail an Uber or Lyft, or start a workout. It integrates with other popular apps such as WhatsApp, Skype, WeChat, and Pinterest. Siri also sounds more like a real person, again thanks to machine learning.

Photos uses AI facial recognition to better identify people and objects in photos. For example, it can recognize a familiar friend and can tell that there’s a mountain in the background. The new Memories section in the app automatically organizes pictures based on events, people, and places, complete with related memories (such as similar trips), and can build them into smart presentations.
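The grouping-by-person part of that feature boils down to clustering: each detected face is reduced to an embedding vector, and faces whose vectors are close are treated as the same person. The hand-made 2-D vectors and threshold below are toy stand-ins; real systems use high-dimensional embeddings produced by a neural network.

```python
from math import sqrt

def distance(a, b):
    # Euclidean distance between two embedding vectors.
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cluster_faces(embeddings, threshold=1.0):
    # Greedy clustering: join the first cluster whose anchor
    # face is within the threshold, otherwise start a new one.
    clusters = []  # each cluster is a list of (photo, vector)
    for name, vec in embeddings:
        for cluster in clusters:
            if distance(vec, cluster[0][1]) <= threshold:
                cluster.append((name, vec))
                break
        else:
            clusters.append([(name, vec)])
    return [[name for name, _ in c] for c in clusters]

photos = [("img1", (0.1, 0.2)), ("img2", (0.15, 0.22)), ("img3", (5.0, 5.1))]
print(cluster_faces(photos))  # → [['img1', 'img2'], ['img3']]
```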

QuickType automatically translates text into emojis. The keyboard also leans on Siri intelligence to understand the context of conversations. It is aware of where users are and what they’re doing and can make educated suggestions. If a friend asks for someone’s email address or asks a user’s location, it can be shared with one tap. QuickType is now better at handling multiple languages and schedules.

Other features taking advantage of deep learning can be seen when …

  • The phone identifies a caller who isn’t a contact (but emailed the user recently)
  • A swipe shows a shortlist of apps most likely to be used next
  • Reminders are sent of appointments that were never added to the calendar
  • A map location pops up for a reserved hotel without being asked
  • News stories are displayed based on previous reading
  • Fraud is detected on the Apple Store
  • Battery life is extended between charges

iPhone 8 and Beyond

The tech industry and consumers are anxiously awaiting the announcement of a new generation of iPhone. Will it be September 2017? Sometime in 2018? It’s anyone’s guess. But that hasn’t limited speculation on the role AI will play in the iPhone 8 (or whatever Apple names it).

3D Scanners and Augmented Reality

Some claim a 3D face scanner is among the new phone’s features. It could be used to verify a person’s identity for payments and other transactions. Augmented reality (AR), which overlays digital information on what the camera sees and can help identify faces and objects, is another rumored integration.

With AR, claims Apple analyst Gene Munster, users will be able to point their phone to find their seats in a crowded stadium or locate desired groceries among the aisles.

Beyond voice or facial recognition, AR will be able to help create a more complete picture of an individual. With the ability to quickly analyze massive amounts of behavior and data, mobile devices with AR apps will be able to recognize a person the way humans recognize other people — by unique characteristics.

Machine learning will be able to compare speech patterns, body language, and other traits. One possibility among many: apps that detect and distinguish animals in the wild through a smartphone camera and display their scientific names and other details.

Reliance on Siri and End of Roaming

We will lean on Siri more in the future, claim some industry experts. According to predictions from the Gartner Symposium, by 2019 consumer digital assistants will “recognize individuals by face and voice across channels, and by 2020 smart agents will facilitate 40% of mobile interactions.”

Gartner also predicts that virtual personal assistant agents will “monitor user content and behavior in conjunction with cloud-hosted neural networks to build and maintain data models from which the technology will draw inferences about people, content, and contexts.”

AI could also end roaming. Global, cross-border mobile usage is already a user expectation, but users will increasingly refuse to tolerate their phone being hamstrung by outdated networks. AI agents could help find and switch device networks seamlessly.

Straight from Apple’s Mouth

Apple recently announced the start of the “Apple Machine Learning Journal,” a blog with articles from the company’s engineers. Posts will describe the work and progress they’ve made in machine learning technology for use in Apple’s products. The tech industry is intrigued, given how historically reluctant the company has been to share its work.

What will they reveal about where they plan to take AI in their future products? How will it impact their innovation? Only time, and future product launches, will show for sure.
