Google's Android P will know what you want to do before you do it

The company's mobile operating system update will include App Actions, which uses AI to predict the actions you'll take within the apps on your phone or tablet.


App Actions is a new feature in development for Google's Android P. The feature will do things like prompt you to resume listening to music when you plug in your headphones.

James Martin / CNET

Google's mobile operating system already tries to predict which app you'll want to use next. Its next update, Android P, will move on to "predicting the next action you want to take," said Dave Burke, Google's VP of engineering for Android.


You'll experience this in a variety of ways as you're using a phone that runs Android P, Burke said. For example, plug in your headphones, and you'll see an "action" on your screen that lets you press play on music you were listening to earlier. 

Actions will eventually appear as you're interacting with the launcher, Google's Play Store and Assistant. They'll also appear as part of smart text selection and Google search.


"The phone is adapting to me and trying to help me get to my next task more quickly," Burke said. It's one way Google is applying artificial intelligence to phones, he added, saying the move will make "Android smart by teaching the operating system to adapt to the user."

Android P will also show you what it calls "slices," or small sections of an app's interface, when you might need them. For example, Burke said, if you look up Lyft in Google search, a slice of the Lyft app will appear on the screen, giving you the option to start a ride request.

Slices will eventually show up in a variety of places on your Android phone, but they'll first appear in search, Burke said. The goal is to "enable a dynamic two-way experience where the app's UI can intelligently show up in context."

Early access to slices will begin in June.
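
For developers, what Burke described is exposed through Android's Slices API. Below is a minimal Kotlin sketch of a slice provider, assuming the androidx.slice support libraries; RideSliceProvider, RideActivity, the icon resource and the row text are hypothetical placeholders for illustration, not Lyft's actual implementation.

// Minimal sketch, assuming the androidx.slice libraries. RideActivity,
// R.drawable.ic_ride and the row text are hypothetical placeholders.
import android.app.PendingIntent
import android.content.Intent
import android.net.Uri
import androidx.core.graphics.drawable.IconCompat
import androidx.slice.Slice
import androidx.slice.SliceProvider
import androidx.slice.builders.ListBuilder
import androidx.slice.builders.SliceAction

class RideSliceProvider : SliceProvider() {

    // Called once when the provider is created; return true if setup succeeded.
    override fun onCreateSliceProvider(): Boolean = true

    // Builds the small piece of app UI that search (and later other surfaces) can display.
    override fun onBindSlice(sliceUri: Uri): Slice? {
        val context = context ?: return null

        // Tapping the slice launches the app's ride-request screen (hypothetical activity).
        val pendingIntent = PendingIntent.getActivity(
            context, 0, Intent(context, RideActivity::class.java),
            PendingIntent.FLAG_UPDATE_CURRENT
        )
        val primaryAction = SliceAction(
            pendingIntent,
            IconCompat.createWithResource(context, R.drawable.ic_ride),
            "Request a ride"
        )

        // A single-row slice: a title, a subtitle and the tap action.
        val listBuilder = ListBuilder(context, sliceUri, ListBuilder.INFINITY)
        val row = ListBuilder.RowBuilder(listBuilder)
            .setTitle("Ride to work")
            .setSubtitle("Tap to start a ride request")
            .setPrimaryAction(primaryAction)
        listBuilder.addRow(row)
        return listBuilder.build()
    }
}

The provider would also have to be declared in the app's manifest so the system knows which content URIs it can ask it to render.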


Android P will give Android gestures like the iPhone X: Google's vision of Android P is now a lot less hazy. But the company still won't tell us what the "P" stands for.

Google's Duplex could make Assistant the most lifelike AI yet: Experimental technology called Duplex, rolling out soon in a limited release, makes you think you're talking to a real person.