Google today announced it’s making it possible to use the “Hey Google” voice command not just to open Android apps but also to perform specific tasks within them. The feature will be rolled out to all Google Assistant-enabled Android phones, allowing users to launch apps with their voice as well as search inside apps or perform specific tasks — like ordering food, playing music, posting to social media, hailing a ride, and more.
For example, users could say something like, “Hey Google, search cozy blankets on Etsy,” “open Selena Gomez on Snapchat,” “start my run with Nike Run Club,” or “check news on Twitter.”
At launch, Google says these sorts of voice commands will work in English globally with more than 30 of the top apps on Google Play, with more apps coming soon. Supported apps today include Spotify, Snapchat, Twitter, Walmart, Discord, Etsy, MyFitnessPal, Mint, Nike Adapt, Nike Run Club, eBay, Kroger, Postmates, and Wayfair, to name a few.
If the specific voice command for a common task is a little clumsy, the feature also allows you to create a custom shortcut phrase instead. That means, instead of saying “Hey Google, tighten my shoes with Nike Adapt,” you could create a command that’s simply “Hey Google, lace it.”
To get started with shortcuts, Android users can say “Hey Google, show my shortcuts” to get to the correct Settings screen.
The feature is similar to Apple’s support for using Siri with iOS apps, which also includes the ability to open apps, perform tasks and record your own custom phrase.
In Google’s case, the ability to perform tasks inside an app is implemented on the developer’s side by mapping users’ intents to specific functionality inside their apps. The feature, known as App Actions, already let users open their favorite apps with a voice command; with the added functionality, it lets users say “Hey Google” to search within an app or jump to specific app pages.
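That intent-to-functionality mapping is declared by the developer in an XML resource. A minimal sketch of what such a mapping might look like, assuming the actions.xml format App Actions used at the time and a hypothetical run-tracking app (the URL template and parameter names are illustrative, not from the article):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- res/xml/actions.xml: maps a built-in Assistant intent to in-app functionality -->
<actions>
    <!-- A command like "Hey Google, start my run with <app>" triggers this built-in intent -->
    <action intentName="actions.intent.START_EXERCISE">
        <!-- Assistant fills in the URL template; the app deep-links to the matching screen -->
        <fulfillment urlTemplate="https://example.com/workout{?exerciseType}">
            <parameter-mapping
                intentParameter="exercise.name"
                urlParameter="exerciseType" />
        </fulfillment>
    </action>
</actions>
```

The app then handles the resulting deep link the same way it would any incoming link, which is why no separate voice-handling code is needed on the developer’s side.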
Google says it has grown its catalog to include over 60 intents across 10 verticals, including Finance, Ridesharing, Food Ordering, Fitness, and now, Social, Games, Travel & Local, Productivity, Shopping and Communications, too.
To help users understand how and when they can use these new App Actions, Google says it’s building touchpoints into Android that surface relevant voice commands. For instance, if a user said “Hey Google, show me Taylor Swift,” Assistant may display a suggestion chip that guides the user to open the search result on Twitter.
Related to this news, Google says it also released two new English voices for developers to leverage when building custom experiences for Assistant on Smart Displays, alongside other developer tools and resources for those building for displays.
The Google Assistant upgrade for apps was one of several Android improvements Google highlighted today. The company also says it’s adding screen-sharing to Google Duo, expanding its Verified Calls anti-spam feature to more devices (Android 9 and up), and updating the Google Play Movies & TV app to become the new “Google TV” app, announced last week.
On the accessibility front, it’s introducing Sound Notifications, a new tool for people with hearing loss, and Action Blocks, a communication tool aimed at people with cerebral palsy, Down syndrome, autism, aphasia, and other speech-related disabilities.
The features are available now.
By Sarah Perez