Google: The Future Is No Longer Mobile, It’s Artificial Intelligence

Goodbye mobile first, hello AI first. Today at the Made by Google event, the company not only released new hardware but made it crystal clear that the future of technology isn't what's in your hand, but the lines of code within. For each new or updated piece of hardware, including the Pixel 2, a spiffy new pair of earbuds, and a new type of camera, Google built Google Assistant into the device and emphasized how it uses machine learning as a form of personalization.

For each device, especially Google Home and the Pixel phones, AI will make it easier to get the kind of information we typically turn to Google search for. Beyond that, Google is further integrating its other devices, in particular the Nest lineup, so they work better together. Here are a few of the ways Google is putting a greater focus on AI and Google Assistant:

Pixel 2

Google Assistant was already built into the initial Pixel and Pixel XL; however, two updates put AI even further into the spotlight. For starters, the Active Edge feature allows you to squeeze your Pixel phone to quickly activate Google Assistant. This is in addition to triggering it by saying “Ok Google” or manually accessing it through the search bar.

Beyond easier access, Google is also updating Google Lens, and the Pixel phones will get a preview version of it. Google Lens essentially turns your phone into an AR looking glass. Hold your phone up to a print ad and it can pull out the phone number or email address. Point the Pixel 2 at a person, an album cover, or a slew of other identifiable objects, and machine learning will surface information about whatever it recognizes.

Google Home

Google Home, which now has a Mini and Max version, will further roll out its Voice Match feature. Unlike Alexa, which treats voice commands from anyone, including your TV, as coming from the same person, Google Assistant can learn and recognize individual voices.

Through Voice Match, Google Home will give you personalized results rather than a single output. Take Google Calendar, for example. Just like your coworkers, the people in your family likely each have their own calendar. Rather than spitting out the next event regardless of who asks, Voice Match knows who is asking and returns that person's information.

Pixelbook


Not only is the tech giant going after the incredibly popular Microsoft Surface, but the design of its new Chromebook also mirrors the Lenovo Yoga line. At face value, this is a rather expensive Chromebook, but it too is getting the AI treatment.

With a dedicated Google Assistant button, users will be able to access its features and commands faster than ever. The Pixelbook also pairs with the new Pixelbook Pen, a $99 device built on Wacom's technology, which works both as a stylus for drawing and as a way to select items for Google Assistant to look up. By circling or highlighting text, images, or really anything on screen, you can trigger Google Assistant to give you more information about it.

Pixel Buds


One of the surprises at today's event was the announcement of the Pixel Buds. These wireless earbuds include gesture controls, direct access to Google Assistant, and live voice translation.

Read more about the Made by Google event on TechCo


Written by:
Elliot is an award-winning journalist deeply ingrained in the startup world who often digs into emerging technology and data. When not writing, he's likely either running or training for a triathlon. You can contact him by email at elliot(@)elliotvolkman.com or follow him on Twitter @thejournalizer.