Google announced a slate of AI-driven product enhancements at its annual Google I/O conference on May 10, delivering materially on its positioning as an AI-first company. To Google, that appears to mean injecting AI into all of its products and services to make them smarter and more useful.
Just before the keynote, Google announced it is rebranding its Google Research division to Google AI. The move further emphasizes how Google is focusing R&D on computer vision, natural language processing, and neural networks. As part of this process, it is also subtly expanding the scope and the possibilities of Search.
Google Assistant Gets Really Helpful
Google Duplex is taking voice and AI to a new level: Google CEO Sundar Pichai demoed the Google Assistant calling a hair salon and booking him a hair appointment. Some observers declared that the demo passed the Turing Test, in which the AI is so convincing that the person does not realize they are talking to a computer. Google is also making conversation with the Assistant more natural by no longer requiring “Hey Google” or “OK Google” for every request and by using more natural timing and cadence.
While this technology may not have a direct relationship to Search on its surface, on closer inspection it does bring up some interesting questions. Voice is already one of the most significant transformations to the way people search online, often presenting direct verbalized answers to users’ questions and bypassing the SERP altogether.
Voice already represented a full 20% of Google searches made on mobile and Android devices as of 2016, and as of 2017 one in six adults in the United States owned some form of “smart speaker” device. Voice, in short, is poised to become a tidal wave. That said, Pichai’s product demo is another reminder that as search marketers we must continually question whether we’re looking at SEO for voice the right way, as we do for any other type of optimization.
If the future of voice is moving from queries like “which hair salons are open on Tuesday?” to directions like “make me a haircut appointment on Tuesday,” how will that affect SEO? If a user issues a direction to buy batteries but doesn’t indicate a brand or ecommerce platform preference, how will the voice assistant decide which options to present first? Will it present options at all? For brands, this is an opportunity to capture search intent that is even more certain and more explicitly commercial.
Gmail Starts Suggesting What to Write
In addition to the existing Smart Reply feature, Gmail will soon be able to suggest full sentences as you write with its new Smart Compose capability. Machines are already writing more than most people realize, playing a role in sports, weather, and simple news updates. Companies that use machines to write do not usually publicize it.
Google Maps Will Augment the Reality of Your Pedestrian Navigation
Google Maps is getting friendlier and more social with its For You tab, which will feature new and interesting restaurants and businesses in your neighborhood. Maps will also soon combine Street View imagery with your phone’s camera to add augmented reality directions, helping you navigate the real world with an overlaid layer of visuals and audio.
As we discuss in more detail below, Google is expanding aspects of its search experience, in this case local search results, and surfacing them in areas other than a basic search results page. Someone walking down the street in an unfamiliar city may not have the time or inclination to navigate away from GPS navigation in Google Maps to look up “restaurants in my area,” so now they could see the results of that local search query tiled over an augmented reality view. This opens up another avenue for local SEO to get in front of potential customers at exactly the right moment.
Additionally, this move is a response to localized vertical search engines (VSEs) like Yelp and Foursquare. By building a competing feature set, Google indirectly acknowledges that VSEs have become a significant force in search.
Google News Will Let You Personalize Your News Without Becoming too Insular
Google News will also use AI to find and curate news customized for each user. At the same time, it is designed to provide “a range of perspectives” that expose users to ideas outside their usual sources. Google appears to be doing some beneficial social engineering: delivering personalized content while keeping society from becoming too fragmented and insular.
Google Lens Will Help You Make Physical Virtual
Google Lens will let you capture real-world text and make it editable and usable inside your phone. You can also point it at a product, and it will help you find a place to purchase that product online.
Google is in a continual process of refining its search experience to present users with the exact content that they need in response to the exact query or intent that they have. This was the rationale behind all of the named search algorithm updates as well as the introduction of the customer micro-moments model.
By introducing contextual ecommerce listings to Google Lens, Google is applying the same philosophy horizontally across its technologies. An outgrowth of the same strategy can be seen in the recent addition of ecommerce aspects to Image Search via structured data tags. Google is trying to get even more dialed in on micro-moment targeting by expanding the breadth of where search results-style listings are presented to the end user. Previously these listings may only have been surfaced to the user through the traditional SERP — but is that where every digital audience is beginning its customer journey now?
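To make the structured data point concrete, here is a minimal sketch of the kind of schema.org Product markup that helps ecommerce content surface in these expanded search contexts. All product names, values, and URLs below are hypothetical, and this is one illustrative shape of the markup rather than a definitive template:

```python
import json

# Hypothetical schema.org Product structured data. Fields follow the
# standard schema.org vocabulary; the values are invented for illustration.
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Running Shoe",
    "image": "https://example.com/images/shoe.jpg",
    "description": "Lightweight trail running shoe.",
    "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the markup as a JSON-LD script tag so crawlers can parse it
# from the page's HTML.
json_ld = (
    '<script type="application/ld+json">'
    + json.dumps(product_markup)
    + "</script>"
)
print(json_ld)
```

The JSON-LD block would sit in the page’s `<head>` alongside the rest of the markup, making price and availability machine-readable without changing anything users see.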
You Have Heard of Voice Search, but Have You Heard of Thought Search? It Is Happening in 2018, Not 2028
As futuristic as it sounds, an MIT student has a working proof of concept of his AlterEgo device that allows him to think a search, query Google, and have the answer silently vibrated into his ear. Arnav Kapur was featured on “60 Minutes,” where he did calculations, looked up trivia, and ordered a pizza using only his thoughts. The device is not connected to his brain; it is simply worn on his head.
When the user thinks and internally vocalizes a specific command or question, it is conveyed to a computer and Google — sort of like silently Googling something in your head. Electrical signals that the brain normally sends to the vocal cords are intercepted and sent to a computer and that information is then communicated to the user’s inner ear via vibrations. Google is expanding not just the places where search happens, but the ways in which we conduct searches altogether.
How does BrightEdge use AI for SEO success?
BrightEdge Insights uses deep learning and big data to recommend the critical actions that will drive the biggest impact for your business. Insights gives you confidence that you are focusing on the content and optimization issues that matter most today, prioritizing all actions in one simple feed with the most critical at the top. Instead of spending time on data extraction, manipulation, and synthesis, you get back hours each week by taking recommended actions that yield quick SEO wins. With artificial intelligence, Insights does the heavy lifting to uncover the most relevant actions you should take: it looks at millions of web pages, examines thousands of changes each week, and boils them down to a few key findings that will move the needle for you.
What do these advances in AI mean for marketers? The advice remains consistent: build relevant, quality content that provides a great user experience. As Google uses RankBrain to understand intent with greater specificity, more detailed and specific (long-tail) content is likely to earn the first position. Additionally, watch for developments in technical SEO and markup that will make your content more discoverable and desirable to AI-savvy bots and algorithms.
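One markup development worth watching in the voice context is the schema.org `speakable` property, which flags the parts of a page best suited to text-to-speech. The sketch below is illustrative only; the page name, URL, and CSS selectors are invented, and publishers should check current eligibility guidelines before relying on it:

```python
import json

# Hypothetical "speakable" structured data for a news-style page.
# The SpeakableSpecification points voice assistants at the page
# sections (by CSS selector) that read well aloud.
article_markup = {
    "@context": "https://schema.org",
    "@type": "WebPage",
    "name": "Example Article",
    "url": "https://example.com/example-article",
    "speakable": {
        "@type": "SpeakableSpecification",
        "cssSelector": [".headline", ".summary"],
    },
}
print(json.dumps(article_markup, indent=2))
```

As with Product markup, this would be embedded in the page as JSON-LD, so a voice assistant answering a query could read out just the headline and summary rather than the full page.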