As Technical Marketers, it’s important that we research and understand the search landscape. It’s time we took a look at voice search.
Let’s first clarify that traditional text-based Google Search as we know it isn’t going anywhere. The SERP has become an ever-changing, robust environment, and the days of ten blue links are gone. Voice search is nothing new either; it’s been around for quite a while, but the release of Google Home has reignited the way we as search marketers look at it. With machine learning at the forefront, Google’s evolution into AI is rapidly underway, and voice search may be a sign of what we should all be preparing for.
How are we using voice search?
Analysts predict that by 2020, 50% of all searches will be made with voice, and 30% of searches will be made without a screen. Technology such as Google Assistant on Home, Amazon Alexa on Echo, chatbots, and other voice-activated SDKs from Microsoft, SoundHound and others will make this possible. Those statistics may sound scary, and frankly unbelievable, to marketers who rely on screens, which is why we need to take a few steps back and look at the ways voice search is currently being used before diving into how it could evolve.
The statistics above are from Google’s blog in 2014, a little behind the times, but still in line with how voice search is used today. As the charts show, voice search is treated more like an assistant and less like a search engine. A more recent SEER Interactive study highlighted that only 8% of daily voice-tech users reported actively searching the Internet via voice. I likely follow 6% of those users on Twitter.
Joking aside, it shows that right now voice search is largely utilized as a shortcut assistant for very general tasks, such as checking the time, starting a timer, playing music, and adjusting the volume. Pushing the invisible button for us, so to speak. Some aspects of search are going to be extremely difficult to replace with vocal queries. Many searches simply require a visual trust factor, and Google knows it. This is why videos and images are prominent for queries with transactional intent. Would you book an Airbnb on voice alone? It’s highly unlikely. Would you reorder dishwasher detergent? That’s a much higher possibility.
VoiceLabs predicts there will be 24.5 million voice-enabled devices shipped before the end of 2017, leading to a total footprint of 33 million voice-first devices in circulation. With Amazon’s Alexa holding roughly 75% of the market share, reordering dishwasher detergent is a fair picture of how the market is using voice search right now.
How does voice affect search today?
Voice search always has an answer for local-based queries. Though local searches are clearly a strong point, I don’t see any of that activity in my Google Home activity feed (which you can see above). The feed shows limited activity from my voice searches alongside tips, tutorials, and what’s playing on Chromecast. When I ask, “Ok, Google, find me a place to stay in Littleton, Colorado on Airbnb,” Google Home will quickly read aloud three addresses, unfortunately with no path to resolution, and not a single activity is shown in the Google Home app activity feed. That’s something I think will change with growth.
Today, if you have a page that is currently being used as a Featured Snippet in Google Search, your website is eligible to be mentioned in voice search with Google Assistant.
When using Google Home, the activity feed in the app will consistently provide a link to the website for any Featured Snippet queries triggered by voice (see below). Traffic arriving from this link is identified as organic from google, with the keyword (not set), naturally.
While this is great for now, it was recently reported that the Knowledge Graph was swallowing up Featured Snippets. So what now? How long until they are gone entirely? Why won’t you let us optimize for you?!
Let’s look ahead to when voice search is less of a novelty and has become second nature to us all.
Learning to love the bomb…
How Voice Search Works
Humans are largely visual creatures. Visualization works because we respond to and process visual data better than any other type of data. In fact, this is how Google found itself refining sound identification. When the Sound Understanding Team at Google Research was building its AudioSet ontology, it found that some sounds were very difficult to define.
To remedy this, the team leaned on the vast store of visual data Google has amassed and turned a sound-recognition problem into a visual problem using spectrograms. This let the Sound Understanding Team assign different weights to features of the visualization (frequency over time) and improve its accuracy in determining the source of any given sound.
When you talk to Google Home/Assistant, the spectrogram of what you’ve said is divided up and sent to computers housed in Google’s servers across the world, then processed using the neural network models built by the Google Research team.
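If you’re curious what that looks like in practice, here’s a minimal sketch of how audio becomes an image-like array. This isn’t Google’s pipeline; it simply uses SciPy on a synthetic signal to show the frequency-over-time representation the Sound Understanding Team describes.

```python
# A minimal sketch (not Google's pipeline): turning audio into a spectrogram,
# i.e. an image-like array of frequency vs. time that vision-style models can learn from.
import numpy as np
from scipy.signal import spectrogram

sample_rate = 16000                        # 16 kHz, a common rate for speech audio
t = np.arange(0, 2.0, 1.0 / sample_rate)   # two seconds of samples

# Synthetic "voice query": a rising tone plus a little background noise
audio = np.sin(2 * np.pi * (200 + 300 * t) * t) + 0.05 * np.random.randn(t.size)

# frequencies (Hz) x times (s) -> power: the "picture" of the sound
freqs, times, power = spectrogram(audio, fs=sample_rate, nperseg=512)

print(power.shape)  # e.g. (257, n_windows): rows are frequency bins, columns are time slices
```

It’s that power matrix, treated like pixels, that lets techniques from image recognition be reused on sound.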
With this in mind, it’s safe to assume that our voice searches are a contributing factor to Machine Perception development, an initiative dedicated to teaching machines how to accurately perceive audio by building state-of-the-art machine learning models, generating large-scale datasets of audio events, and defining the hierarchical relationships that exist between sounds.
By building this rich audio understanding and identifying the sounds of an environment, Google can teach its neural network about the source, the location, and even the noteworthy context of those sounds. Context is the magic word here.
Using natural language processing, Google learns characteristics of how we speak (accents, pronunciations, colloquialisms) and how we behave in search as consumers. It uses this information to build a semantic understanding of us as individuals and of our browsing interests. Context.
RankBrain interprets 450 million unique, long-tail, and ambiguous searches each day and identifies their similarities, to retrieve the most useful pages for the user. While voice search may not be directly affecting your customers today, it will in the future. The algorithm is always growing, learning and changing. Voice queries will almost certainly shape search as we know it.
Creating true artificial intelligence as most people understand it – a machine that can learn and think for itself – is the goal.
– Google Co-Founder, Sergey Brin
…and stop worrying.
Google is building a brain. Voice is just one of the senses Google is working with, and it’s going to be around for a while, just like Google Lens, a visual search effort that leans on Schema markup for the entities it identifies. I’m not sure I can say the same for Google’s other senses, such as smell. But I digress. My point here is that even if your customers aren’t using voice search today, the data that drives it is woven into the same algorithm we all optimize for. Conversational queries will shape search, and we need to be prepared for that.
Quality matters. Google needs people to rely on its search engine to complete their tasks, whatever those may be. The bar is high and is continually being raised. Google has high standards for the information our pages deliver, and after all, they don’t owe us a thing. They provide a helpful resource for anyone to use, and they get to decide what is shown on their platform.
Let’s recap what this all means for us moving forward. In layman’s terms, Google is using audio to help RankBrain understand the world more contextually. This could eventually lead to devices in our day-to-day lives that are proactive rather than merely reactive to audio prompts: “Ok Google, did my oven timer go off?”, “Ok Google, did someone ring my doorbell?”, “Ok Google, which room did you hear my dogs barking in?” These are realistic scenarios, all of which take additional hardware out of the equation, and a very enticing angle for Google to put forward as a user benefit.
For that to happen, your Google Home devices need to be listening at all times, which, in order to hear the hotword “Ok, Google,” they clearly already are. By offering such features, however, Google gains public consent to listen in and use the contextual conversation around “Ok, Google” prompts in a more culturally acceptable way.
Google is in the information business
For us, technology is not about the devices or products we build, those aren’t the end goals. Technology is a democratizing force, empowering people through information.
– Google CEO Sundar Pichai.
As we all know, Google is in the information business. They want as much data, in as many ways, as possible. With Chromecast, Android in smartwatches and TVs, Nest cameras, thermostats, and now home security systems, it’s clear that Google wants to be integrated into every minute of our day. All of this hardware is a foot in the door, making Google part of your day-to-day life to the point where it would be too disruptive to remove it.
Not every product released has staying power, however. Take Amazon Key, a service that lets your Amazon delivery driver leave packages inside your home and was met with great criticism. For Google, the public didn’t look at Glass through rose-tinted glasses so much as with an eyeful of skepticism. Ultimately it’s the market that dictates when it’s ready to adopt, and sometimes these products and services simply arrive too early.
We interact with brands every single day. Take toothpaste: do you even see a brand when you open the drawer, or do you just see your toothpaste? Google has been extremely successful in becoming synonymous with search, and now it wants to expand its synonym portfolio.
To be intensely involved with the human experience, Google needs to learn your habits and understand what you want before you do: that’s the endgame. That’s why understanding our conversations before and after we search matters. Of course, Google already has a fantastic understanding of intent in traditional text-based search, but our antiquated “keyword” ways of searching aren’t going to help grow intelligent AI.
We began searching in a very unnatural, rudimentary way: “Men’s dress shoes black.” With machine learning, Google is building a brain that thinks, talks, and understands well beyond three or four keywords delivered Fred Flintstone-style. Google wants us to communicate with search as naturally as we would with another person, but it got so good at delivering what we wanted that there’s no reason to make the effort to type full sentences into Search. To be truly conversational, it had to be voice.
This is why understanding the contextual language around an “Ok, Google” prompt could definitely be in the crosshairs. Compare it to a TV show or movie where the characters from Earth meet an alien from another planet. In the first few episodes, they are trying to teach it English, poking, prodding and seeing what it can do. Then a few seasons later the alien is throwing together sentences, understanding human semantic conversation. That’s where we are now. The question is, what’s going to happen on the next season of Robot Overlords?
This information is all fine and dandy, but we’re search marketers; how about some actionable strategies? Spoiler: it’s nothing you haven’t heard before, but if you aren’t doing it, consider this your final notice:
Build 10x Content
Rand Fishkin hit the nail on the head with 10x content: going beyond “building great content” and making something ten times better than anything anyone has done before. Rand curated an ever-growing list of examples to show people what that looks like. It’s important to realize that when someone visits your site, they have a question. Help them answer it, and then answer their next question after that. Build 10x loyalty.
Local? Claim Your Listings and Get Reviews
Voice search is massive for local. If you’re a business that provides local services, it’s crucial to get your business on Google My Business. Make sure the information you provide is accurate, and keep earning great reviews.
Google recently announced that it will directly suggest prescreened providers; so far that includes HomeAdvisor and Porch. If you’re in a city that doesn’t have any guaranteed or screened providers available, you’ll still get the top three suggested nearby results, and as marketers, you’ll want to make sure you’re one of them. My wife and I tried this feature recently and chose the option with the best reviews. Now we have a new favorite pizza place.
Notice, in the image above, how Google is suggesting a list of recommended plumbers? Google could very well do the same for more queries in the future, providing users with a personalized feed of results. If Google can’t understand your business’s location information, you won’t be on those lists.
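One straightforward way to hand Google that location information is LocalBusiness markup in JSON-LD. Here’s a rough sketch; the business details are entirely made up, and this is just one way to template the markup rather than a required approach.

```python
# An illustrative sketch: building LocalBusiness markup in JSON-LD so Google can
# read a business's name, address and phone number unambiguously.
# All of the business details below are hypothetical.
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Pizza Co.",
    "telephone": "+1-303-555-0123",
    "url": "https://www.example.com/",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Littleton",
        "addressRegion": "CO",
        "postalCode": "80120",
        "addressCountry": "US",
    },
    "openingHours": "Mo-Su 11:00-22:00",
}

# Drop the output into a <script type="application/ld+json"> tag on the page.
print(json.dumps(local_business, indent=2))
```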
Write How You Speak
Remember, your website is for humans. Yes, it needs to be accessible for search engines to crawl, but we also need to remember that engagement signals are of prime importance. Write content for humans; talk like one throughout your content. Think longer-form, conversational.
Questions
Voice search is rewiring us to ask questions. By using Answer The Public, you can get a greater understanding of the questions people are asking around your target keywords, and start answering them.
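If you want a quick, back-of-the-napkin version of the idea behind a tool like Answer The Public (not what the tool actually does under the hood), a few lines of Python can enumerate the obvious question framings around a seed keyword. The seed term and modifier list here are just examples.

```python
# A rough sketch of the idea behind question-research tools: pair a seed keyword
# with common question words to surface conversational queries worth answering.
# The modifiers and seed term are illustrative, not exhaustive.

QUESTION_WORDS = ["who", "what", "when", "where", "why", "how", "which", "can", "are"]

def question_ideas(seed: str) -> list[str]:
    """Return simple question stubs to research and expand into on-page answers."""
    return [f"{word} {seed}" for word in QUESTION_WORDS]

for idea in question_ideas("voice search"):
    print(idea)  # e.g. "how voice search", "why voice search" ...
```

Each stub is a starting point: expand it into the full questions your customers actually ask, then answer them on the page.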
Featured Snippets
As we pointed out earlier, featured snippets appear to be fluctuating notably in the SERPs, but that shouldn’t change the impetus behind them. Featured snippets teach us to be succinct, detailed, and factual with our content. They also encourage us to structure content so it’s easy for Google to extract and deliver in one chunk that provides an answer. Keep doing that. For now, at least, we can receive direct organic traffic from the Google Home app activity feed itself to pages that are eligible for featured snippet queries.
Get Mobile-Friendly
It should go without saying that being mobile-friendly is of great importance. In many countries, smartphone traffic now exceeds desktop traffic. Google provides tools such as the Mobile-Friendly Test and Test Your Mobile Speed to help you find areas for improvement.
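Beyond the interactive tools, Google also exposes a Mobile-Friendly Test API you can call programmatically. A minimal sketch, assuming the searchconsole.googleapis.com endpoint below is still current and that you have an API key from the Google Cloud console, might look like this:

```python
# A minimal sketch of checking a page with Google's Mobile-Friendly Test API.
# Assumes the endpoint below is still current and that API_KEY holds a valid
# key from the Google Cloud console (placeholder here).
import json
import urllib.request

API_KEY = "YOUR_API_KEY"
ENDPOINT = (
    "https://searchconsole.googleapis.com/v1/"
    f"urlTestingTools/mobileFriendlyTest:run?key={API_KEY}"
)

payload = json.dumps({"url": "https://www.example.com/"}).encode("utf-8")
request = urllib.request.Request(
    ENDPOINT, data=payload, headers={"Content-Type": "application/json"}
)

with urllib.request.urlopen(request) as response:
    result = json.load(response)

# Expect "MOBILE_FRIENDLY" or "NOT_MOBILE_FRIENDLY", plus any issues found
print(result.get("mobileFriendliness"), result.get("mobileFriendlyIssues", []))
```

It’s handy for auditing a list of URLs instead of pasting them into the tool one at a time.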
Structured Data
With the help of structured data using the Schema.org vocabulary, we can help Google understand our content very specifically. For the query “how many internet users are there in America,” Google generates a rich statistical answer card using data published on a page from the World Bank. Though we don’t see these queries in our Google Home activity feeds at this time, it’s very feasible that we could. Think back to the chart at the top of this article showing that 31% of teens use voice search for “help with homework.” A student researching a paper, asking these questions out loud and then diving deeper later on, is a realistic scenario.
Something else to pay close attention to is the schema for “speakable.” While this markup type is currently pending, it speaks directly to optimizing for voice search. The markup aims to highlight particularly “speakable” content, in the sense of being especially appropriate for text-to-speech conversion.
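To make that concrete, here’s a rough sketch of what speakable markup could look like for an article, using CSS selectors to flag the passages best suited to being read aloud. The selectors and article details are hypothetical, and because the property is still pending on Schema.org, treat this as a preview rather than a spec.

```python
# An illustrative sketch of pending "speakable" markup: flagging the parts of an
# article best suited to text-to-speech. Selectors and article details are made up.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How voice search is shaping the SERP",
    "url": "https://www.example.com/voice-search/",
    "speakable": {
        "@type": "SpeakableSpecification",
        # Point assistants at the headline and summary paragraph on the page
        "cssSelector": ["#headline", "#summary"],
    },
}

# Embed the output in a <script type="application/ld+json"> tag on the article page.
print(json.dumps(article, indent=2))
```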
Listen with curiosity. Speak with honesty. Act with integrity.
A fitting quote from author Roy T. Bennett, which in full reads as follows:
The greatest problem with communication is we don’t listen to understand. We listen to reply. When we listen with curiosity, we don’t listen with the intent to reply. We listen for what’s behind the words.
A better voice search experience doesn’t mean a worse text-based search experience, but it will almost certainly influence it. Google voice search is shaping RankBrain, which in turn will shape how search reacts in the future. Your content needs to get to its point quickly and give the answers your customers need. Voice could ultimately alter the way search is presented to the end user. Location already does this, so why couldn’t the contextual understanding of something users mention in passing also influence the results later on?
I’ll end on this foretelling quote from Google Co-Founder Sergey Brin:
My vision when we started Google was that eventually, you wouldn’t have to have a search query at all. You’d just have information come to you as you needed it.