With each passing year, Google’s cash-cow search business is becoming more complex.
That’s not necessarily a bad thing, however. It’s more a reflection of how AI/machine learning advances and the demands of smartphone users have driven a rethink of both what Alphabet/Google (GOOGL) shows to search users, and how it lets businesses run ad campaigns.
Originally, Google Search primarily relied on an algorithm called PageRank: It ranks search results based on the number of links to a particular web page, along with the quality of those links. And its search ad business primarily relied on a separate algorithm (known as Ad Rank) that decided whether to run a text ad against a particular search term (keyword) based on how much the advertiser was bidding and how likely the algorithm felt the ad would be clicked on.
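The two mechanisms described above can be sketched in a few lines of code. This is a simplified illustration, not Google's actual implementation: the tiny link graph, the damping factor, the bids and the click-through-rate estimates are all made-up assumptions chosen to show the idea.

```python
# Illustrative sketch of the two classic algorithms described above.
# The link graph, damping factor, bids and CTR figures below are
# hypothetical examples, not Google's real data or parameters.

def pagerank(links, damping=0.85, iterations=50):
    """Iteratively score pages by the number and quality of inbound links."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for page in pages:
            # A page's score is the damped sum of the rank shares
            # passed along by every page that links to it.
            inbound = sum(
                rank[src] / len(outs)
                for src, outs in links.items()
                if page in outs
            )
            new_rank[page] = (1 - damping) / len(pages) + damping * inbound
        rank = new_rank
    return rank

def ad_rank(bid, predicted_ctr):
    """Score a text ad by its bid times the estimated chance of a click."""
    return bid * predicted_ctr

# A tiny three-page web: every other page links to "home",
# so "home" ends up with the highest score.
links = {"home": ["blog"], "blog": ["home"], "about": ["home"]}
scores = pagerank(links)
print(max(scores, key=scores.get))  # "home"

# A $2.00 bid with a 5% predicted CTR outranks a $4.00 bid with a 2% CTR,
# which is why a higher bid alone doesn't guarantee the top ad slot.
print(ad_rank(2.00, 0.05) > ad_rank(4.00, 0.02))  # True
```

The point of the second function is the one the article makes: because the ad score multiplies the bid by an estimated click probability, a cheaper but more relevant ad can beat a more expensive one.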
Today, PageRank is just one of many factors playing a role in what’s shown on a search results page. Over the years, Google has begun doing things such as analyzing the content of web pages to inform its search results, and (with the help of a system called RankBrain) using machine learning to figure out what results might be relevant to a particular query.
Google has also begun integrating content from its Knowledge Graph within search results, as well as information such as sports scores, weather reports and content from sites such as YouTube, Google News and Google Finance. And it has started showing a personalized content feed on the home page of its Google Search app. Many of these moves serve to make Google Search more mobile-friendly, given that many smartphone users prefer not to jump to another website or app to get what they need.
Meanwhile, though Google’s search ad business still depends heavily on Ad Rank, Google’s search pages now show not only text ads, but also paid listings from platforms such as Google Shopping and Google Hotel Finder, which both have their own algorithms. In addition, Google has begun using machine learning to help advertisers optimize their ad campaigns based on their budget and goals, as well as to help small businesses quickly launch ad campaigns.
This week, Google unveiled an ambitious overhaul of its search platform that suggests it's more willing than ever to rethink what users see as machine learning algorithms get more effective and search traffic continues shifting towards mobile devices. Arguably the biggest changes are the following:
- Google is creating its own “Stories” product. However, unlike Facebook (FB) and Snap’s (SNAP) Stories products, which involve photo and video content shared by humans, Google’s offering will use AI to create Stories that appear alongside relevant search results. The first such Stories will be about “notable people” such as celebrities and athletes.
- Google plans to use AI to better understand the content of videos appearing within search results, and use that knowledge to show relevant videos within search results.
- Google’s image search feature will use AI (courtesy of Google Lens, which is already getting built into mobile camera apps) to understand what a photo is showing and provide relevant captions. Google is also making it easier to find shopping info for items spotted within photos.
- Google’s content feed is getting a revamp that (among other things) gives users more control over what content is shown, and which will result in more visual content appearing. Google is also bringing the feed, which has only been available in its Search app, to its mobile website.
- Search results pages will now sometimes show a “related activity” section that shows past searches made by a user that are related to the topic being searched, as well as relevant web pages that a user previously visited. Google says users can remove listings from their history, as well as fully disable the feature.
Some of these new features could yield fresh revenue opportunities. For example, Google could run ads against its AI-created Stories the same way that Facebook and Snap run ads against their Stories products. And just as Facebook and Twitter (TWTR) show ads within their content feeds, Google could now do the same.
However, much like the new search features shown off at the Google I/O conference earlier this year, as well as its efforts to integrate Google Assistant with Search, the bigger payoff for such moves could be in how they help Google Search remain a valuable resource to mobile users who may be frequently using several dozen or more apps. Moreover, some of these moves could help Google grow the amount of time smartphone users spend each day on its search app and/or website.
There seems to be a good understanding here of the differences between PC and mobile search needs. On PCs, Google has by and large never prioritized the amount of time spent on its search website — rather, its goal has often been to quickly shuttle users to relevant third-party websites, and hope they occasionally do so by clicking on ads. But on mobile, Google realizes that there’s value in growing how much time users spend each day on its site and apps, and how frequently they choose to launch its app.
Together with YouTube’s revenue growth, Google’s success at growing and monetizing smartphone search activity is a big reason why its ad revenue from Google-owned sites and apps is still growing at a 20%-plus pace. The company’s efforts to use AI and personalization to create better mobile search experiences should help keep its search ad sales growing at a healthy clip.