Google’s Recent Market Share Drop Will Bring Us Closer to the Perfect Search Engine
I recently wrote on SearchEngineLand that after many years of stability, Google’s share of the search market dipped a few percentage points, according to Comscore’s October report on desktop search market rankings. (Note: Google’s search market share has remained at that level since the October decline.) Like many digital marketers, I don’t completely trust the Comscore rankings, since I have always seen a different breakdown of search engine traffic in my own logs. To understand whether Google’s market share truly decreased or whether this was just a Comscore blip, I ran a survey on SurveyMonkey Audience.
In my survey, I asked people to share their primary desktop search engine, and here are the results:
I ran this same survey one year ago, and according to my survey, Google had 80% of the market. Based on the most recent results, it would seem that Google is actually losing market share, and it’s not just Comscore’s reporting methodology. While a decline in market share for the first time in many years might not seem like it bodes well for Google and its shareholders, I actually think this is the best thing ever for Google and the whole Internet.
Keeping Google honest
Google runs a search engine that, for the most part, does what users need, as its market share attests. Even so, the company makes constant improvements to bring Google closer to Larry Page’s vision of a “perfect search engine,” which he described as “something that understands exactly what you mean and gives you back exactly what you want.”
Background of Google enhancements
While Google claims to make thousands of tweaks to its core website ranking algorithm every year, it makes only a small number of updates that fundamentally transform the Google search results. Some of the really big updates merit blog posts on the Google webmaster blog and receive official names (kind of like hurricanes).
In 2011, the Panda algorithm targeted the low-quality content that plagued Google’s search result pages. This update was massive, and according to Google, it impacted more than 10% of all queries. As with every piece of its search algorithm, Google never detailed exactly what Panda looked for, but many people noticed that it did a pretty good job of removing the kinds of results that seemed artificial.
In 2012, Google released the Penguin algorithm, which was aimed at neutralizing websites that used artificial links to increase their rankings. Links pointing to a website have always been an important part of how Google determines a website’s ranking, and they were one of the key ways Google differentiated itself from competitors and directories in the late 1990s. Rather than earn links naturally, webmasters skirted the process by a variety of means, including exchanges of cash to acquire links. Again, Google never detailed exactly how this algorithm works, but it is meant to determine whether links have been earned “naturally” or “unnaturally” (as defined by Google, of course).
Mobile friendliness, Hummingbird and Rank Brain
In addition, there has been a series of recent updates aimed at improving search results on mobile. One update specifically demoted websites in mobile search that were deemed not “mobile friendly.” Others, like “Hummingbird,” improved search so that people could get results faster, sometimes without even clicking a website result.
Most recently, Google launched an algorithm update it dubbed “Rank Brain.” Rank Brain uses artificial intelligence to interpret never-before-seen queries and serve up the best results based on what Google thinks the searcher is seeking.
Competition as a motivation
Constant improvement of the search engine, with the occasional rewrite of its core algorithms, is expensive, and the exercise isn’t just one of altruism. Without the pressure of competitors nipping at its heels, Google might be less motivated to create the perfect search engine.
International markets are a perfect test of a more competitor-free environment, and notably, even some of Google’s big algorithm updates take time to migrate out of the US. In Singapore, where I am currently living and working, Google has almost complete dominance. As an English-speaking country with near-total Internet penetration, Singapore should see Google search results as “perfect” as those in the US, but it does not.
Singapore as an example
I was recently searching on Google.com.sg for a baby bassinet for my newborn. The first result was the US-based Toysrus.com website, which does not even allow shipping to Singapore. Imagine if this were to happen in the US: a Canadian or Mexican website that does not ship to the US showing up in US results. Users would scamper off to Bing or Yahoo or DuckDuckGo in the blink of an eye.
Here’s another troubling result. I searched for “recommended immunizations” on Google.com.sg, and Google’s entire first page was US-based results. Immunization recommendations are an inherently local query, since each country has its own schedules and recommendations. Public health experts would be up in arms if another country’s results showed up in a US Google search for this important query.
In this example, I am being a bit unfair, since Singapore uses UK spelling and immunization is actually spelled “immunisation.” Yet even searching with the UK spelling still puts the US results above the Singapore results. Aside from the obvious problem of not having a Singapore website as the top result, this difference in spelling is exactly the kind of thing that Rank Brain supposedly catches, and it does in the US (see below).
These are just two small examples of how Google search is vastly better within the US than it is overseas. (For further research, try Googling in any other language on a non-US Google, and you are bound to see lower-quality results and websites.)
Why is the drop in US market share a good thing?
If Google deployed all of its search algorithm improvements outside the US at the same scale it does domestically, the quality of non-US search results would improve greatly. There wouldn’t be top-ranking websites that use linking tactics Google has blacklisted, websites that rank by stuffing keywords, or pages built purely on scraping other people’s content. But making these improvements would be expensive, and since nearly all non-US Internet users already use Google, why should it bother?
The competition in the US forces Google to continuously iterate on its search algorithms and get closer to the perfect search engine it desires. The slip in market share it is experiencing will be the motivating factor that spurs it on to even greater improvements. These enhancements will help users search even better, and ultimately that is good for Google and everyone on the Internet.