For decades, web search engines have been our go-to tools for finding the information we seek, but times are changing. Other means of information retrieval (ChatGPT, for one) are emerging and developing at an astonishing rate. What happens next is open for debate, but B2B marketers would be wise to be ready for the change.
First, let’s take a look at how search engine optimisation has developed.
SEO – A Brief History
The origin of the phrase search engine optimisation is difficult to pin down, but the practice probably started in the late 1990s. To avoid going down too deep a rabbit hole, let’s simplify. The early versions of the search engines were relatively simple beasts and worked on keywords and backlinks.
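To make the idea concrete, here is a hypothetical sketch (in Python) of how such an early, keyword-and-backlink engine might have scored pages. The scoring formula, the pages and the backlink counts are all invented for illustration; no real engine’s algorithm is being reproduced here.

```python
# A toy illustration (not any real engine's algorithm) of scoring
# based only on keyword matches and backlink counts.

def early_score(page_text: str, query: str, backlink_count: int) -> float:
    """Score = keyword matches in the page, boosted by backlinks."""
    words = page_text.lower().split()
    keyword_hits = sum(words.count(term) for term in query.lower().split())
    # Backlinks act as a crude popularity multiplier.
    return keyword_hits * (1 + backlink_count)

# Two invented pages: (page text, backlink count).
pages = {
    "honest-page": ("practical seo advice for manufacturers seo", 10),
    "keyword-stuffed": ("seo seo seo seo seo seo seo seo", 2),
}
ranked = sorted(pages,
                key=lambda p: early_score(pages[p][0], "seo", pages[p][1]),
                reverse=True)
# The keyword-stuffed page outranks the honest one.
```

Note how easily such a scheme is gamed: repeating the keyword is enough to beat a better page with far more backlinks, which is exactly the chaos described next.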
Anyone with a little knowledge could game these early search engines. Hence, several years of chaos followed until the search engine businesses started to get a grip on the situation.
From the late 1990s through the early 2000s, there was no great effort to digitise existing print content. So what did the search engines need as they grew? That’s right, something to search for. Hence, search engines rewarded (via improved visibility) those who produced high-quality content and delivered it online.
Businesses with an understanding of SEO (keywords and links), good quality content and a consistent publishing schedule could rank, be found and grow their business. They could raise awareness of their products or services and generate leads.
For a time, online marketing worked; that is, it worked organically, with limited (if any) use of advertising. It’s important to note that those who built authority during this period largely retain that authority. Latecomers to the party are at a distinct disadvantage.
The search engines (Google was the major player by this stage) became victims of their own success. They based their ranking process on links and content, but that allowed individuals and businesses to game the system.
As a result, the search engines were at risk of delivering a poor user experience. That made online advertising less attractive, so something had to be done. In relatively quick succession (2011 to 2013), Google rolled out the Panda, Penguin and Hummingbird algorithm updates.
Penguin addressed dubious link schemes. Panda aimed to promote high-quality, relevant and original content – we will return to that below. Hummingbird was different: it aimed to understand intent and context, i.e. semantic search.
From late 2012 (the Knowledge Graph) onwards, Google has built on this model by incorporating artificial intelligence (RankBrain in 2015, BERT in 2019, neural matching ongoing) into its algorithms. This allows the algorithms to understand (to a point) the user’s query, the context and their intent. The algorithms can also understand (again, to a point) the relationship between things (entities).
How Ranking On Google Works Today – An Opinion
So, we have moved on from ranking based on keywords in content and backlinks to ranking algorithms that are much more sophisticated. Here is a (simplified) opinion on how ranking in Google works today.
First, remember this: Google’s purpose is to return a selection of results in response to a query, not necessarily the best results.
Google intends to deliver a few best answers rather than provide many pages of great possibilities. They provide a mass-use tool for the average surfer who wants to search and go rather than a research tool. – Eric Ward, from the Ultimate Guide to Link Building.
Google does not care about ranking your web page correctly. This is not their objective. Their systems are designed around providing the best SET of results to users for a given search query. – Eric Enge, from a forum discussion with Michael Martinez.
So instead of competing for up to ten (organic) positions on page 1 of the search results, you are really competing for three or four.
All that follows assumes webpages have no technical SEO issues and allow crawling and indexing. Additionally, I assume webpages deliver a positive user experience, including speed and layout.
At the top level, Google expects a site to be trustworthy and have authority. These two elements are closely linked. I will add a subset to authority, defined as popularity.
Some sites Google knows to be trustworthy (authoritative) regardless of other signals. For news outlets that might be the BBC. In electronic memory components, it might be Micron. In pharmaceuticals, it might be Pfizer. They are (as perceived by Google) trustworthy sites in their sector – end of story.
Beyond the obviously trustworthy sites, the search engines have some way of allocating a site a trust/authority score. This is a comparative measure (your site vs a competitor’s) and is signal-driven. In the late 1990s, those signals were backlinks and keywords, but now it is much more than that.
There is much talk in the SEO community about EEAT (experience, expertise, authoritativeness, trustworthiness). Authority and trust are touched on above, but taking EEAT as a group, how would a search engine measure such a thing? Anyone can make whatever claims they wish on a webpage, and it’s easy to fake reviews and testimonials.
One possible measure is what others say about a business or entity. Therefore, I believe Google has some way of measuring “popularity.”
At a simplistic level, this might be mentions of an entity on news or social sites. It could be measured via click-through rate (CTR), social shares or time on site.
User experience (UX) might be an element of popularity, and brand mentions could also have an impact. How is popularity measured? I don’t know, but elements such as CTR are too easy to game.
To a point, this is a circular argument, as content quality impacts authority (and popularity), as we discuss in our “the purpose of a manufacturer’s blog” post.
So, to summarise:
- You will always be outranked by the long-established big players in any market and the major news outlets.
- If your market is highly competitive and/or your competitors were first to spot the opportunity and have a strong web presence, you will find it difficult to compete.
- “Popularity” is the new backlinks. In competitive markets, you will struggle without it.
Why is Authority/Trust (or EEAT, if there is such a thing) so important? Because I believe it is the first screening criterion.
Your webpage sits indexed with millions of others, ready to service a user’s query/locale/intent. When that request comes in, Google makes a first selection (first screen) of pages. If your page misses the cut, you are out of the game. Any further measures down the line, which you might have passed with flying colours, don’t matter.
In my view, a first-pass bias towards authority and trustworthiness skews the search results. It restricts good quality content on any website without that authority (or popularity) rising to the top. The lack of authority might be because it is a new website (or business) or for many other reasons.
Someone nobody has ever heard of, sitting at a cramped desk in the corner of a bedroom, comes up with the unifying theory of physics. He or she excitedly publishes it on their blog, which covers random topics from gardening to astrophysics. Will it find its way to the top spot in Google’s SERPs if someone searches for a unifying theory of physics? Nope.
Of course, Google, and other search engines, are aware of this problem. They do have mechanisms to bring forward new content. I don’t believe, at this time, those mechanisms are very good.
The next elements to consider are intent and relevance. As discussed above, most of the recent algorithm changes implemented by Google relate to this issue.
Google (as the dominant player in online search) has moved on from keywords to understand the intent of the query. What is the searcher looking for? From there, what content from the vast amount available is most relevant to satisfy that search intent?
However, as described, many of the Google tools involved use deep learning techniques. Deep learning is relatively slow and expensive, so my guess is it only comes into play at a second (or later) screen. It might only be used at the final screening step.
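Putting this opinion of the funnel together, here is a hypothetical sketch of the screening sequence: a cheap first pass on authority/trust, followed by an expensive relevance model run only on the survivors. Every function, page, score and threshold here is invented for illustration; this is emphatically not Google’s actual pipeline.

```python
# A hypothetical sketch of multi-stage screening: filter cheaply on
# authority first, then spend the expensive relevance model only on
# the pages that survive. All data and thresholds are invented.

def first_screen(pages, authority_threshold=0.5):
    """Cheap first pass: drop pages below an authority/trust bar."""
    return [p for p in pages if p["authority"] >= authority_threshold]

def expensive_relevance(page, query):
    """Stand-in for a slow, costly deep-learning relevance model."""
    overlap = set(page["content"].lower().split()) & set(query.lower().split())
    return len(overlap)

def rank(pages, query):
    survivors = first_screen(pages)              # first screen: authority
    return sorted(survivors,                     # second screen: relevance
                  key=lambda p: expensive_relevance(p, query),
                  reverse=True)

pages = [
    {"url": "new-site.example/great-article", "authority": 0.2,
     "content": "a deep unifying theory of physics explained"},
    {"url": "big-brand.example/ok-article", "authority": 0.9,
     "content": "physics news roundup"},
]
results = rank(pages, "unifying theory of physics")
# Only the high-authority page survives, even though the new site's
# content is a far better match for the query.
```

The point of the sketch: the highly relevant page from the low-authority site never reaches the relevance stage at all, which is exactly the fate of the bedroom physicist above.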
Perhaps your content would have been a great fit when measured against intent/locale/quality. If only it had not fallen at the first fence.
Information (content) Delivery
So how does the above affect marketers? We could argue all day long about definitions and detail, but let’s try to keep things simple.
Before the internet came along, marketers were trying to create awareness of the firm and its product or service among a target audience. They were also trying to secure sales leads from the proportion of that audience in the market at any time. They were supporting existing customers, promoting retention and word of mouth. Those objectives (I suggest) have not changed.
If content does not show up on the first page of the SERPS, the chances of consumption by a target audience are minimal. Information needs to be set up (optimised) in some way to show up (rank), but there remains a barrier of authority/trust.
First, you need authority/trustworthiness as perceived by the search engines. Then you need exceptional content that is different to all the rest. That exceptional content is one (of many) pillars of popularity, which, in turn, is a key element of trustworthiness/authority.
Can you build authority/trust? Yes, you can, but simply adding more and more online content will not help. Here’s the interesting part: To succeed you will probably need to use age-old tactics like advertising and PR (mainly offline). This we will cover in more detail in a future blog post.
So What’s The Alternative?
Here’s a suggestion: first, concentrate on existing customers. You don’t necessarily have to compete online to reach them.
Second, deliver information to existing sales leads and (possibly) prospects directly. Go back and review the sales tactics used pre-internet to achieve this goal.
Lastly, embrace what the sudden emergence of AI-driven technology brings (discussed in our information delivery challenges post).
In our humble opinion, the world is moving from an open web (anything you could possibly want on the internet) to a closed world of communities and knowledge bases.
Potentially, that opens the field to the lesser-known (little online authority) business. Best to be ready for the change.