Position one for a query is no longer close to enough
I don't know of a single person in publishing who doesn't believe that large language models (LLMs) are going to have a profound impact on the industry. But most of the attention has been on using them to create content, something which many publishers see as a way of increasing efficiency (by which they usually mean reducing expensive headcount).
Whether that is actually possible or desirable is a topic for another time, but what I want to focus on is the other side of AI: what its adoption by Google is going to do to the traffic to publisher sites, and how we should be changing our content strategies to respond.
Google's large language models
It's worth starting by being clear about how Google is using LLMs. The company has two products which use large language models to deliver results for users. The first, and probably the best known, is Bard, which is similar to ChatGPT in that it uses a conversational interface where users ask questions or give prompts in natural language, and the programme responds.
The second – and the one which, I think, should be most concerning to publishers – is Search Generative Experience (SGE). SGE is currently in the experimental stage, but will ultimately deliver answers, generated by Google's large language model, directly at the top of its search results.
As you can see from the example, SGE takes up a lot of real estate in the query result, and delivers a complete answer based on what Google “knows”. Although it gives citations, there is no need to click on them if all you want is the answer to a query.
How this affects publishers
Obviously, anything which sits at the top of search results is going to impact on the amount of traffic which clicks through to publisher sites underneath. And this is potentially worse than anything we have seen before: if the answer to the query is given on Google's page, why would anyone bother to scroll down and click through?
This means the much-fought-over positions one to three will be far less effective than ever before, and publisher traffic will decline sharply.
The impact on different kinds of content
It is likely that some kinds of content will be impacted more than others. Answers to questions are an obvious example: in 2017 they accounted for 8% of searches. That share is likely to have grown already, and it will grow still further as users get used to being able to ask machines questions and get good-quality, tailored answers.
But in its article on SGE, Google highlights a second area where publishers are likely to see a major impact: shopping. Many publishers have put significant effort into creating content focused on affiliate revenue, with some seeing affiliate overtake advertising as a source of revenue. Affiliate content is almost always designed to capture traffic via search, for the simple reason that buying products usually starts with a Google search. An SGE-driven shopping experience will ultimately bypass publishers and send traffic directly to the retailer, with the AI making individually tailored recommendations on what to buy.
This threatens to be disastrous for publishers. Effectively, SGE delivers a one-two punch of reduced traffic as more search queries are answered on the results page, plus reduced traffic to and revenue from affiliate pages.
What publishers should do
SGE is currently in the experimental stage, which means publishers shouldn't see any significant impact for now. But there is a clear direction here: more answers to search queries will be delivered without any click-through to publishers. And product shopping queries are going to become something which Google channels to retailers (who, by complete coincidence, are also advertisers) rather than publishers (who, by and large, are not).
I estimate that publishers have a window of three to five years to adapt their content strategies to this new world, depending on the speed of user adoption. It could be faster: much will depend on how quickly Google's LLM work moves from experiment to delivering real results.
The long-term answer for publishers is to reduce exposure to Google as a source of traffic. That's going to be tough: almost every site I have worked on relied on Google for 60–90% of its traffic. And the more a site was focused on affiliate revenue and e-commerce, the higher that percentage was.
The answer is to focus on increasing your level of direct traffic, making your site a destination for content rather than something users hit once and bounce away from. Learn lessons from marketing: treat every piece of content you create as an opportunity to deepen your relationship with your audience.
There are five things I would recommend publishers start doing today:
1. Refocus your KPIs and OKRs on deepening relationships, not just traffic. Focus on repeat visits and sign-ups. Look to increase the number of qualified email addresses you have (and whatever you do, don't succumb to the temptation to capture more data than that; if you deliver value, you will capture more over time, but all you need now is a person's email address).
2. Reevaluate your search strategy and focus on topics with complexity. The more complex and higher-quality the content, the less likely it is that an LLM can deliver a good version of it. Expertise and depth will be essential, and complex topic areas might be the “last person standing” when it comes to Google searches which still work for publishers.
3. If you have three-to-five-year revenue forecasts, ramp affiliate revenue down over time rather than predicting growth. The era of affiliate revenue as a major contributor will be over. Use the revenue it is still generating to bootstrap other areas.
4. Invest heavily in newsletters. And whatever you do, don't treat them as a place for advertising. Nothing creeps users out more than thinking they have signed up for interesting content only to find it chock-full of ads or sponsored content.
5. Don't think that AI-generated content is going to “save” you. Many publishers are looking at content created by LLMs as a way of lowering costs. It will lower them. But it will also put you out of business. Remember that any content you can create with an LLM, Google can generate better at the top of its results pages. What publishers have in their favour is human talent, creativity and expertise. The more you erode that by using LLMs to cut costs, the smaller your competitive advantage becomes.
Next week I will return to that last topic, and look at the mirage of LLM content and why it's a death-trap for publishers.