(Editorial note: I originally wrote this post over on the Hit Subscribe blog. I'll be cross-posting anything I think this audience might find interesting, and I've also started a Substack to which I'll syndicate marketing-related content.)
In the last few months, I’ve been burning the midnight oil. By day, I run Hit Subscribe. But by night and weekend, I’ve buried myself in a sea of client analytics data, building out our content performance monitoring alpha offering.
I say this not to complain. After years of pretending to know what I’m doing in marketing, it’s fun to return to my roots of pretending to know what I’m doing in software engineering. I mention the midnight oil because it has served as fuel for deep, interesting insights into refreshing content and traffic recovery.
And today I want to offer up those insights to you in the form of a clear, actionable traffic recovery playbook.
Setting the Scene: What Happened to Our Traffic?
Imagine that you’re responsible for content on your site and your organic traffic graph looks like this.
Sooner or later, you're going to have an uncomfortable conversation about how and why you've presided over a 40% traffic decline. From my outsider's perspective, this most commonly happens following an acquisition or perhaps a change in leadership. After the obligatory pleasantries, one of your first professional encounters with whoever is now in charge is explaining this graph.
There’s a pretty good chance that you have many valid reasons. Someone cut the content budget. A staff writer quit and the backfill took forever. The recent site redesign performs as well as a walrus on a unicycle.
But even as you say these things, they’ll sound like excuses to you. And they’ll absolutely sound like excuses to the other party.
So here’s what you say instead:
"I can do a detailed postmortem write-up on the traffic performance if you want. But if you're interested, I have an actionable plan for how to recover the traffic, and I can show you that."
These two sentences will absolutely and completely reset the conversation. You just need to be able to deliver on the actionable plan. And that’s what I’m going to hand to you in the following sections, drawing on our now-unfair advantage of tons and tons of data.
Traffic Recovery in Five Choreographed Steps
To turn “down and to the right” into “up and to the right,” here are five things that you should do in parallel, in priority order.
- Fix acute, technical problems afflicting traffic-earning URLs.
- Execute touch-up refreshes on slowly declining URLs.
- Plan and execute comprehensive refreshes on underperforming URLs.
- Consolidate and de-cannibalize cannibalizing URLs.
- Plan and execute new content.
And here’s what that looks like in spreadsheet form, with more detail.
Here’s a quick legend for the columns in the sheet.
- URL signal is what you’re observing in terms of the URL’s traffic pattern.
- Effort is the relative amount of combined person-hours that goes into each of these.
- Skills are the skillsets you’ll need to bring to bear to properly execute the tactic.
- Who is a loose capture of the role of the person required.
- Traffic payoff is when you can expect results. I’m basing this on a lot of empirical measurement at this point.
- Time to rank is how long it takes new content to rank in the upper half of target SERPs (and thus earn significant traffic). The last three tactics all vary based on time to rank.
If you’ve never measured time to rank, you can ballpark it based on your site’s domain authority. That’s in tab 2 of the sheet, but here’s what it looks like:
Prerequisites: What You Need
Before going through each of these steps in detail, let’s look briefly at what you need in order to make this happen.
In terms of skill sets, you’ll need to have at your disposal someone with decent working knowledge of technical SEO (in the event of acute problems), someone savvy about keyword research and ranking, and a subject matter expert (SME) in whatever your content is about. Depending on whether there are acute problems and what those are, you also might want your web developer handy.
In terms of tooling, you really just need some kind of analytics installed on your site. The dashboarding I’m building acts (by design) as an easy-mode cheat code for this, but you can make it happen with basic analytics and some determination. You just need the ability to examine traffic to your individual URLs over time.
I wouldn’t rely on an external tool for traffic data. Something like Ahrefs is handy for analyzing trends, but it’s too imprecise here, where the details matter. A tool like that, with SEO audit functionality built in, might help you with any acute technical SEO triage.
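If you're doing this with basic analytics, the per-URL view can be as simple as a script over an export. Here's a minimal sketch in Python, assuming you can export monthly organic sessions per URL as a CSV; the column names and sample rows are hypothetical stand-ins for whatever your analytics tool actually gives you.

```python
import csv
import io
from collections import defaultdict

# Hypothetical export: one row per (url, month) with organic sessions.
# In practice this would come from GA4, Search Console, or your analytics tool.
SAMPLE_EXPORT = """url,month,sessions
/blog/widget-guide,2024-01,1200
/blog/widget-guide,2024-02,1150
/blog/widget-guide,2024-03,990
/blog/other-post,2024-01,300
/blog/other-post,2024-02,310
/blog/other-post,2024-03,95
"""

def load_traffic(csv_text):
    """Return {url: {month: sessions}} so each URL can be examined over time."""
    traffic = defaultdict(dict)
    for row in csv.DictReader(io.StringIO(csv_text)):
        traffic[row["url"]][row["month"]] = int(row["sessions"])
    return traffic

traffic = load_traffic(SAMPLE_EXPORT)
for url, series in traffic.items():
    months = sorted(series)
    print(url, [series[m] for m in months])
```

That per-URL, per-month structure is all the later steps really depend on.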
Executing the Recovery Plan
Now on to execution. I would suggest doing all of these things in parallel, if you have the bandwidth. But I have prioritized them in order of what will bring the most traffic the fastest.
(I realize “acute problems” and “touch-up refresh” don’t strictly follow that order, but acute problems can sandbag all of the other items, so you want to do those first if they exist.)
1. Fix Acute Problems
First and foremost, you should fix acute problems with your site. These can, of course, take the form of shotgun-foot issues like your site being down or changing your entire URL scheme without redirects. But they often take subtler forms and sometimes afflict only one or a handful of URLs.
If you’re looking at a URL’s traffic and see something like this, you have an acute problem on your hands.
As you inventory your URLs, you might find yourself horrified to discover occasional flatlines that have lasted indefinitely because you, say, accidentally re-saved a post as a draft. Relax, it happens to way more sites than just yours. If you fix the problem, you'll find that your position in search is more durable than you'd think.
Cataloging and explaining all of the ways you can set a URL’s traffic on fire is out of scope for this post. But go through your site looking for URLs that have declined very sharply in traffic, figure out why that happened, and fix it.
Bounceback tends to happen quickly.
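If you want a programmatic first pass at that triage, here's a rough sketch that flags URLs whose latest month cratered against their trailing average. The data shape and the 60% threshold are illustrative assumptions to tune against your own site, not a rule.

```python
# Hypothetical monthly organic sessions per URL ({url: {"YYYY-MM": sessions}}).
traffic = {
    "/blog/widget-guide": {"2024-01": 1200, "2024-02": 1150, "2024-03": 1100, "2024-04": 1080},
    "/blog/other-post":   {"2024-01": 300,  "2024-02": 310,  "2024-03": 290,  "2024-04": 40},
}

def flag_acute_problems(traffic, drop_threshold=0.6):
    """
    Flag URLs whose latest month fell by more than drop_threshold
    (0.6 = a 60% drop) versus the average of the prior months.
    """
    flagged = []
    for url, series in traffic.items():
        months = sorted(series)
        if len(months) < 3:
            continue  # not enough history to judge a drop
        latest = series[months[-1]]
        baseline = sum(series[m] for m in months[:-1]) / (len(months) - 1)
        if baseline > 0 and (baseline - latest) / baseline >= drop_threshold:
            flagged.append((url, round(baseline), latest))
    return flagged

for url, baseline, latest in flag_acute_problems(traffic):
    print(f"{url}: ~{baseline}/mo -> {latest}/mo; investigate for an acute problem")
```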
2. Execute Touch-Up Refreshes
Next up, identify candidates for a touch-up refresh. A lot of SEO folks recommend refreshing content every six months or whenever it loses position for its primary keyword. There's nothing wrong with that; in fact, it's how we used to recommend thinking about refreshes.
But these days we have a finer-tuned early detector. We look for a gradual, non-seasonal decline pattern like this one, which started in September.
Before a URL starts to meaningfully lose ground on its primary keyword, it suffers gentler attrition on longer-tail keywords. Usually, a refresh executed during this window recovers those longer tails and cements the URL's position for the primary keyword for a while.
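One hedged way to surface these candidates at scale is to compare each URL's recent months against the same months a year earlier, which controls for ordinary seasonality. The 10 to 40 percent band and the sample numbers below are assumptions to tune for your site.

```python
# Hypothetical monthly organic sessions per URL, keyed "YYYY-MM".
traffic = {
    "/blog/widget-guide": {
        "2023-09": 1000, "2023-10": 1010, "2023-11": 990,
        "2024-09": 860,  "2024-10": 840,  "2024-11": 800,
    },
}

def touch_up_candidates(traffic, window=("09", "10", "11"),
                        prev_year="2023", this_year="2024",
                        min_drop=0.10, max_drop=0.40):
    """Flag URLs down 10-40% year over year: gently declining, worth a touch-up."""
    candidates = []
    for url, series in traffic.items():
        before = sum(series.get(f"{prev_year}-{m}", 0) for m in window)
        after = sum(series.get(f"{this_year}-{m}", 0) for m in window)
        if before == 0:
            continue  # no prior-year baseline to compare against
        drop = (before - after) / before
        if min_drop <= drop <= max_drop:
            candidates.append((url, round(drop * 100)))
    return candidates

for url, pct in touch_up_candidates(traffic):
    print(f"{url}: down {pct}% vs. the same months last year -> touch-up refresh")
```

Anything declining more steeply than that band probably belongs in the comprehensive-refresh bucket below.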
The payback tends to come nearly immediately. Here’s a graph of the traffic to twelve URLs in a collective state of decline as of June, with a red mark where a mass-refresh took place. The result was a roughly 60% collective traffic increase in the next month.
This is also one of the easiest imaginable interventions. I'm sure every SEO tool on earth has a refresh guide on its site that you can follow and do pretty well with. When you're at this light stage of decline, all you really need to do is make sure the text is scannable and the information is current.
3. Execute Comprehensive Refresh
Comprehensive refreshes, on the other hand, are somewhat more involved. You’ll need to deploy this tactic for URLs that have declined for a long time and earn almost no traffic. Or you might need to use this tactic for URLs that have never earned traffic but target a high-volume keyword.
And for these, you'll need more elbow grease than just a touch-up. When we do these, we first assess the winnability of the keyword and get client input on risk appetite. Assuming we want to take a swing at it, we then use Positional to conduct a gap analysis on the target URL, seeing what it lacks compared to the URLs that rank.
Here’s a quick glance at what that looks like on our content lab site, where we can see relative word count, questions answered, and more.
In this case, I think the underperformance is down to having a gen-AI app spit out what appears to be some extremely verbose text and simply pasting it as-is. (This was an experiment I ran last year.) But in most cases, you'd likely find an underperformer light on word count, headers, and searcher questions answered. We'd take this information, turn it into a refresh brief, and execute that brief.
This approach requires enough SEO knowledge to conduct the gap analysis and the subject matter expertise of an author to address the content gaps. You can generally expect traffic recovery in roughly half the time new content takes to rank.
Here’s a recent example with traffic to a handful of underperforming URLs in aggregate.
This is a typical graph for this tactic: very low traffic (hence "underperformer"), with a slight lift and then a large one arriving within a couple of months.
4. De-Cannibalization
The topic of cannibalization and fixing it really deserves its own post. But to summarize very briefly, cannibalization occurs when you create a number of URLs that all more or less target the same keyword. There are two common scenarios where this happens:
- A company spends years journal-blogging about (and indexing) whatever is on its mind that week and only later decides to try for organic search.
- An enterprise outfit winds up with similar content across acquisitions or content silos, each with its own separate subdomain.
This also tends to be hard to recognize at the individual-URL level in a vacuum. If you sell widgets, you'll mainly recognize it by saying, "Wow, we're experts in widgets, but none of our posts about widgets earn any search traffic, and we don't rank for 'widgets' anywhere."
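One way to make cannibalization visible is to assign each post a primary target keyword (from your keyword research or rank tracking) and look for keywords with multiple contenders. The URLs and keywords below are hypothetical; treat this as a starting point for an audit, not the audit itself.

```python
from collections import defaultdict

# Hypothetical mapping of URL -> primary keyword each post is trying to rank for.
target_keywords = {
    "/blog/what-are-widgets": "widgets",
    "/blog/widgets-explained": "widgets",
    "/widgets/ultimate-guide": "widgets",
    "/blog/widget-pricing": "widget pricing",
}

# Group URLs by target keyword; any keyword with more than one URL is a candidate.
by_keyword = defaultdict(list)
for url, keyword in target_keywords.items():
    by_keyword[keyword].append(url)

for keyword, urls in by_keyword.items():
    if len(urls) > 1:
        print(f"Possible cannibalization on '{keyword}':")
        for url in urls:
            print(f"  {url}")
```

Anything that prints more than one URL per keyword deserves a closer look before you decide what to consolidate, redirect, or noindex.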
This is also not for the faint of heart. You’ll likely need to remove or noindex some posts, consolidate others, and treat this as a project. It requires heavy SEO confidence and subject matter expertise, and it takes longer to pay off.
That said, the payoff can be substantial.
With de-cannibalization, however, the significance can extend beyond gross traffic figures. It can result in winning keywords that have bedeviled you for a long time. And that also will probably look like a whole separate category of win to new leadership.
5. New Content
The last piece of this puzzle is, of course, new content. Refreshing content can serve as a powerful booster for your traffic, especially if you don't have much history of doing it previously. But it is ultimately no substitute for new content if you want to grow, or even just sustain, your traffic.
Organic traffic to a URL will start to decay after 12–18 months, typically at a rate between 1% and 4%. Without intervention, it winds up closer to that 4% figure, but even with intervention, on a long enough timeline, it still decays: SERPs get more crowded and topical interest declines. (More on traffic declines here, if you're interested.)
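To make the compounding concrete, here's a small sketch that treats those figures as monthly decay rates, which is my assumption purely for illustration; the underlying point stands regardless of the exact period.

```python
import math

def traffic_after(start, monthly_decay, months):
    """Traffic remaining after compounding a monthly decay rate."""
    return start * (1 - monthly_decay) ** months

def months_to_halve(monthly_decay):
    """How many months until traffic halves at a given monthly decay rate."""
    return math.log(0.5) / math.log(1 - monthly_decay)

for rate in (0.01, 0.04):
    print(f"{rate:.0%}/month: 1,000 sessions -> "
          f"{traffic_after(1000, rate, 24):.0f} after 2 years; "
          f"halves in about {months_to_halve(rate):.0f} months")
```

Under that assumption, a 1% rate takes almost six years to halve a URL's traffic, while a 4% rate halves it in under a year and a half.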
New content is your antidote to that.
So while you’re working hard on the refresh front, make sure you’re identifying and targeting winnable keywords with new content. That has the longest payoff period for traffic, typically taking on a hockey-stick shape.
Finding good keywords is generally fairly easy—easier than gap analysis and de-cannibalization. And executing content is fairly easy for an SME. But new content does tend to bog down in planning, ideation, and execution because of its optical importance to most organizations.
Setting Expectations and Looking for Success
To lay out this plan in earnest, look at the traffic patterns for all of your indexed URLs. Then, based on those patterns, sort them into the buckets above, where applicable. (Some URLs won't target organic search and others will be new or still growing, so this only applies to established URLs with traffic, plus underperformers.)
If you then pursue all of these tactics in parallel, you'll realize results that should please the person worried about the decline. Your efforts will bear fruit almost immediately, and substantial lifts will kick in at a few different points between now and when your new content starts to help.
To really drive this home, however, you need to capture some baselines, so that you can report in the way I have in this post. With the tooling I’ve built, it’s trivial for me, but it probably won’t be for you.
Tabulate the traffic data to each bucket of content before and after you execute the tactic. All kinds of wild stuff can impact your site’s overall and organic traffic, so you need to capture the before and after of the specific URLs that you’re working on. Otherwise, your site’s traffic might look flat because all of your organic gains are offset by the decision to start targeting brand keywords with PPC or something.
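A minimal version of that baseline capture might look like the sketch below: bucket your URLs by tactic, then sum their traffic for a window before and after the month you intervened. The buckets, months, and numbers are all placeholders.

```python
# Hypothetical assignments of URLs to the tactic buckets above.
buckets = {
    "touch-up refresh": ["/blog/widget-guide", "/blog/widget-faq"],
    "comprehensive refresh": ["/blog/old-widget-post"],
}

# Hypothetical monthly organic sessions per URL.
traffic = {
    "/blog/widget-guide":    {"2024-04": 900, "2024-05": 880, "2024-06": 1200, "2024-07": 1350},
    "/blog/widget-faq":      {"2024-04": 150, "2024-05": 140, "2024-06": 210,  "2024-07": 240},
    "/blog/old-widget-post": {"2024-04": 20,  "2024-05": 25,  "2024-06": 30,   "2024-07": 60},
}

BEFORE = ("2024-04", "2024-05")   # months before the intervention
AFTER = ("2024-06", "2024-07")    # months after the intervention

for bucket, urls in buckets.items():
    before = sum(traffic[u].get(m, 0) for u in urls for m in BEFORE)
    after = sum(traffic[u].get(m, 0) for u in urls for m in AFTER)
    change = (after - before) / before * 100 if before else 0
    print(f"{bucket}: {before} -> {after} sessions ({change:+.0f}%)")
```

Because each bucket is measured against its own before window, site-wide noise (paid campaigns, brand spikes, unrelated losses) doesn't muddy the story you report.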
To put it succinctly, take before photos so that the after photos mean something.
Hopefully you take this plan and recover yourself a lot of traffic. But I also hope that simply having it in your back pocket and presenting it wins you some brownie points and saves you some stress.