How To Refine Your SEO Strategy with Cora

16 October 2020

Posted in: SEO

Intrigued to know how Cora can further boost your SEO efforts? We asked Seeker founder and dab hand SEO, Gareth Simpson, to share his tips on how to make the most of this correlational analysis tool. You’ll learn how best to align a website with search terms, plus how to guide purposeful content creation to give it the competitive edge.

First, our Cora SEO story…

At Seeker, we instinctively optimise based on what we know is true of search engines. This is the fruit of our collective knowledge as an industry.

We also optimise based on what’s worked in the past for our websites, and what’s most appropriate to each client’s situation.

We work through a process of trial and error, monitoring rankings and user signals and making adjustments as we go.

After website strategy, this often starts with market and keyword research, aligning search terms with pages and ensuring each page serves the user’s purpose.

We also gain insight from competitor analysis—discovering what works for them, deciding if it’s relevant, and then taking action.

We apply logical reasoning. For example, if Amazon or eBay is ranking for a term, despite lacking characteristics that we know will help with ranking, it’s likely due to the overall strength of the domain, page or one of many other signals.

As SEOs, we factor this into our analysis and eliminate these elements from our on-page recommendations.

We try to spot patterns ourselves, but we can only analyse a relatively small dataset manually—we’re only human! Still, that’s far less than a computation engine like Google can process.

Ultimately, to use a cliche, we focus on great content with guidance from SEO. 

We have a big content team, and our ratio of writers to SEOs is exceptionally high for an SEO agency. In fact, content is the biggest team in the company.

Of course, we train professional copywriters on the nuances of SEO so that they apply it to their work as they go. And it seems to have worked well. 

In Seeker’s five years:

  • Our clients have grown from £3,000 in ecommerce revenue to £150,000.
  • We’ve grown from a one-man-band to a team of 34.

But I’m always curious about how we can continue to innovate. 

Four years ago, I was fascinated to discover the correlational tool Cora SEO. And I’ve continued to explore how this tool (and others) can help with on-page optimisation ever since.

Here, you can read what I’ve learned, and unpack how you might use Cora SEO to advance your own work.

A brief history of ranking factors

When I started working in the web industry 15 years ago, ranking was quite easy.

And that’s because the algorithm was simpler and much easier to understand.

It was so simple, the industry gamed the system for profit. 

While this helped many individuals and businesses rank, it degraded the user experience. Unfortunately, undeserving content could rank highly because the algorithm was more influenced by SEOs’ skills than the merits of the website.

So when the search engines upped their game, the industry evolved too. 

General correlational studies started to appear. The Moz study was the most prominent, surveying leading professionals on what they perceived to be the most important ranking factors.

Whilst a subjective survey, the results still helped the SEO community focus their strategy on areas that mattered.

Then, later, more advanced correlational studies came into play.

Typically, they involved scraping large datasets and measuring the characteristics that may affect ranking. This was then plotted against ranking positions to identify correlations and trends.

A few examples of advanced correlational studies include:

Ahrefs Search Traffic Study

A favourite of mine (and possibly one of the largest), Ahrefs’ recent study looked at off-page factors. Ahrefs analysed two million random newly-published pages to find out how many ranked in Google’s top 10 search results—only 5.7% did.

Using a mixture of the data presented and our interpretations, we’ve applied rules learned from the study to help formulate our link building strategies. 

Brian Dean (Backlinko)

Then there was Brian Dean’s more recent study that looked to answer the question: which factors correlate with first-page search engine rankings? This uncovered findings such as the number 1 result in Google having an average of 3.8x more backlinks than positions 2-10.

Parts of the study were called out as inaccurate. However, Brian himself noted that ranking factors are controversial and speculative. Given the less-than-transparent nature of ranking factors, these figures should be treated as food for thought rather than absolute truths.

But how accurate are these studies?

Some SEOs refute the validity of such correlational studies, suggesting that they’re too narrow, or don’t provide the complete picture. And good scientific studies are typically peer-reviewed, to critique the methods used.

It’s healthy to be wary—I’ve seen SEOs claiming to run ‘single variable tests’. And with so many factors at play, it’s fair to ask: is that even possible?

What’s more, there may be a conflict of interest. For example, a brand may have a vested interest in particular software solutions, or an individual may want to increase their credibility to advance their personal branding. That’s not to say this is always the case, but it’s wise to take these studies with a pinch of salt.

Seeker’s CᐧUᐧLᐧT of SEO

In our training at Seeker, and when educating clients to take a holistic approach to SEO, we loosely categorise ranking factors into four pillars, using the mnemonic CᐧUᐧLᐧT. We’ll dive deeper into the specifics in another blog post, but as a brief overview:

Content 

Content and relevance signals, both on- and off-page.

User

User signals such as brand sentiment, user experience and behaviour. This is affected by other performance factors such as site speed.

Links

Link and authority signals. We ultimately validate the content signals using authority and trust—things like link popularity, quality of links and brand equity.

Technical

And, last but not least, technical elements. Less subjective than the others, our role here is to facilitate search engines with indexing, crawling, site speed, security etc.

So there you have it:

  • Content: relevancy of the content for the search query; user intent
  • User: user signals; personalisation; geography
  • Links: authority; validation; links; shares
  • Technical: search engine friendliness; crawling; indexing; health of the website and code

Today’s ranking algorithms 

So, why is Seeker’s approach to ranking so broad and varied? 

Because information retrieval algorithms have become so advanced and complex, with many caveats, layers and ranking factors, that it’s almost impossible to reverse engineer these systems.

And the goalposts are always changing. Because websites are inherently different in their makeup, what works for one site may not work for another—the factors are relative to each other. And that’s one of the reasons why general correlation studies are criticised. 

Conclusions are often drawn from such narrow datasets that they can’t be relied upon. Even Google engineers have said they don’t fully understand what certain algorithms are doing.

Nonetheless, it’s useful to recap some of Google’s most pivotal algorithm updates, and their potential impact on rankings.

RankBrain

RankBrain is a part of Google’s core algorithm that uses machine learning to determine the most relevant results for search queries. It’s thought that the algorithm can apply factors like the searcher’s location, personalisation, and the words of the query to determine their true intent. And this means that Google can deliver more relevant results.

BERT

Google’s next big update after RankBrain was BERT. With superior Natural Language Processing (NLP), BERT could understand more of the nuances of human language and further refine the results to increase relevancy to a particular query.

Quality Algorithms (e.g. Penguin & Panda)

While the impact of quality-focused Google updates like Penguin and Panda should be assessed with caution, I suspect that the credibility of most sites can be demonstrated by E-A-T: Expertise, Authoritativeness and Trustworthiness.

Whilst many SEOs claim that E-A-T and the checks in the Search Quality Rater Guidelines aren’t ranking signals, they provide useful insight into what’s valued in terms of site quality. And as we want to future-proof our sites, it’s wise to review the guidelines and implement what’s feasible for your website and budget.

Why use tools like Cora?

Cora aims to help you gain an edge over competing pages, and it does so by focusing on some of the CULT pillars I’ve just discussed. Here, I’ll specifically be looking at on-page content, search terms and relevancy.

What are our goals?

  • Optimise on what we perceive to be true (collectively, as an industry).
  • Optimise based on the specifics of your (or a client’s) site.
  • Get more contextual by analysing who else ranks for these terms—effectively providing a gap analysis and blueprint to further optimise your content.

How to group terms

At Seeker, we take a data-driven approach to on-page optimisation, with lots of human reasoning and common sense applied. So we group terms by topic and intent, then use that information to direct the content hierarchy of pages.

Terms fall under one of the following categories:

  • Domain
  • Sub-folder
  • Page
  • Section of a page
  • Mention in a sentence

By grouping the site’s terms in this way (supported by internal linking), we help search engines understand how pages relate to one another. Of course, there are variations of structure, and search engines can understand the differences. This helps to ensure pages aren’t cannibalising each other.
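As a simple illustration of the idea (the terms and level names here are hypothetical, not from any real project), the grouping can be modelled as a plain mapping from hierarchy level to the terms assigned to it:

```python
# Hypothetical term-to-hierarchy mapping, mirroring the categories above:
# domain > sub-folder > page > page section > sentence mention.
site_term_map = {
    "domain": ["coffee equipment"],
    "sub-folder": ["espresso machines"],
    "page": ["manual espresso machine"],
    "page-section": ["best manual espresso machine under £200"],
    "sentence-mention": ["how to descale an espresso machine"],
}

def target_for(term):
    """Return the hierarchy level a term has been assigned to, or None."""
    for level, terms in site_term_map.items():
        if term in terms:
            return level
    return None

print(target_for("manual espresso machine"))  # → page
```

Keeping the mapping explicit like this makes cannibalisation easy to spot: two terms with the same intent assigned to different pages is a red flag.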

After we’ve logically broken up the content, we optimise the pages. 

And once there’s nothing else to optimise, we bring in tools like Cora SEO.

How correlational tools work

Generally, correlational tools take a page and a search term, scrape a scope of results in Google and identify patterns in the elements we know to be important content signals.

At a basic level, these are things like:

  • Page titles
  • Headings
  • Terms in paragraphs

At a more advanced level, we’re looking at:

  • Salience
  • Sentiment

Whilst general correlational studies often focus on wide scopes and industries, tools such as Cora SEO are contextual. The idea behind tools like this:

Every Search Engine Results Page (SERP) is different > the rules are different > the weighting is different.

For example, when you’re searching for a flight, you behave differently to when you’re reading the news. So you can expect the British Airways website to behave very differently to The Guardian. Their marketing practices are different, and the makeup of their elements is different.

So if the algorithm behaves differently in different sectors, these tools aim to isolate those factors, providing recommendations based on maths rather than assumptions and opinions.
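A stripped-down sketch of that per-SERP approach, using hypothetical scraped data and measuring just one content signal (term frequency) where real tools measure thousands:

```python
from statistics import mean

def term_count(text, term):
    """Count case-insensitive occurrences of a term in a piece of text."""
    return text.lower().count(term.lower())

term = "standing desk"

# Hypothetical scraped data from the top-ranking pages for this term
competitors = [
    {"title": "10 Best Standing Desks (2020)", "body": "standing desk " * 14},
    {"title": "Standing Desk Buyer's Guide", "body": "standing desk " * 9},
    {"title": "Why a Standing Desk?", "body": "standing desk " * 11},
]

my_page = {"title": "Desks for Home Offices", "body": "standing desk " * 3}

# Compare your page against the average of the pages already ranking
avg_body = mean(term_count(p["body"], term) for p in competitors)
print(f"Competitors mention the term {avg_body:.1f} times on average; "
      f"your page mentions it {term_count(my_page['body'], term)} times.")
```

Because the competitor set is scraped per SERP, the benchmark adapts to each query’s context, which is the key difference from the general, industry-wide studies discussed earlier.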

What is Cora?

Cora SEO is a comprehensive analysis tool—more so than many other popular choices such as SEO Powersuite, PageOptimiser Pro and Surfer.

Why Cora?

It’s software aimed specifically at SEOs, which means that, to begin with at least, it can be complex and overwhelming. But it also gives you more control, pulling out data for you to action in your own way.

Cora checks over 2,000 elements and plots the correlations. It’s not a blueprint that guarantees any site will rank, but it does remove uncertainty around particular elements.

What’s coming in the future? 

So, what’s the next big thing in SEO? We all know that’s impossible to predict! But when Google can measure user behaviour more reliably, there’s bound to be a greater focus on user signals.

During the Google Medic update, one of our clients saw their website rankings plummet, and we didn’t know what to do.

So we took a subjective approach, working through the search quality guidelines and implementing everything we could. We also switched to an E-A-T (Expertise, Authoritativeness and Trustworthiness) content marketing strategy, focusing on links from high-quality resources, with many more references, studies and editorial policies.

And once that had all been implemented? 

We used Cora to identify any other gaps that we could focus on.

It’s how the site recovered and, ultimately, it allowed us to analyse what had worked well, and apply our learnings to future sites and projects. 

Can we pinpoint exactly what worked? Not specifically. 

But there was a lot of trial-and-error, and some of our decisions were influenced by Cora.

SEO will always require a holistic approach. But the most important thing is to keep testing.

Ultimately, all of the things we tested were good for the site, and good for the brand.

And Cora’s correlational analysis was an influential part of that.

Ready to push your SEO strategy to the next level? We wield a variety of SEO tools and expertise to help amplify your brand—get in touch with our SEO specialists to find out more.
