Conductor’s SEO Strategist, Rory Truesdale, took to the stage to reveal that with the power of Python and a little bit of patience, scraping the SERPs and manipulating that data is a surefire way to uncover what your customers are most interested in and 10x your marketing strategy.
Read on for an overview of his talk and check out the slides for a step-by-step tutorial (complete with Dropbox and resource links, because not everyone knows Python!).
Speaker’s name: Rory Truesdale
Job role and company: SEO Strategist at Conductor
Link to Twitter profile: @RoryT11
Link to LinkedIn profile: https://www.linkedin.com/in/rory-truesdale-18061324/
Link to the slides: https://www.slideshare.net/RoryTruesdale/brightonseo-2019-mining-the-serp-for-seo-content-customer-insights
What was the talk about?
Rory eloquently took us through how the SERPs are a great resource for learning what Google “thinks” our customers want. It turns out that Google rewrites the SERP-displayed meta description a whopping 84% of the time to try and serve users best.
But with Python, we have the power to deconstruct and analyse the language in SERP-displayed content to learn what Google thinks our users are interested in: an essential consideration in the age of search intent.
Once you’ve scraped the SERP (check the slides to find out exactly how to do this) you need to tidy your data by:
- Switching to lowercase and removing punctuation
- Removing stop words
- Tokenization (the process of chopping up a sentence into individual pieces)
- Lemmatization (the process of converting a word to its root)
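The tidy-up steps above can be sketched in a few lines of Python. This is a minimal, standard-library-only illustration: the stop-word list and lemma lookup are tiny stand-ins invented for the example, and a real workflow would use a library such as NLTK or spaCy instead (check the slides for the full approach).

```python
import string

# Tiny illustrative stop-word list; a real workflow would use NLTK's full list.
STOP_WORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "for", "is", "are"}

# Tiny illustrative lemma lookup; NLTK's WordNetLemmatizer does this properly.
LEMMAS = {"running": "run", "shoes": "shoe", "reviewed": "review"}

def tidy(text):
    # 1. Switch to lowercase and remove punctuation
    text = text.lower().translate(str.maketrans("", "", string.punctuation))
    # 2. Tokenize: chop the sentence into individual pieces
    tokens = text.split()
    # 3. Remove stop words
    tokens = [t for t in tokens if t not in STOP_WORDS]
    # 4. Lemmatize: convert each word to its root form
    return [LEMMAS.get(t, t) for t in tokens]

print(tidy("The BEST running shoes for marathon training, reviewed!"))
# → ['best', 'run', 'shoe', 'marathon', 'training', 'review']
```

Each cleaned description then becomes a simple list of root words, ready for the analysis workflows below.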
He also delved into several analysis workflows that will help us understand the search intent of the people we want to reach, including:
- Co-occurrence: How many times does a word or combination of words appear?
- Part of speech tagging: What are the most frequently occurring nouns, verbs and adjectives?
- Topic modelling: Can we use natural language processing to uncover topical trends in the SERPs?
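To give a flavour of the first workflow, co-occurrence counting needs nothing more than `collections.Counter`. The descriptions below are made-up, pre-tidied tokens standing in for real scraped SERP content:

```python
from collections import Counter

# A handful of invented SERP-displayed descriptions, already tidied into tokens.
descriptions = [
    ["best", "running", "shoes", "beginners"],
    ["top", "running", "shoes", "reviewed"],
    ["best", "trail", "running", "shoes"],
]

word_counts = Counter()
bigram_counts = Counter()
for tokens in descriptions:
    # Single-word frequencies
    word_counts.update(tokens)
    # Adjacent word pairs capture which terms co-occur together
    bigram_counts.update(zip(tokens, tokens[1:]))

print(word_counts.most_common(3))   # most frequent single words
print(bigram_counts.most_common(1)) # most frequent word pair
```

On this toy corpus, “running shoes” surfaces as the dominant pair, exactly the kind of phrase-level signal Rory uses to infer what Google thinks searchers care about. Part-of-speech tagging and topic modelling follow the same pattern at scale, typically via NLTK's tagger and a topic model such as LDA.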
Finally, and most importantly, he explained how to use these insights to improve the quality of onsite optimisation. In short, target the keyword but optimise for user intent, query context, topical relevance and word relationships.
“We are in the age of semantic search… Google isn’t ranking a page based on how it uses a keyword. Google provides accurate results based on intent, query context and word relationships.” – Rory Truesdale
Potential impact on the industry
With some minor tweaks, we can make our scripts work across a huge corpus of user-centric content. Plus, there’s the potential to ramp up and apply sentiment analysis to a range of sources, such as product and Google My Business reviews, to produce useful visualisations and identify what really matters to customers.
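At its simplest, that kind of sentiment pass can start from a lexicon score. A toy sketch, with an invented lexicon and invented reviews; a real workflow would use a proper model such as NLTK’s VADER:

```python
# Toy sentiment lexicon, invented for illustration only.
LEXICON = {"love": 1, "comfortable": 1, "great": 1,
           "broke": -1, "disappointed": -1, "poor": -1}

def score(review):
    # Sum the polarity of each known word; unknown words score 0.
    return sum(LEXICON.get(word, 0) for word in review.lower().split())

reviews = [
    "Love these shoes so comfortable",
    "Sole broke after a week disappointed",
]
for review in reviews:
    print(review, "->", score(review))
# → 2 for the first review, -2 for the second
```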
Ultimately, scraping the SERPs in this way offers a significant amount of insight into what our customers’ search intent really is, giving businesses a more competitive edge.
With co-occurrence analysis, we can use these results as an additional source of data for keyword research, identify topical content gaps on landing pages and optimise landing page content by incorporating semantically relevant phrases.
With part of speech tagging, we can create landing pages that are aligned with the intent of the searcher, help copywriters understand the language and the desires of a target audience, and tactically incorporate more semantically relevant phrases into landing pages.
And with topic modelling, we finally have a valuable data point to reference for content ideation; it can inform internal linking and content recommendations across a website, and give us the power to incorporate topically relevant phrases into existing pages to improve semantic relevance.
- SERPs give us amazing insight into what customers want
- Python makes getting these insights at scale accessible
- Use these insights to align landing pages with intent and semantic value
- Scripts we create allow us to get these insights from lots of other user-centric sources beyond the SERPs