Researching keywords is a huge part of working in the SEO industry, but do you really get meaningful insight from the same old keyword tools? In this talk, Mark Osborne of Blue Array SEO contended that you need to venture down other routes to get quality data.
- Speaker’s name: Mark Osborne
- Job role and company: SEO Manager at Blue Array SEO
- Website: https://www.bluearray.co.uk/
- Twitter profile: @MarkSEOsborne
- LinkedIn profile: https://www.linkedin.com/in/mark-osborne-7a1a1764/
- Link to the slides:
What was the talk about?
Basic keyword data without any context isn’t enough to optimise the performance of your content. You need to get more granular with how you source and parse data. By drawing on sources such as Amazon book previews, social media channels, and Quora pages, you can build a much more representative searcher intent model.
On the Google Cloud Natural Language API demo:
“It’s really great if you’re writing a piece and you want to know what Google thinks of that piece and how it categorizes it: you can put it in this tool and it will show you how it categorizes it.”
Potential impact on the industry
Standard SEO wisdom is based on the classic approach to keywords, but semantic search is increasingly rendering it unnecessary. If webmasters fully adopt a more organic approach to research, we might see the end of the attachment to finely-targeted keyword work.
I’m going to spend some time going back through your slides and looking at the templates you’ve kindly put together for us SEO’s to use.
Thanks once again pic.twitter.com/NiiRiSHcRK
— Dan Wiggins (@_danwiggins) April 15, 2019
- Tools like Ahrefs provide a lot of valuable information, but we need to get smarter with analyzing it.
- Twinword is halfway effective — Google’s Natural Language API is even better when combined with searcher intent models.
- You can use Screaming Frog’s SEO Spider along with XPath to filter SERPs into a spreadsheet for further analysis.
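To illustrate the last point, here is a minimal sketch of the same idea outside Screaming Frog: applying an XPath-style query to a saved SERP page to pull result titles and URLs into rows you could drop into a spreadsheet. The HTML snippet and its class names are hypothetical — real Google markup differs and changes frequently, so any extraction rule needs to be checked against the live page.

```python
import xml.etree.ElementTree as ET

# Hypothetical saved SERP snippet (illustrative only; real SERP markup
# uses different, frequently changing class names).
serp_html = """
<div>
  <div class="result"><a href="https://example.com/a"><h3>First result</h3></a></div>
  <div class="result"><a href="https://example.com/b"><h3>Second result</h3></a></div>
</div>
"""

root = ET.fromstring(serp_html)

# XPath-style query, similar in spirit to a Screaming Frog custom
# extraction rule: grab each result link under a "result" container.
results = [
    (link.find("h3").text, link.get("href"))
    for link in root.findall(".//div[@class='result']/a")
]

# Tab-separated rows, ready to paste into a spreadsheet for analysis.
for title, url in results:
    print(f"{title}\t{url}")
```

In Screaming Frog itself the equivalent would be a custom extraction rule with an XPath expression; the point is the same either way — once SERP data is in tabular form, you can analyse intent patterns rather than eyeballing results.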