SEO Tools

Every trade has its tools, and SEO is no different. There are more ways than ever to spend money on improving your SEO performance, especially for large companies, and it's time consuming to research and try the different options. Once you've selected an SEO analytics tool, switching becomes expensive: you've likely invested considerable time training yourself and your team on its many features to ensure you're getting value from the tool.

As an in-house SEO leader, I evaluated many options. At times I had the good fortune of a dedicated engineering team for SEO, so I was able to build custom tools to serve my team. But even then, it didn't make sense to try to build every possible tool to collect, process and manage the relevant data from which to draw actionable insights, and there's a maintenance cost to consider. So I've landed on the following tools, which offer a range of features and are purpose-built for enterprises managing large sites and/or multiple domains.

SEO Clarity - This is really an SEO analytics platform with many tools. If you have a team, it can be powerful: the ability to create and assign tasks makes it relatively easy to move from analysis and insight into operational mode. My primary uses are topical research, competitive research, monitoring keyword / topic groups (and the ranking domains and URLs within those groups), monitoring page-type performance, and individual page and keyword analysis.

One of the features that makes SEO Clarity powerful is its ability to integrate with your web analytics, Google Search Console and Majestic accounts. This increases efficiency by joining data that you would otherwise have to collect and combine yourself.

Botify - Where SEO Clarity covers a lot of ground, Botify is focused on crawling your website and making it easier (and faster) to uncover technical or structural issues that may be keeping a site from achieving optimal SEO performance. I've used Botify to crawl sites with millions of pages. Since it was built by people with an SEO background, many of the common problems you'd be trying to identify are already broken out into their own reports with basic visualizations.

With powerful filtering capabilities, it's very easy to export examples to share with developers or product managers who may be implementing site changes that result from such a crawl. Like SEO Clarity, Botify will integrate with your Google Analytics account so that you can better understand a site in relation to the traffic it's getting, how efficient the site is and so on. 

The super awesome feature is the ability to import and process your web log files so that you can compare Botify's crawl with how a search engine is crawling the site. While there are other ways to get at this data, if you are not that technical or don't have an engineer to help, this is the easiest way to access some of the most important data in SEO. Again, Botify has default reports around crawl frequency and distribution that are made more powerful if you create URL segmentations for the different parts of your site. Oh, and the UI is incredibly fast considering the massive amount of data that can be queried.
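To make the log-file idea concrete, here is a minimal sketch in Python of the underlying analysis: tallying search engine bot requests per URL from a standard access log so you can compare them against a crawl. It assumes a combined-format log at a placeholder path and matches Googlebot by user agent only (a production version would verify hits via reverse DNS); it's an illustration of the technique, not what Botify actually runs.

```python
import re
from collections import Counter

# Matches request path, status and user agent in a combined-format log line.
LOG_LINE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) '
    r'.*"(?P<agent>[^"]*)"$'
)

def googlebot_hits(log_path):
    """Count requests per URL path where the user agent claims to be Googlebot."""
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = LOG_LINE.search(line)
            if m and "Googlebot" in m.group("agent"):
                hits[m.group("path")] += 1
    return hits

if __name__ == "__main__":
    # "access.log" is a placeholder; point this at your server's log file.
    for path, count in googlebot_hits("access.log").most_common(20):
        print(f"{count:6d}  {path}")
```

Joining these counts against a crawl export quickly surfaces the mismatches that matter: pages Googlebot hits constantly that you consider low value, and important pages it rarely visits.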

Screaming Frog is an alternative crawler I use on occasion. It also makes a lot of good data readily available, but it runs from your own machine (vs. Botify's cloud-based system).

Majestic - I mentioned this can be integrated with SEO Clarity, but even without that integration, Majestic is useful as a stand-alone tool. No link index covers the entire web, but Majestic does a good job. The ability to easily compare domains and their link profiles, and to find gaps in your own link profile, will continue to be valuable as long as links are a core part of search engines' algorithms.
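Conceptually, link gap analysis is just set arithmetic: referring domains that point to a competitor but not to you. Here's a toy sketch assuming you've exported one-column CSVs of referring domains (the file names are placeholders):

```python
import csv

def referring_domains(path):
    """Load a one-column CSV export of referring domains (header row assumed)."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        next(reader, None)  # skip the header row
        return {row[0].strip().lower() for row in reader if row and row[0].strip()}

ours = referring_domains("our-referring-domains.csv")
theirs = referring_domains("competitor-referring-domains.csv")

# The "gap": domains linking to the competitor but not to us.
for domain in sorted(theirs - ours):
    print(domain)
```

Majestic layers metrics like Trust Flow on top, which is the part that's hard to replicate yourself, but the core comparison really is that simple.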

While those are all paid tools (though each has some free functionality), the two free tools I find indispensable are Google Search Console and Google Analytics. Every site needs web analytics, and Google Analytics is free and easy to set up, so there's no need to explain it here. Google Search Console, however, is often overlooked, and even when people have set it up for a website, the data is often underutilized.

First and foremost, it gives you the Google organic search keywords and pages that get impressions and clicks, automatically mapped and segmentable by device type, date, search platform and country. This is powerful stuff, but you need to capture it. Fortunately there is an API, and while it takes a little effort to get set up and running, it's incredibly valuable to capture this data at the most granular level possible.
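To show how little code that capture requires, here's a minimal sketch using the google-api-python-client library against the Search Console API. The service-account file, property URL, dates and row limit are all placeholders; for full coverage you'd loop, incrementing startRow until no rows come back.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Assumes a service account that has been granted access to the property.
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)
service = build("webmasters", "v3", credentials=creds)

body = {
    "startDate": "2017-06-01",
    "endDate": "2017-06-30",
    "dimensions": ["date", "query", "page", "device", "country"],
    "rowLimit": 5000,   # paginate with startRow for complete data
    "startRow": 0,
}
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/", body=body).execute()

for row in response.get("rows", []):
    # row["keys"] is ordered to match the dimensions list above
    print(row["keys"], row["clicks"], row["impressions"], row["position"])
```

Stored daily at this granularity, the data outlives the limited history and row caps of the web interface, which is exactly why capturing it matters.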

You can find plenty of Google Search Console support and documentation for a deeper understanding, but here are the other key features I continuously find valuable: crawl trend, crawl errors, internal and external links, and the ability to fetch and render particular URLs. More on those features another time.

Other handy things in the toolkit: 

  • the HttpFox extension for Firefox to check headers and response codes for individual web pages (a scripted equivalent is sketched after this list)
  • the Web Developer extension for Firefox or Chrome for things such as viewing pages with JavaScript and/or CSS disabled, easily viewing a document outline or the links and images on a page, or changing your user agent to mimic a search engine bot
  • Chrome Dev Tools as another option for technical inspection
  • WebPageTest - a fantastic tool created by Patrick Meenan (now at Google) for getting into the nitty-gritty of web page performance
  • TextWrangler - a text manipulation tool that's been handy for transforming and processing text, de-duplication, etc.
  • Microsoft Excel - of course, it's indispensable
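For the header and response-code checks in particular, a few lines of Python can do what HttpFox does, and in bulk. A minimal sketch using the requests library (the URL and the Googlebot user-agent string are just examples):

```python
import requests

BOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def check_url(url, user_agent=BOT_UA):
    """Fetch a URL without following redirects and print the status code
    plus the response headers most relevant to SEO."""
    resp = requests.get(url, headers={"User-Agent": user_agent},
                        allow_redirects=False, timeout=10)
    print(url, "->", resp.status_code)
    for header in ("Location", "Content-Type", "X-Robots-Tag",
                   "Cache-Control", "Link"):
        if header in resp.headers:
            print(f"  {header}: {resp.headers[header]}")
    return resp

if __name__ == "__main__":
    check_url("https://www.example.com/")  # placeholder URL
```

Run it over a list of URLs and you can verify redirect chains, canonical Link headers and X-Robots-Tag directives without clicking through each page by hand.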

There are other things I use from time to time, but this is my go-to toolkit. I'm happy to explain in more detail how to get value from these tools. Just get in touch.
