The Author
Berian Reed

Berian has 10+ years of agency and client-side experience, and specialises in the technical aspects of running successful online businesses. Berian is a regular speaker at global SEO events, and most recently presented at Brighton SEO.

The Agency

Pure Optimisation is a digital marketing agency specialising in content strategy, PPC, social media and, yup, you guessed it: SEO. We spoke with Managing Director Berian Reed to hear why his first taste of DeepCrawl wouldn’t be the last.

The Client

AutoTrader is the No. 1 trusted motoring marketplace for cars for sale in South Africa and has been in business for the past 23 years. Historically a print business, they have successfully migrated online and now receive over 4 million visits to their sites each month. They have a fully responsive mobile site as well as iOS and Android applications. Thousands of dealers and private sellers advertise their vehicles each month with this digital giant, which lists over 70,000 vehicles on its site at any given time.

Following a prolonged period of SEO stagnation and decline, AutoTrader instructed Pure Optimisation to perform an in-depth SEO audit. For a site with over 1 million indexable pages, DeepCrawl was the tool of choice for the on-site SEO workstream.

The Challenge

The first (deep)crawl of AutoTrader identified several critical issues impacting SEO, site architecture and crawl efficiency. Here’s how Pure used DeepCrawl to increase AutoTrader’s site visibility and reverse two years of steady decline.

Before

With over 100 levels of content, the whole site needed restructuring to enable better indexation, improve content and increase crawl budget efficiency.

As the market leader for 23+ years, AutoTrader needed to know what was behind the data in order to fix it. Here’s how Pure Optimisation used DeepCrawl to do just that.
For starters, Pure set off a (deep)crawl to run a full technical site audit and get a complete picture of AutoTrader’s website architecture.

The key areas of focus that DeepCrawl highlighted were as follows:

  • Site architecture:
    DeepCrawl found over 1.6 million pages in a subsequent crawl that was limited to 100 levels
  • Crawl efficiency:
    So many landing pages sitting 100+ clicks in showed just how deeply that content was buried in the website
  • Usability:
    Users could potentially be spending a lot of time navigating the site to find the pages they were looking for, and may be likely to bounce off-site
  • Internal SEO Authority flow:
    The 100+ levels of pages meant that SEO authority wasn’t being distributed to the pages that needed it

After

Fixing the issues

Site architecture and On-page SEO
AutoTrader’s re-architected website now has all content accessible within 12 clicks of the home page, with 99% of content reachable in 9 steps!

  • Improved content-to-HTML ratio
  • Page speed enhancements
  • Open Graph and Twitter Card markup
  • Improved internal linking

Running an integrated web and Google Analytics crawl to audit AutoTrader’s site performance yielded even more insights, and Pure instantly discovered some of the main challenges. The biggest value was understanding the true scale of the site, which had been a challenge using Search Console and other crawling tools alone. DeepCrawl found over 1.6 million pages in a subsequent crawl limited to 100 levels, showing just what position the site was in. It became clear that the site was architected in a way that used up crawl budget inefficiently rather than making the most of it. Having so many landing pages sitting 100+ clicks in showed just how deep into the site DeepCrawl had to go to find their content.

Users could potentially be spending a lot of time navigating the site to find the pages they were looking for, and may have been likely to bounce off-site. Moreover, bots may not have crawled the site as regularly. Just as pages with little or no active stock aren’t of much use to users, for bots these thin pages bring with them the fear of the Panda… These pages were identified using the content-to-HTML ratio feature in DeepCrawl, so that poor content pages could be fixed up and made to look sharp.
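
To make the content-to-HTML ratio idea concrete, here is a minimal, illustrative sketch of how such a metric can be computed for a single page. It is not DeepCrawl’s implementation, and the 10% threshold in the example is an arbitrary placeholder.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.parts.append(data)

def content_to_html_ratio(html: str) -> float:
    """Return visible text length divided by total HTML length (0.0 to 1.0)."""
    extractor = TextExtractor()
    extractor.feed(html)
    text = "".join(extractor.parts).strip()
    return len(text) / max(len(html), 1)

# Example: flag a page as thin if its ratio falls below an arbitrary 10% threshold.
sample = "<html><head><style>body{margin:0}</style></head><body><p>Only a little text.</p></body></html>"
ratio = content_to_html_ratio(sample)
print(f"ratio={ratio:.1%}", "thin" if ratio < 0.10 else "ok")
```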

The site not only needed the content on its pages enriched, but also needed pages redesigned to enhance other under-optimised features. For every URL crawled, DeepCrawl’s reports showed whether Open Graph and Twitter Card markup was set up and, if so, whether it was functioning correctly. This made optimising AutoTrader’s site for social media (for example) much, much easier. While redesigning the site’s digital infrastructure, Pure tested all the changes beforehand, crawling both the staging and live environments with DeepCrawl. This enabled the team to confidently say ‘we can press the Go button’ and sign off on going live as part of the release process.
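
As a rough illustration of the kind of per-URL check being described (not DeepCrawl’s own code), the sketch below fetches a page and reports which Open Graph and Twitter Card meta tags are missing; the required tag set and the example URL are placeholders.

```python
import urllib.request
from html.parser import HTMLParser

# Illustrative set of social tags to check for; adjust to the tags a site actually needs.
REQUIRED_TAGS = {"og:title", "og:description", "og:image", "twitter:card"}

class MetaCollector(HTMLParser):
    """Records the property/name attribute of every <meta> tag on the page."""
    def __init__(self):
        super().__init__()
        self.found = set()

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            key = attrs.get("property") or attrs.get("name")
            if key:
                self.found.add(key.lower())

def missing_social_tags(url: str) -> set:
    """Return the required social meta tags that the page does not declare."""
    with urllib.request.urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    collector = MetaCollector()
    collector.feed(html)
    return REQUIRED_TAGS - collector.found

# Example usage (placeholder URL):
# print(missing_social_tags("https://www.example.com/cars-for-sale"))
```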

Key Features & Reports

Task Manager

Using DeepCrawl’s internal task manager, Pure worked closely with AutoTrader’s development team. To quote Rand Fishkin, “what you measure is what you’re able to improve on”. By focusing on the Indexation and Content reports, Pure isolated how many pages were being indexed, as well as assessing the location and number of indexable, low-value/thin content pages.

Pure assigned sets of URLs to the development team to be canonicalised and to have their internal link structure optimised and updated. Applying noindex meta tags, and disallow rules in the robots.txt file, to combat the site’s over-indexation issue was easily fool-proofed before any changes went live, thanks to DeepCrawl’s safe ’n sound testing environment. Equally, canonical tags were added to point search engines at the equivalent (more authoritative) page, making sure link juice wasn’t diluted across the site unnecessarily and giving the right pages the best chance of ranking.
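
For illustration only, a pre-release check of this kind could be scripted along the following lines, assuming a plain list of staging URLs and expected directives; this is a hypothetical sketch, not part of DeepCrawl or Pure’s actual tooling.

```python
import urllib.request
from html.parser import HTMLParser

class HeadAuditor(HTMLParser):
    """Captures the robots meta directive and canonical href declared by a page."""
    def __init__(self):
        super().__init__()
        self.robots = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.robots = (attrs.get("content") or "").lower()
        elif tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

def audit(url: str) -> dict:
    """Return the robots directive and canonical URL a page declares."""
    with urllib.request.urlopen(url, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    auditor = HeadAuditor()
    auditor.feed(html)
    return {"url": url, "robots": auditor.robots, "canonical": auditor.canonical}

# Example: assert that a staged page carries noindex before signing off (placeholder URL).
# result = audit("https://staging.example.com/cars-for-sale?sort=price")
# assert result["robots"] and "noindex" in result["robots"], result
```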

Sets of pages that should have been treated as a group were easily identified using DeepCrawl’s Pagination report and assigned to the development team to be paginated (again using the task manager). Likewise, pages that needn’t have been indexed or crawled were assigned a noindex or disallow task. These tasks were tracked over time to monitor progress, and to check, for example, whether hundreds of thousands of previously indexed pages were now nested within the non-indexable pages report. Pages that shouldn’t have been landing pages at all were simply wiped out. Who says you can’t delete content to grow?
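
To illustrate what identifying "sets of pages that should be treated as a group" can look like in practice, here is a small, hypothetical heuristic that groups URLs differing only by a page query parameter; DeepCrawl’s Pagination report works on the crawl data itself and is far more thorough.

```python
from collections import defaultdict
from urllib.parse import urlparse, parse_qsl

def group_paginated(urls):
    """Group URLs that differ only by a 'page' query parameter (an illustrative heuristic)."""
    groups = defaultdict(list)
    for url in urls:
        parts = urlparse(url)
        other_params = tuple(sorted((k, v) for k, v in parse_qsl(parts.query) if k != "page"))
        key = (parts.netloc, parts.path, other_params)
        groups[key].append(url)
    # Keep only groups with more than one member, i.e. likely paginated sets.
    return {key: members for key, members in groups.items() if len(members) > 1}

# Example with placeholder URLs:
print(group_paginated([
    "https://www.example.com/cars-for-sale?make=bmw&page=1",
    "https://www.example.com/cars-for-sale?make=bmw&page=2",
    "https://www.example.com/cars-for-sale?make=audi",
]))
```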

Crawling by Levels

Pure continued to re-crawl the site – still limiting by levels – to track how many levels deep AutoTrader’s landing pages sat. DeepCrawl made it easy to spot landing pages at high levels (up to 100 levels deep), thin and potentially removable content, and opportunities to re-architect the site itself.
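
As a minimal sketch of the "crawl by levels" idea, the hypothetical breadth-first crawler below records how many clicks from the start URL each internal page sits and stops at a chosen depth; DeepCrawl’s own crawler is, of course, far more capable.

```python
import urllib.request
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl_by_levels(start_url: str, max_level: int = 3) -> dict:
    """Breadth-first crawl; returns {url: level}, where level = clicks from start_url."""
    levels = {start_url: 0}
    queue = deque([start_url])
    domain = urlparse(start_url).netloc
    while queue:
        url = queue.popleft()
        if levels[url] >= max_level:
            continue  # do not expand pages already at the depth limit
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to fetch
        extractor = LinkExtractor()
        extractor.feed(html)
        for href in extractor.links:
            link = urljoin(url, href)
            if urlparse(link).netloc == domain and link not in levels:
                levels[link] = levels[url] + 1
                queue.append(link)
    return levels

# Example: pages sitting at the depth limit or beyond are candidates for restructuring.
# deep_pages = [u for u, lvl in crawl_by_levels("https://www.example.com/", 3).items() if lvl >= 3]
```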

The Results

Pure optimised the right pages for Google to crawl, in the right areas of the site (e.g. within a few clicks of the homepage), and got the site down to 16 levels. This helped AutoTrader maintain its position as the clear market leader for new and used car-related searches in South Africa!

The results of the on-site optimisation are clear:

  • 75% reduction in redundant pages
  • Improved crawl efficiency
  • 3 months of sustained SEO visibility growth

“Reducing a prolonged downward trend in such a short space of time is very encouraging to see. For a website as large as AutoTrader’s we were looking for a scalable, cloud based crawling tool to give us confidence that changes we make will deliver results. DeepCrawl is becoming an important tool in our SEO arsenal.”

Berian Reed, MD, Pure Optimisation

Click here to see how DeepCrawl could become an essential part of your growth story.
