ROAST is an online performance agency. They specialise in search optimisation, content creation & distribution, paid social, mobile, display, affiliates, data, analytics, social listening and market intelligence.
ROAST’s Strategy Director, Kieran Bass, has experience dealing with the challenges presented by extremely large websites. Prior to joining ROAST, Kieran worked agency-side with many e-commerce clients, whose sites often had over 100,000 URLs.
At retailer ASOS, Kieran led the international SEO campaigns. There, the challenge was even greater – understanding and optimizing a site which ran to millions of URLs and spanned multiple territories.
Kieran remembers having to perform site analyses manually, which became more and more challenging over time.
“Before DeepCrawl, I once analysed 30,000 links manually for a small insurance client over the holidays. I can honestly say I did not want to look at another link again in my life.”
Not only was manual analysis extremely time consuming, it didn’t give Kieran the holistic understanding he needed to create effective strategies.
As sites continued to grow in both size and complexity, Kieran could see that he needed to establish an accurate picture of a new site’s technical architecture without laborious manual analysis. This meant seeing not just how many pages a given site had, but how they linked together.
With the help of DeepCrawl, Kieran found that he was able to get a complete view of a site rather than just a snapshot, and that he could do so in a fraction of the time it took him before. His process is outlined below:
First, conduct a technical audit to get a complete overview of the structure, content and internal linking.
Then, run an initial crawl to gain an understanding of the site before breaking it down into digestible chunks.
Finally, schedule automatic crawls to start building up a history of the site and track ongoing performance.
Some recommendations may be obvious enough to make immediately, but when auditing a site, Kieran and the ROAST team usually combine DeepCrawl data with other data sources such as GetSTAT.
This method provides an unparalleled understanding of even the largest websites. For larger sites, a partial crawl is often more appropriate. Kieran explains:
“You have to approach each site on a case-by-case basis and crawl the entire site if that’s manageable, or crawl a portion of the site if that feels more appropriate.”
Kieran also found that he could use DeepCrawl for competitor analysis. By running a competitor matrix report, and uploading backlink data from Ahrefs or Majestic into DeepCrawl, he could see where competitors were focusing in the market, and adapt his strategy accordingly.
“It gives you an insight into their business strategy and goals.”
The backlink data also helped Kieran to identify opportunities. For example, if several competitors’ sites linked to a popular news site or trusted key influencer, and those links were absent from his client’s site, Kieran could make the case for outreach to that source.
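The gap analysis described above can be sketched in a few lines of Python. The CSV filenames and the `referring_domain` column below are assumptions based on typical Ahrefs/Majestic exports, not a format documented in this case study:

```python
import csv

def referring_domains(path):
    """Read a backlink export CSV and return the set of referring domains."""
    with open(path, newline="") as f:
        return {row["referring_domain"] for row in csv.DictReader(f)}

def link_gaps(client_file, competitor_files):
    """Domains that link to two or more competitors but not to the client.

    Domains shared by several competitors are the strongest outreach targets.
    """
    client = referring_domains(client_file)
    counts = {}
    for path in competitor_files:
        for domain in referring_domains(path) - client:
            counts[domain] = counts.get(domain, 0) + 1
    return sorted(d for d, n in counts.items() if n >= 2)
```

The same set-difference idea works whatever the export tool, as long as each file can be reduced to a set of referring domains.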
In the case of ASOS and many other sites, the challenge lay in understanding and optimizing a site which ran to millions of URLs and spanned multiple territories. Kieran relied on a Universal Crawl, which analyses a website, XML sitemaps and organic landing pages in one hit.
He then refined the settings based on his analysis, and broke the crawl into sections using positive restrictions.
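Positive restrictions work as an include-only filter: a URL stays in the crawl queue only if it matches one of the allowed patterns, so each run covers a single section of a very large site. A minimal sketch, with invented example patterns rather than settings taken from this case study:

```python
import re

# Hypothetical include patterns restricting a crawl to two site sections.
INCLUDE_PATTERNS = [
    re.compile(r"^https://www\.example\.com/women/"),
    re.compile(r"^https://www\.example\.com/sale/"),
]

def should_crawl(url):
    """Keep a URL only if it matches at least one include pattern."""
    return any(p.match(url) for p in INCLUDE_PATTERNS)
```

Crawling section by section keeps each crawl small enough to analyse, while the scheduled crawls build up the full picture over time.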
Next, he looked at the site explorer tab to see crawl paths and linking.
This helped Kieran and his team to identify a host of issues including duplication, inconsistent journey paths, incorrect redirects, errors in the navigation or links pointing to the wrong pages.
Kieran’s next goal was to help drive more organic traffic to his client’s site. He already had access to a great deal of data through keyword research. Kieran explains:
“We then conducted thorough keyword research, because in this way you identify user behaviour. There were approximately 2.5m keywords per territory. We realised people were searching for content but not finding it.”
Kieran saw that he could use DeepCrawl’s custom extraction tools to identify potential opportunities, so he used a regex to pull keywords from a competitor site’s metadata and social media tags. This provided him with a list of keywords the competitor was currently optimizing for.
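The extraction step can be illustrated with a regex of the kind Kieran describes. The tag choices (meta keywords, meta description, Open Graph title) and the HTML shape are assumptions for the sketch, not the actual expression used at ROAST:

```python
import re

# Match <meta name="..."> or <meta property="..."> tags whose name is one
# of the keyword-bearing fields, and capture the content attribute.
META = re.compile(
    r'<meta\s+(?:name|property)=["\'](keywords|description|og:title)["\']\s+'
    r'content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

def extract_keywords(html):
    """Return a dict of extracted field -> content for one page's HTML."""
    return {name.lower(): content for name, content in META.findall(html)}
```

Run per crawled page, an extraction like this yields a keyword list per competitor that can be diffed against the client’s own targeting.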
By combining ROAST’s keyword data with the data scraped with DeepCrawl, he now had access to powerful insights about gaps in his client’s keyword strategy.
Kieran used these insights to change the marketing direction and pave the way for significant international growth, expanding the site with content to match keywords. He was also able to simplify the user journey by tweaking the navigation and internal linking system.
“Using DeepCrawl means I can spend more time making recommendations and changes rather than trawling through the data. If you combine my hours, the team’s, and all the tasks involved, you’re looking at adding an extra person to the team on a full-time basis to cover off what DeepCrawl does for us. With all this time saved on the data, we’re able to spend more time on solving problems and driving performance for our clients.”
Kieran has now firmly established DeepCrawl at ROAST, and the team uses it as one of their core data sets for both initial strategic work and ongoing monitoring and optimization. Kieran’s competitor analysis and optimization strategies have consistently led to substantial increases in organic traffic and revenue.
Kieran Bass is a member of the DeepCrawl Customer Advisory Board.