How a Digital Marketing Agency Uses DeepCrawl to Increase Productivity and Improve Data Analysis

The agency is a leading digital marketing agency with an innovative approach to marketing businesses online. With an award-winning blog and a team of driven, forward-thinking consultants, it offers a successful combination of agency experience and an in-house level of client business knowledge. The team works on a mix of digital marketing projects including search engine optimization, paid media, creative content strategy and creation, digital PR and training.

Charlie Williams is the Head of Marketing, and editor, at the agency. He has worked there since 2013 and has helped a broad range of clients – from enterprise to start-ups – run search engine optimization and content campaigns.

“My day usually consists of developing SEO or content strategies to give our clients websites that answer real user needs and that search engines love.”

Initial Impressions

The agency is fairly new to DeepCrawl. Over the past five months, the team has begun to integrate it both into the sales process to win new business and into technical SEO site audits for existing clients.

Charlie and his team find DeepCrawl’s ability to quickly and effectively provide a complete site overview invaluable.

“The way we find it really useful is in reporting – doing an audit at the beginning of a project to spot big issues. What we like is that the data is nicely segmented. We can quickly find, and tell clients, about specific problems that we can try and improve.”

For agencies like this one, DeepCrawl offers a uniquely collaborative approach to SEO, as the tool can be used by multiple team members across a variety of client projects. DeepCrawl’s comprehensive SEO audit capabilities and reporting eliminate the need to constantly seek out and learn new tools specific to each project’s individual needs.

The Assessment

When assessing a potential client’s site, Charlie and the team begin by running a crawl to gain a full understanding of the site’s architecture and makeup. This allows them to clearly present issues to the client, gauge the level of technical foundation work to be done, and predict a timeline for the project.

“That’s really key for us – the ability for us to really quickly give a great overview, dig in a little bit, and sort of predict, and give, a quote for how much technical work is required. {…} You can build lots of great content, but if Google can’t see it there’s no point.”

The Process

With DeepCrawl, the team is able to prioritize major issues. Typically, they use the details of the crawl report to examine the site’s indexable pages vs. pages crawled, crawl depth, canonicalization issues, robots.txt rules, and XML sitemap problems. DeepCrawl’s ranking system, “DeepRank”, assists in this process, as high-priority tasks are displayed first.

“The great graph {that displays} the number of pages by crawl depth. That’s a really important report for us, as we can actually say, ‘Is your site structure functioning as it should be – is the flow of information both for search engines and users logical?’”
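To picture what a pages-by-crawl-depth report measures, here is a minimal Python sketch that walks an internal-link graph breadth-first and counts pages at each depth. The `site` data and function name are invented for illustration; this is not DeepCrawl's data model or implementation.

```python
from collections import deque

def pages_by_crawl_depth(links, start="/"):
    """BFS over an internal-link graph, counting pages at each depth.

    `links` maps each URL to the URLs it links to; `start` is the
    homepage. Both are illustrative inputs, not a real crawl export.
    """
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:  # first time this page is reached
                depth[target] = depth[page] + 1
                queue.append(target)
    counts = {}
    for d in depth.values():
        counts[d] = counts.get(d, 0) + 1
    return counts  # {depth: number of pages at that depth}

site = {
    "/": ["/blog", "/services"],
    "/blog": ["/blog/post-1", "/blog/post-2"],
    "/services": ["/services/seo"],
}
print(pages_by_crawl_depth(site))  # {0: 1, 1: 2, 2: 3}
```

A healthy structure shows most pages within a few clicks of the homepage; a long tail of deep pages is the kind of problem the quote above describes.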


Next, duplicate content reports are examined. The team dives into unique pages, duplicate pages, and canonicalized pages. The goal is to determine if the right content is being crawled, and assess and eliminate problem pages immediately.
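The duplicate-content step can be pictured as grouping URLs whose normalized text is identical. Below is a minimal sketch, assuming the page text has already been extracted; the hashing approach and sample data are illustrative, not DeepCrawl's actual algorithm.

```python
import hashlib

def find_duplicates(pages):
    """Group URLs whose normalized body text hashes identically.

    `pages` maps URL -> extracted text; a real audit would crawl and
    extract this text first. Whitespace and case are normalized so
    trivially different copies still match.
    """
    by_hash = {}
    for url, text in pages.items():
        normalized = " ".join(text.split()).lower()
        key = hashlib.sha256(normalized.encode()).hexdigest()
        by_hash.setdefault(key, []).append(url)
    # Only groups with more than one URL are duplicates
    return [urls for urls in by_hash.values() if len(urls) > 1]

pages = {
    "/shoes": "Red running shoes, size 10.",
    "/shoes?ref=email": "Red  running shoes, size 10.",
    "/hats": "Blue wool hat.",
}
print(find_duplicates(pages))  # [['/shoes', '/shoes?ref=email']]
```

Each duplicate group is then a candidate for consolidation, e.g. pointing the parameterized URL at the clean one with a rel=canonical tag.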

Finally, quick-to-fix issues are examined – like server response codes and metadata – to give an overview of all the improvements that can be made.
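A rough sketch of what such a quick-wins pass looks for, assuming crawl data is available as simple (url, status code, title, meta description) rows; the field layout and sample data are assumptions for illustration, not any tool's export format.

```python
def quick_fix_issues(rows):
    """Flag easy wins: bad status codes and missing/duplicate metadata.

    `rows` is a list of (url, status_code, title, meta_description)
    tuples. Returns (url, problem) pairs for reporting.
    """
    issues = []
    titles_seen = {}
    for url, status, title, description in rows:
        if status >= 400:
            issues.append((url, f"broken ({status})"))
        elif 300 <= status < 400:
            issues.append((url, f"redirect ({status})"))
        if not title:
            issues.append((url, "missing <title>"))
        elif title in titles_seen:
            issues.append((url, f"duplicate <title> of {titles_seen[title]}"))
        else:
            titles_seen[title] = url
        if not description:
            issues.append((url, "missing meta description"))
    return issues

rows = [
    ("/", 200, "Home", "Welcome to the site."),
    ("/old-page", 301, "Home", ""),
    ("/missing", 404, "", ""),
]
for url, problem in quick_fix_issues(rows):
    print(url, "->", problem)
```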

Before DeepCrawl

Before DeepCrawl was introduced at the agency, the team relied on a variety of tools, and combinations of tools, working together.

“The advantage of DeepCrawl is speed and power. Speed: you can run it while you’re doing other work, but you’re also able to share results immediately with everybody. Anyone who needs to see it, you can send them a link, {…} and they’ll see the same reports you’re seeing.”

Previously, the team had been burdened by desktop SEO crawlers running on their own computers’ processors, making it difficult to do other work. Sharing was an issue too: data had to be exported to Excel and sent via email – unorganized and cumbersome.

For large site audits, the team found DeepCrawl’s cloud-based tool ideal, as it completely freed their computers for other tasks. They were also able to crawl multiple sites at once and, with historical data, compare crawls over time. Eliminating the tedious manual work required by other tools to share and view reports freed up more time to analyze and interpret the data.

“We used to make a crawl depth graph manually. Now we don’t have to, because the graph is already there. If you want to share it, or do a screen grab and put it into a report, you can, but the point is that the data’s already there for you, and the visual impact is BANG.”

Impact of DeepCrawl

The agency has integrated DeepCrawl into its SEO audit routine – ensuring each team member has a comprehensive view of each client site’s architecture and the major issues affecting it.

One of the most impactful features is the ability to review historical crawls and track changes over time.

“If you set up a crawl regularly (once a week or once a month), you have a crawl, go away and make some changes, and then you can see the next month the number going up or down. So you can see if what you’ve changed has actually had an effect or not. {…} It’s lovely to be able to see the comparative results.”
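Comparing crawls over time boils down to diffing snapshots of the crawled URL set. A minimal sketch, with invented snapshot data – not DeepCrawl's export schema:

```python
def compare_crawls(previous, current):
    """Diff two crawl snapshots (sets of indexable URLs).

    Returns which URLs appeared, which disappeared, and how many
    stayed, so month-on-month changes are easy to see.
    """
    return {
        "added": sorted(current - previous),
        "removed": sorted(previous - current),
        "unchanged": len(previous & current),
    }

last_month = {"/", "/blog", "/old-landing"}
this_month = {"/", "/blog", "/new-landing"}
print(compare_crawls(last_month, this_month))
# {'added': ['/new-landing'], 'removed': ['/old-landing'], 'unchanged': 2}
```

Run against real snapshots, a shrinking "removed" list after a fix is exactly the "number going up or down" effect the quote describes.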

Today, the team spends less time manually extracting important data and compiling reports, and more time finding solutions using the clearly laid-out results in the DeepCrawl report.

“What DeepCrawl gives us is the ability to spend more time addressing the recommendations and improvements our clients want us to make, and less time having to manually sort through the data.”