Bartosz Goralewicz is CEO of Elephate, a Polish digital agency specializing in on-page optimization. When he’s not flying to the US for events and client visits, he’s working on fixing SEO problems – or updating his blog, at goralewicz.co.
Bartosz was approached by the dacodoc.fr team to help out with both their domain’s on-page and off-page SEO. The website had been worked on by several individuals and teams over the years and had been neither consistently maintained nor regularly audited. Bartosz suspected that it would need significant improvements to make it Google friendly, and that its internal problems were leading to indexation errors, which were then picked up by Google’s bots.
First, Bartosz decided that it would be useful to get a general impression of the domain, so that he could put together a clear list of goals for him and his team.
He began with a test crawl of 100,000 URLs, and got this result:
At first glance, the site structure appeared to be excellent, with a good ratio of unique, indexable pages. However, the Web Crawl Depth graph revealed that only three levels of the domain had actually been crawled.
Bartosz could see that something was not right, and immediately set up a full site crawl to get a complete picture. This time, the result was quite different:
DeepCrawl’s Web Crawl Depth graph showed another perspective on the data:
Hidden behind the apparent good results of the first surface crawl were significant problems. There were clearly indexation issues hindering Googlebot from crawling the site, which Bartosz suspected were a result of the site being constantly updated and modified. This poor indexation meant that Google’s bots were spending their crawl budget inefficiently, and this could potentially have a negative impact on the site’s ranking.
Bartosz was curious as to how exactly Googlebot was seeing the site’s pages, and ran a Fetch as Google to check. This is what he saw:
This was hardly an ideal image of his client’s site. It was clear to Bartosz and his team that the site required massive changes to its domain and internal structure. Armed with an understanding of the way the site was put together, they set out their goals as follows:
- Implement a new and improved internal navigation;
- Encourage a better user flow;
- Improve crawlability so as to make better use of their “crawl budget”;
- Reduce the index bloat;
- Fix the technical SEO issues.
Bartosz describes an example of a problem his team encountered (and successfully fixed):
“With every large structure, there is always a risk of small issues scaling up quickly. To give you an idea, this is one of the crawls for dacodoc.fr, where we found an issue with rel-canonicals. The pace of the process and changes was so fast, the URL structure was changed but the rel-canonicals were forgotten.”
The issue lay with canonical tags still pointing at old URLs after the URL structure had changed. The result was a redirect/canonical loop:
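To make the mechanism concrete, here is a minimal sketch (not Elephate’s actual tooling; the URLs are hypothetical) of how such a loop arises and how it can be detected from crawl data. The old URL 301-redirects to the new one, while the new page’s rel-canonical still points back at the old URL, so a bot following both signals goes in circles:

```python
# Each URL maps to where it sends a crawler next: a redirect target,
# or (for pages that resolve with a 200) the URL in its rel-canonical tag.
# Hypothetical example data illustrating the dacodoc.fr-style problem.
next_hop = {
    "https://example.com/old-page": "https://example.com/new-page",  # 301 redirect
    "https://example.com/new-page": "https://example.com/old-page",  # stale canonical
}

def find_loop(start, next_hop):
    """Follow redirect/canonical hops from `start`.

    Returns the list of URLs forming a loop if one is found,
    or None if the chain terminates normally.
    """
    seen = []
    url = start
    while url in next_hop:
        if url in seen:
            # Loop detected: return the cycle, closing it on the repeated URL.
            return seen[seen.index(url):] + [url]
        seen.append(url)
        url = next_hop[url]
    return None

loop = find_loop("https://example.com/old-page", next_hop)
print(loop)
```

Running a check like this over every crawled URL is one way to surface stale canonicals before search engines start dropping the affected pages.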
In theory, this issue could have led to deindexation of all the pages affected by it. However, Bartosz and his team were able to identify the cause quickly thanks to their regularly scheduled crawls.
“During a project like this, we always go with a full website crawl every week or two. In this case, the pace of change was really crazy and we crawled as often as possible.”
Bartosz and his team had been working on the dacodoc.fr site for three months, while part of his team was also busy working on off-page SEO and tackling issues caused by the Penguin update. In this period, and with limited resources, the Elephate team accomplished the following:
- Changed the entire navigation within the domain;
- Fixed all the issues causing index bloat;
- Created new landing pages and implemented a new layout to better support the user flow;
- Fixed the pagination and canonical tag related issues;
- Made many other minor fixes to the site’s internal structure.
A comparison of the crawl results at the start and end of the project shows the transformation of the website:
Google Analytics shows the remarkable rise in organic user sessions after Elephate’s work. They had gone from a 30% decline to a 65% increase.
Beyond the site visualizations, Bartosz was pleased to see that Google was now crawling the domain ten times more often than previously, with a continued uptrend. This was clear proof that the site had been successfully transformed.
While there were a few small issues still to fix, the Elephate team had accomplished everything they had set out to do, and more.
“We were extremely happy with the pace, effects and changes we accomplished - it wouldn’t have been possible without DeepCrawl.”