Jacqueline Urick, SEO Manager at online retailer Sears PartsDirect, was able to reverse a 15% YoY decline in organic traffic and a $2,000,000 loss in revenue thanks to historical DeepCrawl reports, which helped her identify a critical problem in the website’s configuration.

The PartsDirect site was halfway through a gradual migration when a problem with the site’s setup started causing users to be directed away from product pages in the search results. The logs from an automated crawl set up months before held the key to finding and fixing the issue.

“The key lesson is to always set up multiple crawls. One weekly, one post-release, one during maintenance. I couldn’t have identified this issue without access to DeepCrawl’s historical data.”

— Jacqueline Urick

The Background

Sears PartsDirect is an online appliance parts retailer with over eight million items and a complex legacy site configuration.

The PartsDirect site had also suffered several malicious denial-of-service attacks, which had led to the implementation of safeguards to ensure site stability. One of these was a load balancer, which directed search engines to a bot environment separate from the main site.
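To picture the kind of setup this describes, the sketch below shows a minimal user-agent routing rule of the sort a load balancer might apply. It is an illustration only: the hostnames, bot pattern and choose_backend helper are assumptions made for the example, not details of the actual Sears PartsDirect configuration.

```python
import re

# Hypothetical names for the two environments described above.
MAIN_POOL = "https://www.example.com"   # serves regular visitors
BOT_POOL = "https://bots.example.com"   # separate environment for crawlers

# Illustrative crawler pattern; a real rule would be maintained more carefully.
BOT_PATTERN = re.compile(r"googlebot|bingbot|slurp", re.IGNORECASE)

def choose_backend(user_agent: str) -> str:
    """Route known search engine crawlers to the dedicated bot environment."""
    if BOT_PATTERN.search(user_agent or ""):
        return BOT_POOL
    return MAIN_POOL

# Googlebot lands on the bot pool; everyone else sees the main site.
assert choose_backend("Mozilla/5.0 (compatible; Googlebot/2.1)") == BOT_POOL
assert choose_backend("Mozilla/5.0 (Windows NT 10.0; Win64; x64)") == MAIN_POOL
```

The important consequence of any rule like this, as the rest of the story shows, is that search engines and users can end up seeing two different versions of the site.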

Jacqueline decided to set up some regular crawls for monitoring purposes.

One of these was scheduled to run on Saturday nights, when server maintenance took place.

Keyword Drift

The first indications of the problem came from an unsettling amount of keyword drift. PartsDirect had previously ranked number one consistently for the affected search terms, but now those terms were ranking on unexpected internal pages, or had been deranked completely. That these terms were surfacing unusual pages also made for a poor user experience.

Jacqueline and her team gave it a couple of weeks, assuming that it was something to do with the new release, or even seasonal flux. However, the trend did not reverse – it worsened. The following diagram shows the decline in average rank and position for the affected terms. The green ring indicates when the fix was made:

Then Jacqueline started to see search terms bringing up the site’s “Out of Service” page – the splash page that users saw when the site was under maintenance. Seeing the OoS page ranking in the SERPs, she realized that this could only happen if there were serious problems with how bots were crawling the site during maintenance.

The Diagnosis

The problem lay in a misconfiguration of one of the server environments.

Jacqueline identified the root cause when she consulted the recurring crawl that had been running during maintenance. Something caught her eye while checking the reports – large numbers of 302 redirects.

The other scheduled crawl, which did not run during maintenance, returned results that looked completely normal. Curious, she decided to investigate further.

It turned out that the scheduled crawl happened to be hitting the same server stack as Google, and was therefore being redirected as though it were a bot. Because the “Out of Service” page was resolving with a 0 status (no valid HTTP response), when Google and other search engine bots crawled during this period it appeared as though the pages were broken or moved, not temporarily unavailable. Only a crawl scheduled to run during a maintenance window would have detected this issue.
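By way of contrast, a maintenance page that search engines treat as temporary should answer with a 503 status and a Retry-After header, rather than 302-redirecting bots to a splash page that fails to respond. The sketch below is a minimal example of that behaviour using only Python’s standard library; the page content and port are assumptions for the illustration, not PartsDirect’s implementation.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical splash page content for the example.
MAINTENANCE_PAGE = b"<html><body><h1>Back soon: scheduled maintenance</h1></body></html>"

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 503 tells crawlers the page is temporarily unavailable and should be
        # retried later, preserving existing rankings; a 302 to a broken
        # endpoint reads as "this page has moved and is now failing".
        self.send_response(503)
        self.send_header("Retry-After", "3600")  # suggest re-crawling in an hour
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(MAINTENANCE_PAGE)))
        self.end_headers()
        self.wfile.write(MAINTENANCE_PAGE)

if __name__ == "__main__":
    # Serve every request with the maintenance response while the window lasts.
    HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```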

In the highlighted portion of this diagram, the bot environment was completely broken, meaning that the crawls came back almost immediately.

“95% of the time, everything was running as expected. It was only during the <5% window of time that everything was going wrong. That’s all the time needed for millions of dollars to disappear – that’s how fickle Google can be. The bots understood that our site was relevant, but it appeared essentially broken.”

— Jacqueline Urick

The Recovery

After Jacqueline identified what was happening with the Sears site, she acted quickly to reverse the slide in rankings. Eliminating the bot environment entirely meant that users and search engines now saw the same pages, with no redirection to an alternate environment.

“I’m a big advocate of DeepCrawl. I often find myself explaining to people how the tool can help do a given job faster and more accurately. We’ve set up way more regularly scheduled crawls at various times of the week, to ensure that we’re constantly monitoring the site for potential problems. We find that a post-release crawl, especially, is essential.”

— Jacqueline Urick

Jacqueline’s knowledge of SEO and her access to tools made it possible to turn the situation around. Sears PartsDirect had been looking at a potentially catastrophic loss of organic traffic, and a resulting plunge of $2 million in revenue, if not more. Now, with regularly scheduled crawls and constant monitoring, PartsDirect feels secure about its website architecture.

Building Bridges

The effects of this solution were far-reaching. Jacqueline’s role meant that she often had to bridge the gap between the content, marketing and development teams. This victory permanently altered how her role was understood and appreciated; she found herself invited to internal meetings and asked for advice on how best to implement SEO best practices.

“This process helped the development team learn why it is that marketers do what they do. It can be tricky, but it’s really important all sides understand each other’s perspective. Now, even our paid team uses DeepCrawl to test their campaign URLs.”

— Jacqueline Urick