Easing Our Workload
DeepCrawl massively helps with our workload. Having all the data ready in one place, rather than having to create multiple Excel sheets for each issue, is a huge time-saver.
The task manager feature in DeepCrawl 2.0 is also great: it allows you to create tasks for each issue based on your priorities, so rather than facing one huge list of technical problems, you can break it down into easy-to-manage sections.
Being able to share these tasks with other stakeholders adds value to our offering: it provides a convenient, up-to-date flow of data between us until the issues are resolved.
Discovering Site Issues We Were Unaware of
There are plenty of DeepCrawl features that uncover issues which would take a long time to diagnose and act on manually, such as the Orphaned Pages report. Having this data available after a crawl helps identify which pages are getting traffic and where that traffic is coming from. This is useful across our teams, as it enables us to decide whether specific pages should be added to menus or navigation bars, or linked from category/service pages, to build on and maximize the traffic they already receive.
Being able to crawl both HTTP and HTTPS, and to section off a large website so that only a sample is crawled, has proven very efficient and useful.
Increasing Clients’ Site Visibility, Rankings and Sales
Using the task manager tool, we can quickly go through the list of issues and prioritize those we think will have the biggest and quickest impact, such as pages with short titles, or meta descriptions that are too long or missing, which helps a website’s CTR in the search results. This should ultimately help improve our clients’ website conversions.
We also combine data from DeepCrawl with other SEO software that provides performance metrics, building a larger picture of site and performance issues per page so we can prioritise actions such as content consolidation, improvements to optimisation, CRO improvements and so on.

Let’s face it, we live in a digital age where more and more site visits come from our phones. One of DeepCrawl’s new reports catering to this rising mobile tide is the aptly named Mobile report. Nested within it is data on the mobile configuration (or lack thereof) of every page crawled. Whether a page is set up as responsive, uses dynamic serving, or is missing mobile compatibility altogether will be flagged, so you can steer your focus towards optimising for mobile traffic.
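As a rough illustration of how a crawler might classify those mobile configurations, the sketch below checks a page for the usual signals of Google’s three standard setups: a viewport meta tag (responsive), a `Vary: User-Agent` header (dynamic serving), or a `rel="alternate"` link with a media query (separate mobile URLs). This is a simplified assumption of the kind of check involved, not DeepCrawl’s actual implementation; `classify_mobile_config` is a hypothetical name.

```python
# Hypothetical sketch of mobile-configuration detection, not DeepCrawl's API.
def classify_mobile_config(html: str, headers: dict) -> str:
    """Classify a page as 'responsive', 'dynamic', 'separate-urls' or 'none'."""
    html_lower = html.lower()
    # Responsive design is normally signalled by a viewport meta tag.
    if 'name="viewport"' in html_lower:
        return "responsive"
    # Dynamic serving varies the HTML by user agent and should
    # advertise that with a "Vary: User-Agent" response header.
    if "user-agent" in headers.get("Vary", "").lower():
        return "dynamic"
    # A separate mobile site is usually declared on the desktop page
    # with a rel="alternate" link carrying a max-width media query.
    if 'rel="alternate"' in html_lower and "max-width" in html_lower:
        return "separate-urls"
    # No recognisable mobile setup: flag for attention.
    return "none"
```

A report like DeepCrawl’s would run a check along these lines on every crawled URL and group pages by the result, so the "none" bucket becomes the to-do list for mobile optimisation.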
We use data from DeepCrawl in all our monthly reports to highlight issues that have been reduced or resolved, as well as to show where an issue is becoming more of a problem. It also makes it easy to walk a client or developer through the issues and get a schedule in place to rectify them.
The main section of DeepCrawl we use is the content section. This highlights all of the basic SEO issues on a website that, in most cases, can be fixed quickly and easily by clients themselves, and that deliver the quickest and most effective improvements to a client’s rankings and CTR in the SERPs.
DeepCrawl does a brilliant job of displaying data in an easy-to-digest format, and there is no need to spend valuable campaign time exporting and sorting data in Excel sheets, as with a lot of tools. This frees us up to spend more time speaking with our clients about a plan to resolve the issues and make their sites much more user friendly.
What Do You Think of DeepCrawl 2.0?