We’ve been busy over the last few months working on another feature-packed release, with a mix of new features, interface improvements and bug fixes.
Please note the new Pagination feature in particular: it might change your Unique Pages count, but we think it’s a major improvement to how your site architecture is measured, so it’s worth understanding why.
Simplified Crawl Controls
DeepCrawl already includes powerful settings that let you limit the depth or number of pages in your crawl, but they can be hard to understand for first-time users.
We wanted to make the tool more accessible for everyone but retain the level of control our users appreciate.
From now on, when you create a new project you will be offered fewer options. Some settings, including the user agent and email alerts, have moved to Advanced Settings, and several new options that directly affect the scope of your crawl have been added.
If you try to add a domain that redirects, or doesn’t respond, you will be warned and given the opportunity to change it.
After creating your new project, you will also notice some changes to the Crawl Control screen.
You can now choose a max crawl size from a list of defaults. The crawl will automatically finalise at the last complete level before the limit is reached.
If you choose ‘Custom’, you can use the existing max pages and max levels restriction settings as normal.
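To illustrate what “finalise at the last complete level” means, here is a minimal sketch of a level-by-level crawl that stops before an incomplete level would push it past the page limit. The `get_links` function and the page limit are illustrative assumptions, not DeepCrawl’s actual implementation.

```python
def crawl_by_level(start_urls, get_links, max_pages):
    """Crawl breadth-first, one level at a time; finalise at the last
    complete level that fits within max_pages (a sketch of the behaviour
    described above; get_links is a hypothetical outlink lookup)."""
    seen = set(start_urls)
    finalised = []            # pages in fully crawled levels
    level = list(start_urls)
    while level:
        if len(finalised) + len(level) > max_pages:
            break             # this level would exceed the limit: stop here
        finalised.extend(level)
        next_level = []
        for url in level:
            for link in get_links(url):
                if link not in seen:
                    seen.add(link)
                    next_level.append(link)
        level = next_level
    return finalised
```

For example, with a limit of 4 pages and a site whose third level contains 2 more pages, the crawl returns only the first two complete levels rather than a partial third level.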
Pagination
Google has started to treat paginated pages that use rel=next and rel=prev markup as a set of pages, rather than as individual pages.
DeepCrawl will now detect your pages with rel next/prev markup and categorise them accordingly.
This new feature provides a more accurate count of unique pages, and will help you test your pagination markup implementation.
We’ve added three new reports to show you your pagination setup.
A page with a rel=prev tag cannot be the first page in a set of results. We call these pages ‘Paginated’ and now report them separately from Unique Pages.
This means that the unique page count is a more accurate reflection of the valuable pages on your site.
Any page with a rel=next but no rel=prev must be the first page in a set of paginated pages. We call these ‘First Pages’. They are treated as Unique Pages but are also included in a new report under Content > Pagination so you can see them.
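The classification rules above can be sketched as a simple function. This is an illustrative sketch, not DeepCrawl’s code: each page is represented as a dict whose `rel_next`/`rel_prev` keys hold the tag targets, or None when the tag is absent.

```python
def classify_pagination(pages):
    """Classify crawled pages by their rel=next/rel=prev markup,
    mirroring the report logic described above (a sketch; the page
    dict shape is an assumption for illustration)."""
    unique, first_pages, paginated = [], [], []
    for page in pages:
        if page.get("rel_prev"):
            paginated.append(page["url"])     # has rel=prev: not a first page
        elif page.get("rel_next"):
            first_pages.append(page["url"])   # rel=next only: first in a set
            unique.append(page["url"])        # first pages count as unique
        else:
            unique.append(page["url"])        # no pagination markup
    return {"unique": unique,
            "first_pages": first_pages,
            "paginated": paginated}
```

So a three-page result set contributes only its first page to the Unique Pages count, while pages 2 and 3 land in the Paginated report.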
Unlinked Paginated Pages
Any new URLs discovered in rel next/prev tags will be crawled even if they are not linked from any other pages. Since a paginated page that is not linked anywhere on the site could indicate an error, we include these in a new report called Unlinked Paginated Pages.
Title Pixel Width
Google SERPs now display title tags based on the pixel width of your title text and the amount of text shown has been reduced.
To reflect this, DeepCrawl will now measure the length of your titles in pixel widths.
Unfortunately, there is no single title length that will be shown in all searches: for example, titles shown in mobile searches are much shorter than those on desktop. We have therefore given you more control over when you see maximum title length warnings in your reports.
In Report settings, you can adjust max title length in pixels and also a lower ‘safe’ title length according to your project’s requirements.
Report > Content > Max Title Length: This report will now use the above pixel measurements instead of the character count.
Report > Content > Max Title Warnings: This new report will show all titles that are longer than the safe length, but shorter than the maximum length, as defined in report settings.
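The two thresholds work together as a simple band check, which can be sketched as below. The per-character pixel widths and the default thresholds are illustrative assumptions, not Google’s real font metrics or DeepCrawl’s defaults.

```python
# Rough per-character pixel widths at a SERP-like title font size
# (illustrative assumptions, not Google's actual metrics).
CHAR_WIDTHS = {"i": 4, "l": 4, "m": 15, "w": 14, " ": 5}
DEFAULT_WIDTH = 9

def title_pixel_width(title):
    """Estimate a title's rendered width in pixels."""
    return sum(CHAR_WIDTHS.get(ch.lower(), DEFAULT_WIDTH) for ch in title)

def title_status(title, safe_px=460, max_px=512):
    """Mirror the two reports: 'warning' for titles between the safe and
    max lengths, 'too long' beyond the max (thresholds are configurable
    assumptions, matching the adjustable Report settings above)."""
    width = title_pixel_width(title)
    if width > max_px:
        return "too long"
    if width > safe_px:
        return "warning"
    return "ok"
```

Note how a narrow title like “Illinois” fits within a limit that a wide title of the same character count, such as “Mammoths”, would exceed; that is why pixel width is a better measure than character count.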
Improvements To Issues List
Our issue tracking function has been significantly improved.
Issues can be created from any report including any filtering you have applied.
Once you have flagged an issue, we will track the number of instances of the issue in the report. You will be shown a trend of how the issue changes over time.
You can now include other users, who can also add notes; they will receive an email update after every crawl.
You can also see all issues across every project, in a single view, from the top level of the interface.
Duplicate Precision Settings
As well as wanting to see duplication caused by identical pages, most SEOs also want to know about duplication on similar pages.
However, everybody has a different opinion of what they consider to be a duplicate page.
To help give you control, you can now adjust the sensitivity of the duplication detection filter.
Increasing the Duplicate Precision option in Report Settings above the default value of 2 requires a greater degree of duplication before a page is marked as a duplicate; reducing the value requires less duplication.
Improved Report Filenames
The filenames for reports are now more descriptive and therefore easier to work with.
Change Reports From TSV To CSV
All report downloads will now default to CSV instead of TSV.
Googlebot User Agents Update
Google recently made some minor changes to their mobile user agents. These changes are now reflected in DeepCrawl’s user agent options.
We hope that all these changes are intuitive but let us know what you think.
We’re planning some major improvements in the next version, which will take a bit longer than usual, but it will be worth the wait.