Backlinks are links from other websites to yours: they bring users, PageRank, and link equity, which can then trickle through your site’s architecture.

We regularly see cases where a website has attracted great links from other sites, but over the years that website changes. Products get discontinued, platforms are migrated, we all move to more secure URLs, and backlinks are forgotten. This is a big problem for many websites: for some, a large portion of their authority is tied to pages which no longer exist.

Today we released six new backlink reports to help you identify issues and monitor the health of your backlinked pages.

Broken pages with backlinks are potentially the most severe, yet easiest-to-fix, backlink issue. These are URLs which served useful pages in the past, but whose destination pages no longer exist. This happens for several reasons: products which were discontinued; legacy URL structures which were lost during a site migration; and pages which have been renamed or moved around the site.

Not only is the PageRank/link equity flowing to these pages completely lost, but so are any users who happen to follow those backlinks.

This report will show you every URL which has backlinks but returns a 4xx/5xx status code. You should address this issue by either setting up a redirect to a relevant page or restoring the page to bring it back to life.
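If you want to spot-check this yourself between crawls, the sketch below reads a list of backlinked URLs and reports any that no longer resolve. It is a minimal illustration, not DeepCrawl’s implementation; the file name linked_pages.csv and its “Your pages” column are assumptions based on the Search Console export described later.

```python
import csv

import requests

# Read the backlinked URLs from a Search Console-style export.
with open("linked_pages.csv", newline="", encoding="utf-8") as f:
    urls = [row["Your pages"] for row in csv.DictReader(f)]

for url in urls:
    try:
        # HEAD keeps the check lightweight; allow_redirects=False so a
        # redirecting URL isn't mistaken for a live destination page.
        status = requests.head(url, allow_redirects=False, timeout=10).status_code
    except requests.RequestException:
        status = None  # DNS failure, timeout, connection refused, etc.
    if status is None or status >= 400:
        print(f"{url} -> {status}  (broken backlinked page)")
```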

Set up a lost backlink landing page alert:

You can be instantly notified of any backlinked URLs which begin to return an error page. Set up a task for this report, and we’ll notify you of any changes the next time we crawl your website. Find out what’s been added, removed or is missing, all in one place.

This report shows URLs which have backlinks but are blocked from crawling by your robots.txt file. As with error pages, search engines are unable to pass any PageRank or link equity to these types of pages. This report is a quick way for you to make sure your best backlinks aren’t accidentally being blocked, causing you to miss ranking opportunities.
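For a quick offline check, the standard library’s robotparser can test backlinked URLs against your live robots.txt. A minimal sketch, assuming urls holds the backlinked URLs from your export:

```python
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

urls = ["https://example.com/some-backlinked-page"]  # hypothetical input

parsers = {}  # cache one parser per host so robots.txt is fetched once
for url in urls:
    parts = urlparse(url)
    host = f"{parts.scheme}://{parts.netloc}"
    if host not in parsers:
        rp = RobotFileParser(f"{host}/robots.txt")
        rp.read()  # fetch and parse the live robots.txt
        parsers[host] = rp
    # "*" checks the generic rule; swap in "Googlebot" to test a specific agent.
    if not parsers[host].can_fetch("*", url):
        print(f"{url} is blocked by robots.txt")
```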

Redirecting URLs which have backlinks are not necessarily negative: there are several instances where this would be the correct setup. However, it is important to review these redirecting pages to ensure that they are sending users and link equity to the right pages.
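When reviewing a redirecting URL, it helps to see the whole chain, not just the first hop. A small sketch using the requests library, with a hypothetical backlinked URL:

```python
import requests

url = "https://example.com/old-product-page"  # hypothetical backlinked URL

r = requests.get(url, allow_redirects=True, timeout=10)
for hop in r.history:  # each intermediate response in the redirect chain
    print(f"{hop.status_code}  {hop.url}")
print(f"final: {r.status_code}  {r.url}")
```

Long chains, or 302s where a permanent 301 was intended, are worth fixing even when the end point is correct.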

Some of your backlinked pages may be non-indexable – they could be canonicalised to another URL, or have the ‘noindex’ robots directive. These pages are not strictly a problem, but are often an opportunity.

Canonicalised pages may have gained backlinks because they are useful to users, but were not discoverable in search. It is worth reconsidering whether these pages should be indexable. For instance, if your “Silk Dresses” page canonicalises to your main “Dresses” page but websites are linking to it, it makes sense that search engines may also want to send users there.
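To review these at scale, you can compare each backlinked URL against its declared canonical. A rough sketch using requests and BeautifulSoup (both third-party libraries; the URL is hypothetical):

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com/silk-dresses"  # hypothetical backlinked URL

html = requests.get(url, timeout=10).text
tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
if tag and tag.get("href") and tag["href"].rstrip("/") != url.rstrip("/"):
    print(f"{url} canonicalises to {tag['href']} - worth reviewing")
```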

This report keeps track of your backlinked pages which have no links to other pages, or which contain the meta nofollow tag. To help users find other parts of your website and let search engines pass PageRank/link equity, your backlinked pages should contain links to other pages. Solving this issue can be as simple as adding a link to your homepage, or as comprehensive as including your core navigation bar. A meta nofollow tag creates the same effect as having no links: search engines are unable to pass link equity to the rest of your site.
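A companion sketch that checks both conditions on a given page, again using requests and BeautifulSoup with a hypothetical URL:

```python
import requests
from bs4 import BeautifulSoup

url = "https://example.com/landing-page"  # hypothetical backlinked URL

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

# A meta robots "nofollow" stops link equity flowing out of the page.
meta = soup.find("meta", attrs={"name": "robots"})
nofollow = bool(meta and "nofollow" in meta.get("content", "").lower())

# Count the outgoing links the page actually contains.
links = [a["href"] for a in soup.find_all("a", href=True)]

if nofollow or not links:
    print(f"{url}: nofollow={nofollow}, outgoing links={len(links)}")
```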

To take advantage of these backlink reports, you’ll need to upload information about your backlinks within the Crawl Sources tab.

We only support CSVs in one format: the default Google Search Console CSV.

Go to Google Search Console and find the “All Linked Pages” report: select “Search Traffic” > “Links To Your Site”, then choose the “More >” link under “Your most linked content”.

Get the CSV by clicking “Download this table”.

Upload this CSV directly to the Backlink tab of the Crawl Source Settings in DeepCrawl. You can upload several backlink CSVs, so repeat this process for each subdomain that you have verified in Google Search Console.
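Before uploading, it can be worth a quick sanity check that the export carries the expected header row. A minimal sketch; the file name is hypothetical:

```python
import csv

EXPECTED = ["Your pages", "Links", "Source domains"]

with open("gsc_linked_pages.csv", newline="", encoding="utf-8") as f:
    header = next(csv.reader(f))  # first row of the export

if header[:3] != EXPECTED:
    print(f"Unexpected columns: {header} (expected {EXPECTED})")
else:
    print("Header looks right - ready to upload")
```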

The data from Google Search Console is normally sampled, which makes it difficult to know whether you’re getting a full picture from their exports. If you have access to a third-party link tool such as Majestic, Ahrefs, or Open Site Explorer, you can also upload that data.

The only requirement is that you organise the data into a CSV with three columns: “Your pages”, “Links” and “Source domains”. To make life a little easier, we have created an Excel macro to automatically reorganise this for you:

Download our Excel CSV reformatter

Majestic

Find your site in Majestic, choose the “Pages” report, and export this data to a CSV.

Copy the following columns into a new spreadsheet and rename the column headers:

URL – rename to Your pages
ReferringExtBackLinks – rename to Links
ReferringExtDomains – rename to Source domains

Save this worksheet as a CSV and upload it to DeepCrawl.

Ahrefs

In Ahrefs, choose the Pages > Best by links report, and export this data to a CSV.


Copy the following columns into a new spreadsheet and rename the column headers:

Page URL – rename to Your pages
Dofollow Backlinks – rename to Links
Referring Domains – rename to Source domains

Save this worksheet as a CSV and upload it to DeepCrawl.

Open Site Explorer

In Open Site Explorer, choose the Top Pages report and export the data to a CSV.


Copy the following columns into a new spreadsheet and rename the column headers:

URL – rename to Your pages
Total Links – rename to Links
Number of Linking Root Domains – rename to Source domains

Save this worksheet as a CSV and upload it to DeepCrawl.
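If you would rather script this step than use the Excel macro, the sketch below applies the three column mappings listed above using pandas. The input file names are hypothetical, and it assumes each export is already saved as a CSV:

```python
import pandas as pd

# Source-column -> DeepCrawl-column mappings, per tool, as listed above.
MAPPINGS = {
    "majestic.csv": {
        "URL": "Your pages",
        "ReferringExtBackLinks": "Links",
        "ReferringExtDomains": "Source domains",
    },
    "ahrefs.csv": {
        "Page URL": "Your pages",
        "Dofollow Backlinks": "Links",
        "Referring Domains": "Source domains",
    },
    "ose.csv": {
        "URL": "Your pages",
        "Total Links": "Links",
        "Number of Linking Root Domains": "Source domains",
    },
}

for filename, mapping in MAPPINGS.items():
    df = pd.read_csv(filename)
    # Rename the headers and keep only the three columns DeepCrawl expects.
    df = df.rename(columns=mapping)[list(mapping.values())]
    df.to_csv(filename.replace(".csv", "_deepcrawl.csv"), index=False)
```

Once reformatted, upload each file to the Backlink tab just as you would the Search Console export.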
