Backlinks are an important factor in search rankings, as they signal trust and authority for the pages on a website. However, a number of best practices need to be followed, including ensuring links are relevant and not seen as spam. Our Hangout Notes cover these, along with other key insights from Google.

Disavowing a redirected URL may be enough to prevent the passing of poor link signals

December 6, 2021 Source

One participant asked for the best approach when dealing with a redirecting URL that has poor quality backlinks (in this case, a page with 18k spammy links was redirecting to the site homepage). If the rest of the destination URL’s backlink profile is relatively healthy, it may be enough to disavow that redirect alone. Disavowing all of the backlink spam would still be ideal, but in cases like this the outcome probably doesn’t warrant the additional time and effort.
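As a sketch of what that could look like: a disavow file is a plain-text (UTF-8) list with one entry per line, where an entry is either a full URL or a `domain:` rule, and lines starting with `#` are comments. The URLs below are hypothetical stand-ins for the redirecting page in this scenario:

```text
# Hypothetical disavow file: cut off the redirecting URL itself
# rather than listing all 18k spammy source links individually.
https://spam-redirects.example/old-page

# Alternatively, disavow every link from the intermediate domain:
domain:spam-redirects.example
```

The file is then uploaded via the disavow links tool in Google Search Console.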

Backlink spam issues are unrelated to core algorithm updates

November 17, 2021 Source

A site owner mentioned that they saw visibility drops after a core algorithm update. Around that time they were working to resolve technical issues, such as 404 pages in their sitemap, but suspected the drop might also be related to spammy backlinks pointing to their site. John replied that if you’re seeing changes after core updates, backlink spam issues are likely unrelated to the update. Core updates are more about understanding your site’s overall quality and relevance, and less about backlink spam or specific technical issues. He emphasized that overall quality and relevance are the more important aspects to focus on after a core update.

Google algorithms can still lose trust in sites displaying a strong pattern of manipulative links

October 30, 2021 Source

Google tries to isolate and ignore spammy or toxic backlinks. However, there are still cases where a site with a very strong pattern of unnatural links can lose trust and be penalized with a drop in visibility.

Unnatural Links Can Hurt a Site Regardless of Algorithm Updates

November 26, 2019 Source

If you believe your site is suffering from poor link-building schemes, John recommends focusing on removing these unnatural links, regardless of any algorithmic updates. This can be done in a number of ways, including using the disavow file or removing links from the source site.

Disallowed Pages With Backlinks Can be Indexed by Google

July 9, 2019 Source

Pages blocked by robots.txt cannot be crawled by Googlebot. However, if a disallowed page has links pointing to it, Google can determine that it is worth indexing despite not being able to crawl the page.
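For illustration (the path is hypothetical), a robots.txt Disallow rule only blocks crawling, not indexing:

```text
# robots.txt — blocks crawling of /private/, but Google can still
# index those URLs (without their content) if other pages link to them.
User-agent: *
Disallow: /private/
```

To reliably keep such a page out of the index, Google’s documented approach is to allow crawling and serve a noindex robots meta tag or X-Robots-Tag header instead, since a noindex on a robots.txt-blocked page can never be seen by Googlebot.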

Web Spam Team Can Issue Targeted Manual Actions Against Pages With Unnatural Linking

May 31, 2019 Source

The Web Spam team can take targeted manual action against websites with unnatural linking by choosing to disregard links to individual or small groups of pages.

Value of Backlinks Changes Over Time as a Site Grows in Size & Adds More Links

May 1, 2019 Source

Backlinks don’t lose value simply because of their age; rather, PageRank is distributed more broadly over time as the linking site publishes more pages and increases its total number of links.

Google Doesn’t Place Fixed Weight on Backlinks in its Algorithms

April 18, 2019 Source

Google doesn’t place a fixed weight on backlinks in its algorithms; the weight varies from case to case. For example, a new article covering a trending topic may rank well despite having few links, simply because it is so fresh.

Rather Than Redirecting Temporary Pages, Encourage Users to Link to More Permanent Pages

February 22, 2019 Source

If you have a page that will only be live for a short amount of time, instead of trying to consolidate link equity through redirects, John recommends encouraging users to link to more permanent pages on your site like category pages.

Related Topics

Internal Linking, Anchor Text, External Linking, Disavow, Deep App Links