Web Spam

Web spam is a tactic used by webmasters to manipulate search engines in order to rank better in search results. This may be done through spam links, hacked content, or malware from third-party tags, among many other unnatural methods. Our SEO Office Hours notes below cover how Google handles what it deems spam, along with advice for avoiding it.

Learn more about SEO best practices for website content in Lumar’s Website Intelligence Academy

Google Takes The Disavow File into Account Immediately

When Google recrawls the URLs specified in the disavow file, it drops the disavowed links immediately. However, they will still appear in the Links report in Google Search Console.
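
For reference, the disavow file is a plain UTF-8 text file with one entry per line: either a full URL or a domain: rule, with # marking comments. A minimal illustrative sample (all domains and URLs are placeholders):

```
# Illustrative disavow file; every entry is a placeholder
domain:spammy-link-network.example.com
https://blog.example.org/paid-links-post.html
```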

3 May 2019

Use URL Inspection Tool to Fetch Live HTML & Check if Site Has Been Hacked

If it isn’t visually apparent that a site has been hacked but it is ranking for unusual terms, use the URL Inspection Tool in GSC to fetch the live HTML of the ranking pages and check for hacked content that Google may be seeing.
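
For a rough command-line approximation before opening GSC, you can fetch the live HTML with a Googlebot user-agent and scan it for the unexpected terms. A minimal Python sketch, assuming the requests library (the URL and terms are placeholders); note that hacked content cloaked by IP address will only surface via the URL Inspection Tool itself, since that fetch comes from Google's infrastructure:

```python
# Spot check: fetch live HTML with a Googlebot user-agent and scan it
# for terms the site shouldn't be ranking for. Hacked content is often
# shown only to crawler user-agents, so a normal browser view can miss it.
import requests

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
url = "https://example.com/page-ranking-for-odd-terms"  # placeholder URL

html = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10).text.lower()
for term in ("casino", "viagra", "replica"):  # placeholder spam terms
    if term in html:
        print("Suspicious term in live HTML:", term)
```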

8 Apr 2019

Reduce Number of Unique Downloads if Seeing ‘Uncommon Downloads’ Issue in GSC

The ‘uncommon downloads’ warning in the Security Issues section of GSC can be caused by sites that generate a unique download for each user. Reduce the number of unique files so that Googlebot can easily check them before a user accesses them.
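
The underlying fix is to serve one stable, scannable file per artifact and move any per-user tracking out of the file and its URL. A hypothetical sketch using Flask (the route, file, and directory names are all illustrative):

```python
# Hypothetical sketch: one canonical download URL per artifact, so
# Googlebot and every user fetch the same file. Per-user tracking is
# carried in a query parameter instead of a uniquely generated file.
from flask import Flask, request, send_from_directory

app = Flask(__name__)

@app.route("/downloads/whitepaper.pdf")
def whitepaper():
    user = request.args.get("user", "anonymous")  # tracking, not uniqueness
    app.logger.info("whitepaper.pdf served to %s", user)
    return send_from_directory("downloads", "whitepaper.pdf")
```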

7 Apr 2019

Malware Can be Served via Your Site if You Serve External Content via JavaScript

If you serve third-party content via JavaScript, be aware that if the websites hosting that content are hacked, or their JS code is edited to serve malware, Google will flag your site as serving malware.
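
One standard safeguard (not mentioned in the note itself) is Subresource Integrity: pin the third-party script to a hash of its known-good contents so the browser refuses to run it if the file changes. A sketch with a placeholder URL and hash; note this only suits scripts whose contents are versioned rather than updated in place:

```html
<!-- SRI sketch: the integrity value is a placeholder and must be
     generated from the real file. If the host is compromised and
     widget.js changes, the browser will refuse to execute it. -->
<script src="https://cdn.example.com/widget.js"
        integrity="sha384-PLACEHOLDER_BASE64_HASH"
        crossorigin="anonymous"></script>
```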

16 Nov 2018

The Web Spam Team is Actively Working to Deal With Sites Manipulating Rankings With Expired Domains

The Web Spam Team is actively working to deal with sites looking to manipulate rankings using expired domains. John says that this isn’t a simple loophole that’s being exploited, but there may be cases where people are getting away with these spammy techniques.

24 Aug 2018

Domain Migrations Take Longer if the New Domain Has Problematic History

Google can take longer to understand pages following a domain migration when the new domain has a problematic history, e.g. spammy content and/or links.

1 Jun 2018

Keyword Stuffing on Homepage Can Cause Lower Level Pages to Rank Instead

There are instances where Google may rank a lower-level page in place of the homepage. This might happen if Google detects heavy keyword stuffing on the homepage and cannot tell whether the page is relevant, in which case a lower-level page may rank instead.

1 Jun 2018

Reconsideration Requests Can be Sent Via HTTP or HTTPS Version in GSC

When submitting a reconsideration request, it doesn’t matter whether you send it through the HTTP or HTTPS version of the property in Search Console. However, messages from Google may be sent to either or both versions.

1 Jun 2018
