
Google Search Console Tips

Google Search Console (previously called Webmaster Tools) is a free tool provided by Google for website owners to monitor performance and traffic. GSC also offers search engine optimization recommendations and fixes. Our SEO Office Hours recaps below cover a range of advice from Google to help you better understand and get the most out of this fundamental SEO tool.

Manual Penalties are specific to www/non-www and HTTP/HTTPS

Manual action penalties are specific to the www/non-www and HTTP/HTTPS versions of a domain, so you need each variant set up as a property in Search Console to see all of them.

29 Jul 2016

404s Are Recrawled Periodically

Google will remember your 404 URLs for a long time and periodically recrawl them to check whether they still return a 404. These URLs will be reported in Search Console, but this is perfectly normal and nothing to fix.

8 Jul 2016

Some Search Console Reports are Sampled

Some Search Console reports, such as the structured data and AMP reports, are based on a significant sample of the primary URLs and won't include every possible URL.

1 Jul 2016

Robots.txt Overrides Parameter Settings

URL Parameter settings in Search Console are a hint for Google, which it will validate periodically. A robots.txt disallow overrides the parameter handling, so it's better to use the parameter tool to consolidate duplicate pages rather than disallowing them.
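As an illustration of why this matters (the domain and the sessionid parameter here are hypothetical), a robots.txt disallow stops Google from crawling the parameterised URLs at all, which means the duplicate signals can never be consolidated onto the clean URL:

```text
# robots.txt — this blocks crawling of parameterised URLs entirely,
# so Google never sees the duplicates and cannot consolidate signals:
User-agent: *
Disallow: /*?sessionid=

# Better: leave the URLs crawlable and tell Google the parameter
# doesn't change page content via the URL Parameters tool instead.
```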

17 May 2016

Mobile Sites Don’t Need Sitemaps

Separate mobile sites should be canonicalising to the desktop pages, so you don't need to submit them to Google via a sitemap, but it's still worth adding them to Search Console as a property.
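For reference, the standard separate-URLs setup (using a hypothetical m.example.com mobile site) pairs a rel="alternate" annotation on the desktop page with a rel="canonical" on the mobile page pointing back to the desktop version:

```html
<!-- On the desktop page (https://www.example.com/page) -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page (https://m.example.com/page) -->
<link rel="canonical" href="https://www.example.com/page">
```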

17 May 2016

Set Geographic Regions for non-ccTLDs on CDNs

If you have a generic top-level domain which doesn’t specify a country, and you use a CDN, you should make sure you have the geographic settings applied in Search Console.

8 Mar 2016

Google Queues Large Volumes of New URLs

If Google discovers a part of your site with a large number of new URLs, it may queue those URLs and generate a Search Console error, but will continue to crawl the queue over an extended period.

8 Mar 2016

Disallow Rule Must Start with a Slash

If you’re specifying a path in the robots.txt file, you must start with a slash, not a * wildcard. This was always true, but was only recently added to the documentation and Search Console testing tool.
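A quick sketch of the valid versus invalid form described above (the paths are hypothetical):

```text
User-agent: *

# Valid — the path starts with a slash:
Disallow: /private/

# Invalid per this guidance — the rule starts with a * wildcard
# instead of a slash:
# Disallow: */private/
```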

26 Feb 2016

Search Console Search Analytics Includes Universal Search

The Search Console impression and click data includes universal search results, such as images, which can make the numbers misleading if you assume they only cover web results.

12 Feb 2016

Use Fetch and Render to Test Overlays

If you have some kind of overlay or pop-up on landing pages, you can check what it looks like to Googlebot using the Fetch and Render tool in Search Console, and check the impact on engagement using web analytics.

12 Feb 2016
