
29 Quick Tips to Get More Out of Google Search Console


Google Search Console, Webmaster Tools… call it what you like, Google’s SEO suite needs no introduction for any technical SEO.

But are you getting the most out of it? Here are 29 tricks we use to maximize our use of this powerhouse of a tool.
 

Permission levels

  1. ‘Full’ isn’t the highest permission level, and users limited to it will miss out on critical alerts. Make sure you and everyone who needs those alerts is set as an owner.
  1. Google now only sends most site notifications to the direct owners of the site; owners of parent properties will no longer be sent messages for their child properties.

 

Verification

  1. Ensure your development team and everyone who has access to delete files on the web server understands the importance of verification files and meta tags: these often get deleted accidentally (a monitoring sketch follows this list). DNS verification is surprisingly easy to add, because sysops changes are generally quick, and DNS records are the least likely to ever be removed by accident.
  1. The Google Analytics method seems to fail if there is more than one analytics tracking code on a page.
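To catch accidental deletions early, you could run a small check like the one below on a schedule. This is a minimal sketch covering the meta tag and HTML file methods only; the site URL, token, and file name are placeholders for your own values.

```python
# Minimal monitor for Search Console verification (sketch).
# SITE_URL, VERIFICATION_TOKEN and VERIFICATION_FILE are placeholders
# for your own site and the values Google gave you.
import requests

SITE_URL = "https://www.example.com/"
VERIFICATION_TOKEN = "your-google-site-verification-token"   # from the meta tag
VERIFICATION_FILE = "google1234567890abcdef.html"            # HTML file method

def meta_tag_present() -> bool:
    """Check that the google-site-verification meta tag is still on the homepage."""
    html = requests.get(SITE_URL, timeout=10).text
    return "google-site-verification" in html and VERIFICATION_TOKEN in html

def verification_file_present() -> bool:
    """Check that the verification file still returns a 200."""
    return requests.get(SITE_URL + VERIFICATION_FILE, timeout=10).status_code == 200

if __name__ == "__main__":
    if not meta_tag_present():
        print("WARNING: verification meta tag missing - ownership may lapse")
    if not verification_file_present():
        print("WARNING: verification file missing - ownership may lapse")
```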

 

Links data

  1. The Latest Links report gives you the last 100,000 links found by Google. For a small site, this will probably cover its full history going back many years.
  1. Unfortunately the Latest Links report only gives you the source URL, so you have to crawl those pages yourself to get other data such as the target URL and anchor text (see the sketch after this list).
  1. If you want variety, the Linked URLs list is a great source of URLs as it includes new and old URL structures.
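A quick way to do that recrawl is sketched below. It assumes you have downloaded the Latest Links export as a CSV with the linking page in the first column; the file name and domain are placeholders, and any crawler (including DeepCrawl) will do the same job at scale.

```python
# Sketch: recrawl source URLs from a Latest Links export to recover
# the target URL and anchor text, which the report itself doesn't include.
# Assumes a CSV export with the linking pages in the first column, and
# that links to YOUR_DOMAIN are the ones you care about.
import csv
import requests
from bs4 import BeautifulSoup

YOUR_DOMAIN = "example.com"          # placeholder for your own domain
EXPORT_FILE = "latest_links.csv"     # Latest Links export from Search Console

with open(EXPORT_FILE, newline="") as f:
    reader = csv.reader(f)
    next(reader, None)               # skip the header row
    source_urls = [row[0] for row in reader if row]

for source in source_urls[:100]:     # sample; work through the full list in batches
    try:
        html = requests.get(source, timeout=10).text
    except requests.RequestException:
        continue
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        if YOUR_DOMAIN in a["href"]:
            print(source, "->", a["href"], "|", a.get_text(strip=True))
```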

 

Sitemaps

  1. Submit ‘multi-dimensional’ Sitemaps for the highest level of detail on indexing. Use a matrix of Sitemaps, with the same URL included in multiple Sitemaps, to identify the most likely candidates for non-indexing. For example, a property site could have one Sitemap for sales and one for lettings, which together contain every property, plus further Sitemaps organized by number of bedrooms, which repeat all the properties and give a second view on indexing (a sketch of this approach follows the list).
  1. Submit RSS feeds as well as a Sitemap to help Google index updates faster. Since they only contain the latest updates to your site, RSS feeds are usually smaller and are downloaded more often than Sitemaps.
  1. Add PubSubHubbub (as recommended by John Mueller) for the fastest possible indexing, better even than a Google News Sitemap.
  1. You can submit an XML Sitemap containing expired pages (those that return a 404) to help get them removed from the index more quickly. It’s best to put them into a separate Sitemap so you can see them separately from your other indexable URLs. You should remove them once they have dropped out of the index, to make sure index counts reflect the real state of your site.
  1. You can also submit redirecting URLs as a one-off to get the redirects discovered quicker. Submit Sitemaps with changed URLs; Search Console will report errors, because the Sitemap only contains redirecting URLs, but this isn’t a problem in this case.
  1. Mobile Sitemaps are for feature phone pages only, not smartphone-compatible pages. If you have smartphone URLs that are different to your desktop URLs, you don’t need to submit them at all.
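As a rough illustration of the property-site example above, the sketch below writes the same URLs into two independent Sitemap sets, one split by listing type and one by bedroom count, so every URL gets two views on indexing. The property records and file names are purely illustrative.

```python
# Sketch: write 'multi-dimensional' Sitemaps, where every URL appears in
# two independent Sitemap sets (by listing type and by bedroom count).
# The `properties` records and file names are illustrative only.
from collections import defaultdict
from xml.sax.saxutils import escape

properties = [
    {"url": "https://www.example.com/sale/flat-1", "type": "sales", "beds": 2},
    {"url": "https://www.example.com/rent/house-9", "type": "lettings", "beds": 3},
]

def write_sitemap(filename, urls):
    """Write a minimal XML Sitemap for the given URLs."""
    with open(filename, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        f.write("</urlset>\n")

# Dimension 1: by listing type (sales vs lettings).
by_type = defaultdict(list)
# Dimension 2: by number of bedrooms.
by_beds = defaultdict(list)
for p in properties:
    by_type[p["type"]].append(p["url"])
    by_beds[p["beds"]].append(p["url"])

for listing_type, urls in by_type.items():
    write_sitemap(f"sitemap-{listing_type}.xml", urls)
for beds, urls in by_beds.items():
    write_sitemap(f"sitemap-{beds}-bed.xml", urls)
```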

 

HTML errors

  1. Search Console will report pages as having duplicate titles, even if they have been canonicalized.
  1. The Soft 404 errors report in Search Console is a quick way to identify thin pages that Google doesn’t think have any original content.

 

URL parameters

  1. The URL Parameters tool is probably the most valuable crawl efficiency tool out there. It overrides other systems to allow URLs to be cleaned up before they are even seen by the crawler, and you can structure your URLs within the tool to ensure you get exactly what you need (the sketch below shows a quick way to audit which parameters your site actually uses). For more information, see our parameter management guide.
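Before configuring the tool, it helps to know which parameters actually appear across your URLs. The sketch below assumes a plain list of crawled or logged URLs in a file called urls.txt (an assumption, not a Search Console export) and simply tallies the parameter names.

```python
# Sketch: tally the query parameters used across a list of URLs,
# to decide which ones to configure in the URL Parameters tool.
# urls.txt (one URL per line) is an assumed export from a crawl or log file.
from collections import Counter
from urllib.parse import urlsplit, parse_qsl

param_counts = Counter()
with open("urls.txt") as f:
    for line in f:
        query = urlsplit(line.strip()).query
        for name, _ in parse_qsl(query, keep_blank_values=True):
            param_counts[name] += 1

# Parameters seen most often are usually the biggest crawl-budget drains.
for name, count in param_counts.most_common(20):
    print(f"{name}: {count} URLs")
```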

 

Save your history

  1. It’s a good idea to save exports from key Search Console reports so you can build up a history of the site and refer back if any problems crop up. We’d suggest keeping reports for up to three years, but this is only a guide. You could start with the following reports (a sketch for automating the Search Analytics export follows the list):
    • Latest Links (most useful for sites with lots of backlinks)
    • Crawl Errors
    • Search Analytics data (Clicks, Impressions, CTR, Position)
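If you want to automate the Search Analytics export, a sketch using the Search Console (Webmasters v3) API is below. It assumes a service account that has been added as a user on the property; the site URL, date range, and credentials file name are placeholders.

```python
# Sketch: save a Search Analytics export via the Search Console API,
# so you build up your own history beyond the dashboard's retention window.
# Assumes a service account added as a user on the property; SITE_URL and
# the credentials file name are placeholders.
import csv
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)
service = build("webmasters", "v3", credentials=creds)

response = service.searchanalytics().query(
    siteUrl=SITE_URL,
    body={
        "startDate": "2016-01-01",
        "endDate": "2016-01-31",
        "dimensions": ["date", "query"],
        "rowLimit": 5000,
    },
).execute()

with open("search_analytics_export.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["date", "query", "clicks", "impressions", "ctr", "position"])
    for row in response.get("rows", []):
        writer.writerow(row["keys"] + [row["clicks"], row["impressions"],
                                       row["ctr"], row["position"]])
```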

 

Weaknesses and bugs

  1. Crawl Errors in News Sitemaps are delayed by two days, so you won’t know why an article was rejected until a few days after it is published.
  1. Graphs in Search Console show Crawl Error data based on crawling, so changes only appear when pages are recrawled or when errors expire after 90 days. To get more meaningful results, plot the data by the number of errors occurring within your required timeframe (see the sketch after this list).
  1. The Search Console Robots.txt testing tool doesn’t work exactly the same way as Google, with some subtle differences in conflicting Allow/Disallow rules that are the same length. The robots.txt testing tool reports these as Allowed, however Google has said that ‘if the outcome is undefined, robots.txt evaluators may choose to either allow or disallow crawling. Because of that, it’s not recommended to rely on either outcome being used across the board.’ For more detail, read this Webmaster Central Help Forum discussion.
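One way to do that replotting, assuming you have saved a Crawl Errors export as a CSV with a first-detected date column (the column name below is an assumption; adjust it to match your export), is sketched here.

```python
# Sketch: replot Crawl Errors by when they were first detected, rather than
# relying on the Search Console graph, which only moves on recrawls or expiry.
# Assumes a Crawl Errors export saved as crawl_errors.csv; the column name
# "First detected" is an assumption - adjust it to your export.
import pandas as pd

errors = pd.read_csv("crawl_errors.csv")
errors["First detected"] = pd.to_datetime(errors["First detected"])

# Count how many errors first appeared in each week.
weekly = (errors.set_index("First detected")
                .resample("W")
                .size())
print(weekly)
# With matplotlib installed, weekly.plot(kind="bar") gives a quick chart.
```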

 

Messages

  1. Here are all of the messages we have detected:
    • 404 not found errors
    • AdWords request to import organic search
    • Associate a YouTube channel
    • Authorization permission errors
    • Big traffic change
    • Change of address
    • Changes to sitelinks
    • Chrome is phasing out support for the Silverlight plugin
    • Crawl rate expired
    • Disavowed links updated
    • DMCA notice
    • Fix app deep links
    • Fix mobile usability issues
    • Google Analytics link
    • Google can’t access your site
    • Google+ link approved
    • Google+ link request received
    • Googlebot for smartphones found an increase in errors
    • Hacking suspected
    • High number of URLs
    • Improve search presence
    • Increase in not followed pages
    • Make your site ready for the new sitelinks search box
    • Malware
    • New feature invitation
    • New geographic target set
    • New verified owner
    • Notice of removal from Google Search
    • Owner removal incomplete
    • Phishing notification
    • Possible outages
    • Preferred domain change
    • Reconsideration request
    • Request to import organic search
    • Server errors
    • Soft 404 errors
    • Suspected hacking
    • Thin content
    • Unnatural inbound links
    • WordPress update available

 

Google Safe Browsing

  1. Search Console will incorporate the data from the Google Safe Browsing tool; the tool itself will show you if you’re linking to any hacked sites or if you’re hosting malware.

 

URL removal tool

  1. The URL removal tool will remove all www/non-www and http/https versions of the URL you’ve specified. So the URL removal tool cannot be used to resolve problems with URLs on the wrong protocol.

 

Search Analytics

  1. If you appear multiple times in the same SERP, it only counts as a single impression in the new Search Analytics dashboard (this includes local results). The highest position from the results is taken and used for average calculations.

 

Geographic Targeting

  1. If your site is international and doesn’t target any country in particular, you should choose ‘Unlisted’ in geographic settings (under Search Traffic > International Targeting > Country) rather than not specifying a targeted country.

 

Change of Address tool

  1. When migrating from HTTP to HTTPS, the Change of Address tool isn’t necessary.

 

Using Search Console with DeepCrawl and Robotto

  1. Export the URLs from the Search Console Crawl Errors report and use DeepCrawl to recrawl them; this will give you an updated report on their status and allow you to rule out any errors that have been fixed since they were found by Googlebot (a simple status-checking sketch follows this list).
  1. Our other tool, Robotto, will download your crawl errors, and connect with Google Analytics data to identify removed pages that were driving entry visits.
  1. Robotto also stores a history of your Search Console messages in case the originals are accidentally deleted, and will alert you if your sites trigger a security warning.
  1. Pull Search Console data directly into DeepCrawl with our GSC integration.
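A minimal version of that recheck, assuming a plain text file of exported error URLs, could look like the sketch below; DeepCrawl does the same job at scale with full reporting.

```python
# Sketch: recheck the URLs from a Crawl Errors export, so errors that have
# already been fixed (now returning 200 or redirecting) can be ruled out.
# Assumes a one-URL-per-line file exported from the Crawl Errors report.
import requests

with open("crawl_error_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

still_broken = []
for url in urls:
    try:
        status = requests.head(url, allow_redirects=False, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        still_broken.append((url, status))

for url, status in still_broken:
    print(status, url)
```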
Alyssa Ordu

Alyssa is a keen traveller, cocktails & dad jokes enthusiast who does Marketing, in that order. A lover of outreach, connect with her for opportunities to collaborate, or exchange a pun or two.
