Google Webmaster Hangout Notes: January 21st 2020

Sam Marsden

On 27th January 2020 • 4 min read

Notes from the Google Webmaster Hangout on the 21st of January 2020.

 
 

Including <meta name="robots" content="follow"> Has No Impact on Search as It Is the Default Value

Using <meta name="robots" content="follow"> has no impact on search because "follow" is the default value, so Google essentially ignores it. Apart from adding a minimal amount of extra HTML to the page it does no harm, and the only benefit of removing it is avoiding the same question coming up again in the future.
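To illustrate the point, the two page heads below are treated identically by Google; the explicit tag simply restates the default (a sketch, with illustrative markup):

```html
<!-- Page A: explicit default, no effect on search -->
<head>
  <meta name="robots" content="follow">
</head>

<!-- Page B: no robots meta tag at all, same behaviour -->
<head>
</head>

<!-- Only non-default values change behaviour, for example: -->
<!-- <meta name="robots" content="nofollow"> -->
```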

 

Options For Out of Stock Items Include Noindexing, Returning a 404, Adding Schema or Redirecting to a Replacement

Out of stock items can be handled by indicating in the HTML and in structured data that the product is unavailable. Alternatively, the page can be noindexed, return a 404, or be redirected to a replacement product.
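The structured data option uses the schema.org `availability` property on an Offer. A minimal sketch, with a hypothetical product and prices:

```html
<!-- Illustrative product page marking the item as out of stock -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/OutOfStock"
  }
}
</script>
```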

 

JavaScript Redirects Take Slightly Longer For Google to Process Than 301 Redirects

JavaScript redirects take slightly longer than 301 redirects for Google to process, as the JavaScript needs to be rendered and executed before the destination can be discovered.
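The difference comes down to where the redirect lives. A server-side 301 is visible in the HTTP response itself, while a client-side redirect is only discovered after rendering (illustrative URL):

```html
<!-- Client-side redirect: Google must fetch the page and render the
     script before it sees the destination URL. A server-side 301, by
     contrast, is visible immediately in the HTTP response headers. -->
<script>
  window.location.replace("https://example.com/new-page/");
</script>
```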

 

Hreflang May Not be Necessary For Translated Versions of Pages

Translated versions of pages don't necessarily need hreflang: because the text is in a different language, Google treats them as unique pages rather than as duplicates.
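For context, where hreflang is used, the annotations look like the following (URLs are illustrative); its purpose is matching users to the right language version rather than resolving duplication:

```html
<link rel="alternate" hreflang="en" href="https://example.com/en/page/">
<link rel="alternate" hreflang="de" href="https://example.com/de/page/">
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```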

 

Keep One Version For Image File Names Rather Than Having Multiple Translated Versions

John recommends keeping a single version of image file names rather than creating translated file names for each language. The image file name is used as a small ranking signal, but the text on the page where the image appears is a much stronger one.

 

Blocking Crawling of Parameterized URLs in GSC Will Mean PageRank Is Lost

Google counts links between canonical URLs; if a link points to a URL with no canonical destination (for example, a parameter variation blocked from crawling in GSC), the link is seen as going nowhere and isn't used. In practice this isn't usually an issue, as links rarely point to a specific parameter variation, so they are unlikely to be carrying much of the site's PageRank.

 

Safe Search Issues Can Now be Submitted to a Forum Rather Than Directly to Google

Google closed the private channel for reporting sites incorrectly filtered by Safe Search because many submissions were for sites that did contain adult or near-adult content. Safe Search issues can now be submitted in a forum, where people can receive feedback.

 

Check if Site is Impacted by Safe Search by Seeing if “&safe=active” URL Parameter Changes Results

Check whether a site is being filtered by Safe Search by performing a site: query, then turning Safe Search on by appending "&safe=active" to the results URL, and seeing whether the results change.
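Concretely, the comparison is between these two result URLs (domain is illustrative); if pages drop out of the second set, they are being filtered:

```
https://www.google.com/search?q=site:example.com
https://www.google.com/search?q=site:example.com&safe=active
```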

 

Upper Limit For Recrawling Pages is Six Months

Google tends to recrawl pages at least once every six months; six months is roughly the upper limit between recrawls of a page.

 

Check Cached Page to See if Redirect Has Been Picked up by Google

Check if Google has switched the canonical version after a redirect by seeing if the cached version of the page is the target page. You can also use the GSC URL Inspection Tool to check the canonical version.
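At the time of these notes, the cache: search operator was one way to pull up Google's cached copy (URL is illustrative); if the cached page shown is the redirect target, the redirect has been picked up:

```
cache:example.com/old-page/
```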

 


Author

Sam Marsden

Sam Marsden is DeepCrawl's SEO & Content Manager. Sam speaks regularly at marketing conferences, like SMX and BrightonSEO, and is a contributor to industry publications such as Search Engine Journal and State of Digital.

 
