
Google Webmaster Hangout Notes: August 2nd 2017


Notes from the Google Webmaster Hangout on the 2nd of August, 2017.

 

Google Attempts to Map Non-Unicode Fonts to Unicode Version

Google tries to recognise instances where non-Unicode fonts are used and attempts to map them internally to a Unicode version of that page so that it can index a single page. This is the case for the most popular font in Burma. However, Google struggles to handle non-Unicode fonts when they are entered in search queries.

 

Quality Raters Improve Algorithms Rather Than Evaluate Individual Sites

Quality raters are, for the most part, evaluating specific parts of Google's algorithms rather than flagging individual sites with regard to reputation.

 

Google Doesn’t Use Direct Traffic as a Ranking Signal

Both John Mueller and Matt Cutts recommend building sites that users want to return to directly and recommend of their own accord. However, Google doesn't explicitly measure these behaviours or use direct traffic as a ranking signal.

 

Duplicate Content On Its Own Doesn't Mean a Site is Low Quality

A website should be able to stand on its own and be somewhere users go specifically to find content. This usually means providing unique content, but the presence of duplicate content on its own doesn't make a website low quality.

 

Google Does Look at Overall Quality of Websites

Google does occasionally look at the overall quality of a website, which is made up of the quality of the individual pages on that site, so if a large proportion of pages are low quality, that will affect how Google sees the website as a whole. You can noindex low-quality content, but the best solution is to improve it.
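
As a minimal illustration, a page you decide to keep out of the index rather than improve can be excluded with a standard robots noindex meta tag in its <head>:

    <meta name="robots" content="noindex">

Google still needs to be able to crawl the page to see this tag, so don't also block the same URL in robots.txt.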

 

Use the Same Hreflang Markup on a Separate Mobile Site as on the Desktop Site

For separate mobile pages, Google recommends using the same hreflang markup as on the equivalent desktop pages – the mobile and desktop versions should be equivalent in terms of metadata and content.
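
As a rough sketch, assuming a desktop site at www.example.com with a separate mobile site at m.example.com (placeholder domains), the hreflang annotations on the mobile pages mirror those on the desktop pages, alongside the usual mobile/desktop alternate and canonical pairing:

    <!-- Desktop page: https://www.example.com/en/ -->
    <link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/en/">
    <link rel="alternate" hreflang="en" href="https://www.example.com/en/">
    <link rel="alternate" hreflang="de" href="https://www.example.com/de/">

    <!-- Mobile page: https://m.example.com/en/ -->
    <link rel="canonical" href="https://www.example.com/en/">
    <link rel="alternate" hreflang="en" href="https://m.example.com/en/">
    <link rel="alternate" hreflang="de" href="https://m.example.com/de/">

Note that the hreflang URLs on the mobile pages point at the other mobile pages, not at the desktop versions.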

 

Disallowed Pages May Take Time to be Dropped From Index

Disallowed pages may take a while to be dropped from the index if they aren't crawled very frequently. For critical issues, you can temporarily remove URLs from search results using the URL removal tool in Search Console.
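
For context, a robots.txt disallow rule like the sketch below (the path is a placeholder) only stops Google from crawling the URLs; if they were already indexed, they can linger until Google processes the change, which is why the Search Console removal tool is the quicker option for urgent cases:

    User-agent: *
    Disallow: /old-section/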

 

Google Crawls Forms On Reference Sites to Discover Content

Google has ways of crawling forms on reference/document repository type sites, as it's important that it can crawl the results, which may include individual pages or documents. However, Google tends to avoid crawling through "calculator" type forms (e.g. insurance rate forms) to find content, as it expects that the content behind such forms will be found elsewhere on the site.

 

Google Doesn't Directly Measure a Site's Customer Satisfaction

Google doesn't directly measure customer satisfaction for different sites, but if it sees that no one is recommending a site, it may infer that the site isn't that relevant for related search queries.

 

Google is Expanding Search Console Beta Group

Google is looking for sites to add to the Search Console beta group that have issues which might be solvable with the new version.

 

Updating of HTML Improvements is Dependent on Crawl Rate

The updating of HTML Improvements in Search Console depends on the crawl frequency of the website. John recommends using this report to get an idea of what you could be working on across your site, but not as a checklist that you need to get down to zero issues.

 

Search Console Reports Are Restricted to Only Show Indexed Pages Receiving Impressions

Search Console makes some simplifications in its reporting, giving a relevant sample of the pages Google has indexed. It does this because most sites have many URLs that are indexed but never shown in search. John recommends not using the counts in Search Console reports as an absolute count of all indexed pages, but instead looking at how issues trend over time.

 

New Search Console is Likely to Include Data and UI Updates

The new version of Search Console will be a mix of data and UI updates. Google has been looking at including different types of data as well as changing the amount of data shown.

 

Fetch & Render Won’t Necessarily Render Whole Page

The Fetch and Render tool has a cut-off and won't necessarily render the whole page. There is also a cut-off for indexing, but this is fairly large (~100MB).

 

Site: Search Operator Isn’t True Indicator of All Indexed Pages

The site: search operator isn't a true indicator of all of the pages that are indexed for a site. Use a sitemap file to submit the URLs you care about.
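
As a minimal sketch (the URLs are placeholders), an XML sitemap listing the URLs you care about looks like the following and can be submitted through Search Console or referenced from robots.txt:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
      </url>
      <url>
        <loc>https://www.example.com/key-landing-page/</loc>
      </url>
    </urlset>

The Sitemaps report in Search Console then shows how many of the submitted URLs are indexed, which is a more reliable signal than a site: query.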

 

Google Limits Crawl Frequency of Slow Loading Pages

Google limits how often it crawls slow-loading pages because it doesn't want to spend a long time fetching them, and so as not to put more load on an already struggling server.

Sam Marsden

SEO & Content Manager

Sam Marsden is Lumar's former SEO & Content Manager and currently Head of SEO at Busuu. Sam speaks regularly at marketing conferences, like SMX and BrightonSEO, and is a contributor to industry publications such as Search Engine Journal and State of Digital.
