
Google Webmaster Hangout Notes: Friday 26th February


Notes from the Google Webmaster Hangout on 26th February, when John Mueller discusses subdomains, disallow rules, crawl rates, header tags, redirects and more.

 

Subdomains for Multiple Sites

Subdomains are treated either as separate sites or as part of the main domain, depending on the signals.

 

Site Query Results Order Isn’t Meaningful

The order of results for a site: query in Google isn’t meaningful.

 

HTML Sitemap Pages Help Indexing

If you have a complicated website, providing an HTML sitemap page that maps out your category pages can help Google find pages and understand the structure of the site.
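
As a minimal illustration (the URLs and categories below are placeholders, not from the hangout), such a page might simply list links to the category and subcategory pages:

    <ul>
      <li><a href="/mens/">Men's</a>
        <ul>
          <li><a href="/mens/shoes/">Men's Shoes</a></li>
          <li><a href="/mens/jackets/">Men's Jackets</a></li>
        </ul>
      </li>
      <li><a href="/womens/">Women's</a></li>
    </ul>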

 

Multiple Header Tags Are OK

It’s fine to use multiple H1 tags, or any other H tags, on a page. They can help Google to understand the structure of a page. The hierarchy of H tags isn’t important.
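
For example, markup along these lines (purely illustrative headings, not from the hangout) would be fine, even with two H1 tags on the same page:

    <h1>Garden Furniture</h1>
    <h2>Choosing the Right Set</h2>
    <h1>Outdoor Dining Sets</h1>
    <h2>Wooden Sets</h2>
    <h2>Metal Sets</h2>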

 

Keyword Position Affects Weight

The most important keywords should be clearly used in headings, titles and near the top of the page.
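
As a hypothetical example (the product and copy are invented for illustration), a page targeting “running shoes” might surface the phrase in the title, the main heading and the opening copy:

    <!-- in the <head> -->
    <title>Running Shoes for Road and Trail | Example Store</title>

    <!-- near the top of the <body> -->
    <h1>Running Shoes</h1>
    <p>Browse our full range of running shoes for road, trail and track.</p>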

 

HTML Bookmarks Shown in SERPs

Sometimes Google will show HTML bookmarks (e.g. href="#bookmark") in the SERPs next to a page.
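
A bookmark is simply a named fragment within a page, for example (illustrative markup with placeholder names):

    <h2 id="delivery">Delivery Information</h2>

    <!-- linked to from elsewhere on the site as -->
    <a href="/help#delivery">Delivery Information</a>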

 

Multiple Redirects Don’t Lose PageRank

Whilst describing how 301s and 302s are indexed, John seemed to suggest that a redirect chain doesn’t pass less PageRank than a single redirect.
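
For example, in a chain like the following (placeholder URLs), the reading above is that /final-page would receive no less PageRank than if /old-page had redirected to it directly:

    GET /old-page      ->  301, Location: /interim-page
    GET /interim-page  ->  301, Location: /final-page
    GET /final-page    ->  200 OK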

 

302 Becomes Permanent When the Target Gains Authority

If you use a 302 redirect, once the target page has other authority signals, the redirect gets treated like a 301.

 

Use Separate Pages per Language

It’s best to use a separate page for each language rather than combining multiple languages on a single page.
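
One common way to connect the separate language versions (not something covered in the hangout itself, shown purely as an illustration with placeholder URLs) is hreflang annotations in the head of each page:

    <link rel="alternate" hreflang="en" href="https://example.com/en/pricing/" />
    <link rel="alternate" hreflang="de" href="https://example.com/de/preise/" />
    <link rel="alternate" hreflang="x-default" href="https://example.com/en/pricing/" />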

 

Crawl Rate Doesn’t Affect Rankings

Assuming Google is able to pick up your content, there’s no ranking benefit for a page being crawled more frequently.

 

URL Path Structure Isn’t Important

You don’t need to keep a consistent URL architecture, include a detailed hierarchy in the URL, or have a page for every level of the path.

Google follows URLs that are linked and won’t try other URLs with different paths derived from the URL.

 

Disallow Rule Must Start with a Slash

If you’re specifying a path in the robots.txt file, it must start with a slash, not a * wildcard.

This was always true, but was only recently added to the documentation and Search Console testing tool.
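
For example (an illustrative robots.txt with placeholder paths):

    User-agent: *
    # Valid: the path starts with a slash
    Disallow: /private/
    # Flagged by the testing tool: the rule starts with a * wildcard rather than a slash
    Disallow: *private*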

Sam Marsden

SEO & Content Manager

Sam Marsden is Lumar's former SEO & Content Manager and currently Head of SEO at Busuu. Sam speaks regularly at marketing conferences, like SMX and BrightonSEO, and is a contributor to industry publications such as Search Engine Journal and State of Digital.
