
BrightonSEO April 2018 – Keynote Recap – John Mueller Q&A

Aleyda Solis & John Mueller during the BrightonSEO April 2018 keynote

BrightonSEO April 2018 ended with a very special keynote. It was especially exciting for the DeepCrawl team, who diligently take notes on all of the Google Webmaster Central Hangouts.

Google’s Webmaster Trends Analyst, John Mueller, stuck around after a full day of meeting SEOs and answering their questions to take part in a Q&A session with Aleyda Solis, one of our brilliant CAB members.

Hey #BrightonSEO folks – it was great talking with you all today! Thanks for a fascinating day!

— John ☆.o(≧▽≦)o.☆ (@JohnMu) April 27, 2018

In case you weren’t able to attend or would like to revisit what was covered in the Q&A, we’ve written it all up in this recap.

If you want to catch up on any of the talks earlier in the day or other BrightonSEO news, take a look at Part 1, Part 2 and Part 3 of our event recaps.


The content from the mobile version of a page will be used for indexing when mobile-first rolls out

John explained that the mobile and desktop versions of a page aren't both indexed side by side, so Google can't compare content differences between them in that way. At the moment, the mobile version of a page is looked at for mobile signals such as interstitials and mobile-friendliness, but it is not indexed; the desktop version is.

Once mobile-first indexing rolls out, only the content on the mobile version of the page will be indexed. So if you have content that appears on desktop but not on mobile, it won't be indexed.


If you want to know more about mobile-first indexing, be sure to read our comprehensive white paper.

Download our mobile-first white paper


Content hidden in tabs won’t be an issue for Google with mobile-first indexing

Having content in expandable tabs provides the best user experience on mobile, so Google wants these elements to be a viable option for site owners. There won't be any disadvantage for anyone who uses them.
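As a hypothetical illustration of what this means in practice, content collapsed behind a native HTML disclosure element is hidden from the user by default but still present in the markup that Google crawls:

```html
<!-- Hypothetical example: the answer text is collapsed by default
     but remains in the HTML that Googlebot fetches. -->
<details>
  <summary>What is mobile-first indexing?</summary>
  <p>Google predominantly uses the mobile version of a page
     for indexing and ranking.</p>
</details>
```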


Low mobile traffic is often caused by the site not being optimised for mobile devices

John gets asked a lot whether a site with low mobile traffic should bother with mobile optimisation. The Google Search Console team even asked him whether they need to make a version of the tool for mobile. During the Q&A he posed the idea that sites see low traffic levels on mobile because they aren’t optimised for mobile.

Some sites naturally receive less traffic on mobile. However, John went on to raise the point that people are using their phones for more and more activities, and he suggested it's a bad idea to disregard mobile optimisation just because you perceive that your audience isn't currently visiting your site on that device.


Google employees still have a good overview of the bigger picture of rankings and algorithms, AI hasn’t taken over

Google takes care to foster an understanding of how search works within the bigger picture for its engineers and employees, as it's important for each team to understand how changes to their specialism (e.g. crawling) will impact the wider teams, and how everything connects together.

It takes teams of humans to assess risks and impacts, as was the case with mobile-first indexing which has been approached cautiously. As John explains:

“It’s not the case that Google is this autonomous machine just working on and improving the algorithms and changing things but nobody really knows what is happening. […] There are lots of smart people involved that do pay attention.”


There are plans to expand Google Search Console to provide more information on aspects like voice queries

Google is planning to expand the information you can currently see in Google Search Console. One possible addition is more information around voice queries, which is in high demand from the SEO community.

Many of the upcoming features will be based around the API, however. Google wants Search Console to be accessible for less experienced users as well, so it doesn't want to surface detail that won't be useful to them. The newer features will have API access so that the expert end of the audience can still get value out of the tool and use that data in the way that best suits them.
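For context on what API access already looks like, query-level data in Search Console can be pulled programmatically today via the Search Analytics endpoint of the v3 API. The sketch below is a minimal, hypothetical example; the property URL, credentials file and date range are placeholders.

```python
# Hypothetical sketch: pulling query data from the Search Console
# Search Analytics endpoint (v3 "webmasters" API) using the official
# google-api-python-client. Property URL, credentials file and dates
# are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # placeholder credentials file
)
service = build("webmasters", "v3", credentials=credentials)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",  # placeholder property
    body={
        "startDate": "2018-04-01",
        "endDate": "2018-04-30",
        "dimensions": ["query", "device"],
        "rowLimit": 100,
    },
).execute()

# Each row's "keys" line up with the requested dimensions.
for row in response.get("rows", []):
    query, device = row["keys"]
    print(query, device, row["clicks"], row["impressions"])
```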


All the old Google Search Console reports will be moved across or consolidated in the new version

All the existing reports from the old Google Search Console will be moved across to the new version, as John mentioned in a previous Webmaster Hangout. Google wants to avoid blindly taking everything from the old UI and dumping it in the new one; a more thoughtful approach is required.

With the new Search Console comes an opportunity to improve and consolidate reports to make the new version as useful and actionable for users as possible. John estimates that this process will take around a year to be completed.


Google uses a variety of data sources when analysing page speed, including real world data

To assess site speed and decide how to treat different sites, Google draws on a variety of data sources when analysing page speed. Real-world data, as found in PageSpeed Insights and the Chrome User Experience Report, is one of these sources.

This is because it is incredibly difficult to find one page speed metric that applies to all sites and user experiences as they are all unique. This is something that Google is still fine-tuning.
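As a point of reference, the real-world field data Google describes can be sampled through the public PageSpeed Insights API, which returns Chrome User Experience Report metrics for a page alongside its lab analysis. A minimal sketch, assuming the current v5 endpoint and a placeholder page URL:

```python
# Hypothetical sketch: sampling real-world (field) metrics for a page
# via the public PageSpeed Insights v5 API. The page URL is a placeholder.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = urllib.parse.urlencode(
    {"url": "https://www.example.com/", "strategy": "mobile"}
)

with urllib.request.urlopen(f"{ENDPOINT}?{params}") as resp:
    data = json.load(resp)

# "loadingExperience" carries Chrome User Experience Report field data;
# each metric has a percentile and a FAST/AVERAGE/SLOW category.
field_metrics = data.get("loadingExperience", {}).get("metrics", {})
for name, stats in field_metrics.items():
    print(name, stats.get("percentile"), stats.get("category"))
```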


Top stories is an organic search feature, you don’t have to have AMP in order to appear there

The top stories feature is determined algorithmically and organically, so you don’t have to have AMP for your site to appear there. This is something John previously covered in a Webmaster Hangout.

A variety of signals are taken into account when deciding which sites to show within top stories; there isn't one single thing you can implement, such as AMP or a meta tag, that guarantees inclusion.


Robots.txt is your biggest weapon when optimising crawl budget

To optimise crawl budget for a website (for example, one with an infinite number of search pages), John recommends using the robots.txt file. This is the most definitive way to communicate to Google not to crawl something. The robots.txt file is also a great solution for steering Googlebot away from resource-intensive areas of your site.
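As a hypothetical illustration (the paths are placeholders), a robots.txt rule that keeps crawlers out of an endless internal search space might look like this:

```
# Hypothetical example: stop crawlers from exploring
# parameterised internal search pages.
User-agent: *
Disallow: /search/
Disallow: /*?q=
```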

For other situations, John recommends using a combination of noindex, nofollow or other traditional methods for optimising crawl budget and conveying what you don’t want to have indexed.
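For individual pages that can still be crawled but shouldn't be indexed, the traditional method here is a robots meta tag in the page's head; a minimal example:

```html
<!-- Hypothetical example: allow the page to be crawled but keep it
     out of the index and tell crawlers not to follow its links. -->
<meta name="robots" content="noindex, nofollow">
```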


The disavow tool is not an admission of guilt, it's simply a technical tool

The disavow tool is sometimes incorrectly referred to as 'an admission of guilt'. This is not the case; it is simply a technical tool. As John explained, disavowing is a great way to deal with manual actions, or to make sure you aren't associated with particular manipulative linking activities that you spot coming into your site, such as a previous SEO having bought links or having got involved with PBNs.
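For reference, a disavow file is simply a plain text list uploaded through Search Console. A minimal hypothetical example, with placeholder domains:

```
# Lines starting with # are comments.
# Disavow a single URL:
http://spammy-blog.example/paid-links.html
# Disavow every link from an entire domain:
domain:link-network.example
```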

John also advises not to perform regular link audits as Google is well-equipped to understand and ignore unnatural linking.


PageRank Sculpting is a waste of time

Google looks at a site overall, so spending time sculpting your internal linking architecture to direct authority to certain pages is, in John's opinion, a waste of time, especially if you are using overly complicated methods. You need to weigh up how much complication you're adding to your process (and potentially your site) against how little value you will get back from this kind of activity.


Google isn’t male or female – people & search engines shouldn’t be put into buckets or be labelled

Towards the end of the Q&A, John gave us some quite profound and inspirational advice! When someone asked whether Google was male or female, John replied that Google can be anything it wants to be and it shouldn’t be labelled or put into a bucket, in the same way that people shouldn’t. He also went on to say that we should all be ourselves. Wow, John… That was beautiful.

I’m not crying, you’re crying.


SEO will stick around for a while yet, but who knows what the future holds as AI continues developing

When asked if he agreed with the adopted slogan of BrightonSEO April 2018, “SEO will never die”, John had us all worried for a moment. He started off hesitantly, posing the question of where AI will take us all in the future.

However, he turned it all around for us and assured us that SEO isn't going anywhere for the foreseeable future. This is because there are aspects of SEO that remain necessary however much technology evolves: the technical side of SEO, the promotional side of organic search, and the way we provide content and help search engines to understand it.


Well said, John! I guess no one’s getting rid of the SEO community just yet.

SEO will never die

Thanks for reading our BrightonSEO April 2018 keynote recap! We hope you enjoyed the insights from John Mueller and the other sessions covered throughout our 4-part event recap series.

If you want to continue learning from the experts about the latest industry developments, make sure you read our webinar recap with Jon Myers and Bastian Grimm on the new era of mobile-first indexing.

View the Mobile-First webinar recap

Rachel Costello

SEO & Content Manager

Rachel Costello is a former Technical SEO & Content Manager at Lumar. You'll most often find her writing and speaking about all things SEO.
