The Author
Sam Marsden

Sam Marsden is DeepCrawl's SEO & Content Manager. Sam speaks regularly at marketing conferences, like BrightonSEO, and is a contributor to industry publications such as Search Engine Journal and State of Digital.


After the success of last year’s Q&A with John Mueller, this April’s BrightonSEO saw John return to the main stage to provide some enlightening answers to questions from the always entertaining Hannah Smith of Verve Search.

These sessions with John are always a real treat for the DeepCrawl team, as we diligently take notes on all of the Google Webmaster Central Hangouts, so it was our pleasure to write up John and Hannah’s Q&A in this recap.

If you want to catch up on any of the talks from earlier in the day, take a look at Part 1 and Part 2 of our BrightonSEO April 2019 event recaps.

What are your responsibilities as a Webmaster Trends Analyst?

John and his team try to connect Google’s internal search teams with people externally who are making great websites. The team tries to make sure that information flows in both directions so that Google’s updates and developments are communicated quickly, but also so that their engineers receive valuable feedback. Google engineers tend to focus on their own projects and don’t engage with the SEO community, so John and the team are responsible for bridging that gap and guiding information back and forth.

What does success look like for John and the team?

John’s personal success metric is not to get fired, which he has managed so far. On a serious note, John wants to make sure that webmasters, publishers, and digital content providers are happy with the deal they have with Google, where Google crawls their websites and sends them traffic in return. John and the team have come up with some creative metrics to gauge how satisfied webmasters are with Google, but satisfaction isn’t something that can really be measured.

Is there anything in your role that you’d like to do differently?

It’s tricky because John and the team try to be as direct as possible and work with the internal search teams as closely as they can so that they can inform webmasters about changes ahead of time.

One area where the team has struggled is being in the right place at the right time internally, so that when changes are planned they can be communicated externally.

Are there enough people at Google with the same responsibilities as John?

John thinks there are definitely not enough people on his team, and he would love to have more people at Google talking with webmasters and publishers, as there are so many questions out there.

John was speaking with BrightonSEO attendees all day, and they all had unique questions; the Webmaster Central Hangouts continue to raise new questions as well. John is hoping that they can get more people on the team talking with the community.

How do you think your role might change a year from now?

It’s difficult to say because it depends on what is happening internally at Google. One shift John is noticing is an increasing split between technical and non-technical SEOs. He doesn’t think that either of these areas of SEO is more or less valuable, but he has noticed that he is now talking to more and more front-end developers as well as SEOs.

On the one hand, you have some SEOs that think it isn’t their job to work with developers, but on the other hand, there are so many developers making basic SEO mistakes that John thinks it is important for Google to help SEOs bridge that gap. John foresees that they may split out their work and resources between technical SEO and online marketing going forward.

What do Google’s internal PR team think about what you do?

John thinks they have a difficult role because PR teams at larger companies like to prepare messaging in advance, with pre-prepared answers ready for when someone asks a particular question. However, John and the team are often in situations where they are asked about something new that has just come out and they need to provide some sort of answer.

Sometimes the answer that John and co. give isn’t exactly the same as what the PR team has prepared, or would have prepared. There are times when they don’t know the exact answer, or don’t know how to frame an answer in a way the PR team would be OK with. This results in some interesting email exchanges. However, John thinks they work fairly well together on the whole.

What are the search quality and engineering teams trying to achieve and how is that measured?

This is something that changes over time; there is no one clear metric. The metrics used to measure success change along with user expectations.

The main goal of search is to make information accessible, useful and relevant. For regions or languages where there is little content, measuring whether or not Google even has an answer could be a metric. If a suitable answer can’t be provided then that is seen as a failure.

The next step on from providing any answer is making sure answers are correct. This is particularly important for medical content because it is critical that Google is providing the correct answer.

How do you know if Google has done a good job of ranking results for transactional queries?

A lot of the time they don’t know, as there is no one right answer and there are multiple answers that could be the first result. In cases like that, it’s important for Google to show those 10 blue links on the search results page, because Google doesn’t completely know what the searcher is looking for.

John doesn’t think they would use user signals from SERPs to directly rank pages. Google does leverage user signals to evaluate its search algorithms across large sets of queries. However, user signals aren’t used for a single query or a handful of queries, because they could point in so many different directions.

What’s Google’s view of people’s complaints that on-SERP features like featured snippets are stealing clicks from websites?

The web evolves and so do users’ expectations. If your website is built on the model of “I have this one line of text on a page and I want visitors to click on an ad and leave”, it might be that over time Google realises there is an objective answer it can show directly in search, e.g. “what is the speed of light?”

On-SERP features have value for the websites that are featured in them because they highlight the website and they send traffic. The information shown within these on-SERP features isn’t always complete and a searcher can go to the webpage for more information.

There are a lot of instances where featured snippets provide the searcher with information that they would have got from the site anyway. For example, a business doesn’t need a pageview to communicate its opening hours and make searchers happy. Google isn’t taking anything away in situations like this; it is just helping the searcher get to the information a bit faster.

On 21st March, Google announced that rel="next" and rel="prev" were being retired. Can you tell us what happened there?

Google stopped using rel="next" and rel="prev" annotations a while ago, but this wasn’t flagged internally. Recently someone asked why these annotations weren’t working, so John and the team looked into it and learnt that they hadn’t been used for some time. That was kind of awkward.

John and the team decided to put out a blog post explaining that the annotation is no longer used, even though it wasn’t an announcement as such, and removed the documentation.

Between meetings, John put out a short tweet explaining the development, and Gary Illyes followed up with him, saying that John hadn’t communicated this in the best way.

John thinks he could have dealt with the change a bit better, but they want to make it clear that there is no need to panic, and if you don’t remove these annotations your website won’t explode. They also understand that it’s awkward for the community as people have spent a lot of time implementing these annotations on pages.
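For reference, these pagination annotations were typically placed in the head of each page in a paginated series, along these lines (the URLs here are purely illustrative):

```html
<!-- On page 2 of a hypothetical paginated series (URLs are illustrative) -->
<head>
  <link rel="prev" href="https://example.com/articles?page=1">
  <link rel="next" href="https://example.com/articles?page=3">
</head>
```

As John notes, leaving markup like this in place does no harm; Google simply no longer uses it as an indexing signal.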

Many news sites are now nofollowing external links on a section-wide and even sitewide basis. How will this impact the link graph?

This definitely does impact Google’s link graph and its ability to pick up new and fresh content, which is a shame and isn’t necessary. News sites could go about this in a way that makes more sense for the web, where authors research a topic and properly link to their sources.

Looking at it from the other side, though, John understands that news sites might see it as a hassle to deal with all of the SEOs messaging them to add links, and why they might decide to nofollow all external links. It is a bit shortsighted, though, because links play a role in finding useful content on the web.
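For context, nofollowing a link on a section-wide or sitewide basis just means adding the nofollow value to every external link’s rel attribute, which tells crawlers not to pass link signals through it (the URL and anchor text below are illustrative):

```html
<!-- An external link marked nofollow: crawlers are asked not to
     pass link signals (e.g. PageRank) through this link -->
<a href="https://example.com/source-study" rel="nofollow">the original study</a>
```

Applied to every outbound link on a site, this is what removes those links from Google’s link graph.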

What happened with the technical issue last week that caused many pages to be removed from Google’s index?

It was purely a technical issue on Google’s side and wasn’t caused by an update. There was a technical issue impacting the index, and then while fixing it something else broke. Google tries to make sure that when something breaks it isn’t in a visible way, but sometimes it is.

John confirmed that the issue has now been fixed, but if a site’s URLs still aren’t indexed then you can submit them to the index in Search Console on a page-by-page basis or submit groups of URLs via sitemaps.
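As a rough sketch, a group of URLs can be submitted via a standard XML sitemap referenced in Search Console; a minimal file might look like this (URLs and dates are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/page-1</loc>
    <lastmod>2019-04-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/page-2</loc>
  </url>
</urlset>
```

For individual pages, the URL Inspection tool in Search Console offers a page-by-page “request indexing” option instead.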

Side note: Google’s internal alert system is named “OMG :)”.

What challenges is Google facing when it comes to reporting on voice search?

Voice search is a really tricky topic because there are so many things that could be encompassed underneath it. Do voice commands like setting a timer count as a voice search? Does tapping on the microphone button on your phone and asking a question count as a voice search? If you don’t know exactly what you want to measure then you won’t get numbers that are useful.

John recommends buying an assistant device to get an understanding of voice interactions and their limitations, so you can see where your site might fit into this. It might be that your content covers a topic which is complex enough that it can’t be answered by an assistant and requires you to read a webpage to understand what it’s about.

John doesn’t know if they can capture all voice searches, but questions what you would be able to do with that information anyway. How would this change how you optimise webpages? John believes that as long as your site’s content is useful and accessible for search then you won’t need to do anything special for voice search.


Thank you again to John and Hannah for putting on such a fun and informative session. If you want to hear more Google insights straight from John Mueller himself, then make sure you subscribe to DeepCrawl’s newsletter where we’ll send you the key takeaways of the latest Google Webmaster Hangouts whenever they happen, straight to your inbox.
