Webinar Recap: The Future of SEO with John Mueller & Ashley Berman Hale
For our last webinar of the year, a momentous crossover event occurred. DeepCrawl’s Google Webmaster Hangout notes and monthly webinar series collided and combined in the most wonderful of ways. John Mueller himself joined Ashley Berman Hale live for a webinar to talk about the future of SEO and answer questions from our audience.
It is perfectly understandable that anyone who missed the webinar live might be very disappointed, but don’t despair. We recorded the full session with John and Ashley and took notes on the key points which we’ve included in this blog post, so you won’t miss a thing.
You can watch the full recording here:
We’d like to say a huge thank you to John for taking the time out of his busy schedule to join us, to Ashley for hosting the session, and to everyone who joined us. What a way to close out a big year in search! We’ll see you again in January for more webinars, guests and search insights.
Here are the different topics that John covered, if you’d like to dive straight into a particular category:
- John’s Personal Opinions
- Duplicate Content
- Google Search Console & Google Tools
- Structured Data & Markup
- Voice Search
- Local SEO
- HTTP Headers
- Site Speed
John’s Personal Opinions
Which SEO topics do you like talking about most?
The technical topics because they usually have something that you can easily test to identify issues. Quality topics are always more subjective, meaning it’s harder to say what’s going right or wrong.
What would you like SEOs to focus on more in 2019?
The main topics that Google will be focusing on in 2019 are:
- Mobile-first indexing
- Structured data
- The new Google Search Console
Themes @johnmu says @Google is focused on in 2019:
– Mobile-first indexing
– Ways to implement and test structured data
– Search Console re-vamp @DeepCrawl webinar hosted by @BermanHale WOOT WOOT!
— James Wirth (@jameswirth) December 12, 2018
. @JohnMu says SEO’s need to have a better working relationship with developers. Speaking their language, understand their needs, requirements and work more closely with them to accomplish the goals! Happening now on the @DeepCrawl webinar with @BermanHale
— Jennifer Hoffman (@_JHoff) December 12, 2018
What are the most commonly overlooked areas of SEO?
Ashley thinks that internal linking and canonicalisation are the fundamentals that are most often overlooked.
John explained that he often sees websites that look nice, but where it's very hard to tell what they're trying to sell just by looking at them. This is one of the biggest SEO issues John sees. Work out what you want your website to rank for and what its purpose is, and make sure your users can easily recognise that. Take the time to understand what the market is, where your website fits into it and what it realistically has to offer. A site shouldn't just be technically sound; it also needs to include the content it's trying to rank for.
What should SEOs be focusing on less?
For the most part, SEOs are doing things well. John said that he likes seeing technical tools like DeepCrawl that let people crawl their websites to see how search engines actually see them.
What are your personal goals for communicating with SEOs for 2019?
Google is aiming to communicate more with the industry and hopes to launch more video content, including a short answer series and a topical video series. The overall goal is to communicate more to make sure a larger number of people with specific questions get better answers.
If you were in charge of Search, what would be the first thing you would change & why?
If John was in charge, he would want to make everyone move to Europe to get rid of time zones. This would make his life so much easier and it would be a lot faster for the teams to communicate and solve people’s problems.
John wouldn’t change anything about the algorithms because a lot of really smart people are working on them and doing awesome work.
Duplicate Content
How does Google handle canonicalised duplicate pages?
If pages are actual duplicates then canonical tags should work fine. If the content is similar but not duplicate, Google may decide that the canonical tag is incorrect and index both pages separately. Google also still needs to crawl both pages even if they're canonicalised, so adding canonical tags to duplicate pages doesn't solve crawl budget issues. Bear in mind, however, that how much Google is able to crawl generally doesn't affect rankings, especially for smaller sites.
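As a quick way to audit whether your duplicate pages actually declare the same canonical, a small sketch like this can extract the rel=canonical from each page's HTML. It uses only Python's standard library; the example.com URL and sample HTML are illustrative placeholders, not from the webinar.

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag it encounters."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

def get_canonical(html):
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical

# Example: a duplicate page pointing at its canonical URL.
page = """<html><head>
<link rel="canonical" href="https://example.com/products/widget">
</head><body>Widget</body></html>"""

print(get_canonical(page))  # https://example.com/products/widget
```

Running this across a set of suspected duplicates makes it easy to spot pages whose canonicals disagree, which is exactly the "similar but not duplicate" situation John describes Google second-guessing.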
Google Search Console & Google Tools
Can you explain the discrepancies in data between Google’s different tools?
Some of these tools, by design, track and collect completely different data, such as Google Search Console and Google Analytics. We can expect some of the data within these tools to merge in time, however. For example, the new version of Google Search Console will combine more reports and data.
Which reports won’t make it into the new Google Search Console?
These discussions are still ongoing within Google's teams. They need to work out which ones are less useful and require improvement. The preferred domain setting, for example, doesn't make much sense to John, as there should be a better way of setting preferences for Google.
How important are user-declared canonicals & how can Google report on these better?
Google’s main focus is moving the existing functionality across from the old Search Console to the new one, so new reports and functionality won’t be added just yet.
John explained that he is always torn on canonicalisation because site owners should know about it in general, but if Google picks a different canonical, the pages will still rank exactly the same, especially with normal, small websites.
For Google Search Console moving forwards, there will be more of a focus on presenting issues that are actually causing problems for a website, rather than showing every possible issue as this isn’t always helpful. This is to help SEOs prioritise the data we need to act on.
Excited to hear @SearchConsole is leaning more towards providing data & dashboards with clear “act here” information rather than soft indicators of things to fix i.e. hundreds of 404 pages. #progress #SEO @DeepCrawl #webinar @JohnMu
— Holly Miller | Marketing Strategy Consultant (@millertime_baby) December 12, 2018
Will Google Search Console include more reports on rich results, videos & voice search?
John said that he could imagine this happening. These are areas where the data doesn’t explicitly need to be hidden, but Google needs to make sure it is useful for website owners. Voice search data, for example, isn’t really that useful yet because not many people are using it in a meaningful way for SEO.
If you were a publisher & could only access one Google Search Console report, which would you choose?
The Performance report.
Will there be more integrations with other platforms like WordPress?
Yes, there will be more. The idea is to bring more Google platform data to the places where people are creating content. SEOs and digital marketers are doing things differently nowadays and not everyone uses tools like Google Search Console, so it makes sense to bring the data to them.
Will we see more collaboration & cohesion between AdSense and Search recommendations?
There is a strict separation between the search and ad sides to make sure that search stays as neutral as possible. People working on the ad side have to look at public documentation like any regular user; they don't get any special treatment just because they also work within Google.
Structured Data & Markup
What will structured data look like in 2019?
Structured data is used in two main ways. Firstly, for specific SERP features. It's hard to say which particular SERP features are coming up, so it's equally hard to say which kinds of structured data will be most prominent in 2019.
Secondly, Google uses structured data to better understand websites and their content. Google will continue working on communicating the importance of structured data over time.
Website owners should focus on marking up the primary information they want to get across, rather than adding every different kind of markup, as this takes time and may not result in ROI. Also, prepare for upcoming structured data developments by implementing markup now to get ahead of what Google will launch.
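To illustrate marking up only the primary information, here is a hedged sketch that builds a minimal schema.org Product JSON-LD block with just the core details (name, image, offer) rather than every optional property. The product name, URL and prices are invented for the example.

```python
import json

# Minimal JSON-LD: only the primary product details, not every
# optional schema.org property. All values are illustrative.
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "image": "https://example.com/images/widget.jpg",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
    },
}

# Wrap the markup in the script tag that would be embedded in the page.
script_tag = (
    '<script type="application/ld+json">'
    + json.dumps(product_markup)
    + "</script>"
)
print(script_tag)
```

Starting from a small, correct block like this and extending it only when a property serves a clear purpose mirrors John's advice about prioritising the information you actually want to get across.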
What is the best practice for images within product markup?
You don't need a 50-megapixel image of every single product. Make sure the maximum image sizes are limited to what a user would need.
Does Google globally support speakable markup & can we get more information on global releases?
A lot of features roll out in the US in English first, as this makes it easiest for the Google teams to analyse data quality and fine-tune their algorithms. Once this has been done, they can roll the features out in other countries. However, local policies and laws in certain countries can make it harder to roll out new features.
The teams at Google have discussed providing more information on what’s available in which countries with regard to structured data. More information will be added to the developer documentation around this and, in future, an attribute will be added to different structured data types specifying whether they’re globally supported or not.
Google also wants to encourage people to add markup even if it isn't being used in their country yet, as when support does arrive, your website will be one of the first to be shown for new features.
Voice Search
How should SEOs talk to their clients about voice search?
Home assistants are becoming more popular, but voice search is still fairly new and it's rare that people ask something where a specific search result is read out to them. Features like the recently launched Q&A markup are heading in the direction of optimising for voice search, so this could be worth working on if you have Q&A content and want to invest in voice search. In general, not every website will need to optimise for voice.
Does the value of inbound links depreciate over time?
If you work on creating links to a website, you'll probably end up with links that are tied to a point in time, particularly if you've run a particular PR or news campaign. This will drive traffic initially, but news articles come and go. Over time, news articles will more or less disappear and become less relevant, meaning the value of their outbound links will decrease too. A link from an old news article won't be as useful as one from a website which is recommending you in a more prominent place within their site's architecture. There isn't any kind of timeout for the value of links over time though.
Local SEO
How should publishers handle sites that have services in multiple cities?
John recommends having information about these locations listed on each of their respective pages. Also, make sure you have Google My Business listings for each of them so this content can be shown for each location. Don't place 'near me' in your titles, as this tactic is becoming obsolete.
Ashley explained that the Google My Business portal has become much more helpful recently, but you need to log in regularly and manage it proactively, because new features are being launched all the time and you need to stay up to date.
If a website has to use geographic redirection, what’s the best way to handle this for Google?
John recommends treating Googlebot like a regular user from the same region. If you’re sending users from the US to US content, then send Googlebot there too. Be aware that Google may not be able to pick up any of the copy that’s targeted to locations outside of the US though. If you’re based in Europe and have to block the US, then you’re at risk of none of your content being indexed. Try and make at least some of this content available and indexable for Google in the US. Start with information that is generic enough that it doesn’t have any legal implications from being made available, and make sure this content provides enough value for users.
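The routing logic John describes can be sketched roughly as follows: Googlebot, which crawls mostly from US IP addresses, is treated exactly like a regular US visitor, while generic, legally safe content stays available elsewhere. The function names, paths and the GeoIP lookup here are hypothetical placeholders, not part of any real serving stack.

```python
def lookup_geoip(ip_address):
    # Stub for a real GeoIP database lookup; the mapping is invented
    # for this sketch (203.0.113.0/24 is a documentation IP range).
    return {"203.0.113.5": "DE"}.get(ip_address, "US")

def region_for(ip_address, user_agent):
    """Resolve a visitor's region, treating Googlebot like a US visitor."""
    if "Googlebot" in user_agent:
        return "US"  # Googlebot generally crawls from US IPs
    return lookup_geoip(ip_address)

def route(ip_address, user_agent):
    """Pick the content path for this visitor's region."""
    region = region_for(ip_address, user_agent)
    if region == "US":
        return "/us/"  # US content: this is what Googlebot will see
    return "/intl/generic/"  # generic, indexable content for others

print(route("66.249.66.1", "Mozilla/5.0 (compatible; Googlebot/2.1)"))  # /us/
```

The key property of this sketch is that Googlebot never gets a special code path of its own: it simply falls into the same bucket as the US users it resembles, which is the consistency John recommends.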
There is no fixed time for this.
Yes, you’ll see these guidelines changing over time. This will happen naturally as the web continues evolving.
HTTP Headers
Are last-modified headers still taken into account?
Does Google crawl using the HTTP/2 protocol?
John doesn’t believe so, but from a practical point of view, this doesn’t make a difference for search because Googlebot accesses pages like any normal browser.
How important is it to analyse log files for SEO?
This depends on the size of your website. This isn’t very important for small sites, but people managing large websites should look at log files.
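As a concrete starting point for this kind of analysis, a short sketch like the one below counts Googlebot requests per URL from access-log lines in Common Log Format. The sample log lines and IPs are invented for illustration, and a real audit would also verify Googlebot by reverse DNS rather than trusting the user agent string.

```python
import re
from collections import Counter

# Invented sample access-log lines in Common Log Format.
LOG_LINES = [
    '66.249.66.1 - - [12/Dec/2018:10:00:00 +0000] "GET /products HTTP/1.1" 200 5120 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [12/Dec/2018:10:00:05 +0000] "GET /old-page HTTP/1.1" 404 320 "-" "Googlebot/2.1"',
    '198.51.100.7 - - [12/Dec/2018:10:00:09 +0000] "GET /products HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

# Extract the request path from the quoted request line.
request_re = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+"')

googlebot_hits = Counter()
for line in LOG_LINES:
    if "Googlebot" in line:  # naive UA check; verify via DNS in practice
        match = request_re.search(line)
        if match:
            googlebot_hits[match.group(1)] += 1

print(googlebot_hits.most_common())
```

Tallies like these quickly surface where crawl budget is going, for example, Googlebot repeatedly hitting 404s or low-value URLs.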
Ashley strongly recommends diving into log file analysis. To do this with DeepCrawl, for example, you can utilise our log file integrations to layer your website crawl data with log file data for useful insights.
Site Speed
How will Google better measure runtime rather than just load time?
Google doesn't know what to click on or interact with on a page, so it can't measure runtime in this manner. John recommends looking at pages per session and conversion metrics to judge how users are able to interact with your website once it has loaded, as this will give an indication of user experience and how many people stick around after load time.
Is page speed ranking analysed for a site as a whole or on a page-by-page basis?
Google analyses both, in that it tries to assess the state of a website overall, but it can also be more granular and treat parts of a site differently when they are seen as separate sections. John suggests identifying slow hot spots on your site where most of your users are landing, and fixing those as a priority.
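One simple way to act on that suggestion is to rank pages by traffic multiplied by how far each page exceeds a speed budget, so the slowest high-traffic pages get fixed first. The page data, session counts and the 2-second budget below are invented assumptions for the sketch, not figures from the webinar.

```python
# Invented sample data: landing pages with sessions and load times.
pages = [
    {"url": "/", "sessions": 12000, "load_time_s": 1.8},
    {"url": "/products", "sessions": 9000, "load_time_s": 4.2},
    {"url": "/blog/post-1", "sessions": 400, "load_time_s": 6.5},
]

SPEED_BUDGET_S = 2.0  # assumed target load time

def priority(page):
    # Impact = traffic x how far the page exceeds the speed budget.
    # Pages within budget score zero and drop to the bottom.
    return page["sessions"] * max(0.0, page["load_time_s"] - SPEED_BUDGET_S)

hot_spots = sorted(pages, key=priority, reverse=True)
for page in hot_spots:
    print(page["url"], round(priority(page)))
```

In this example, the busy but moderately slow /products page outranks the very slow but low-traffic blog post, which is the "hot spot" prioritisation John describes.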
What’s the best way to improve site speed?
John really likes how web.dev integrates Lighthouse audits to track site performance. This is still a fairly new feature, but it's a nice place to get this kind of information. The animation of how websites load in webpagetest.org is also very useful, as it visibly demonstrates load time and how it affects users. It will show you whether the primary content loads first, how much useful content is displayed above the fold for users, and how quickly.
How do you see AMP & mobile sites being further integrated in 2019?
We'll probably see more sites that are pure AMP, or parts of sites that are pure AMP, because it's easier to have just one URL from a website management perspective, rather than maintaining separate AMP and mobile pages. John also sees the AMP teams moving towards making this technology available for regular mobile web pages.
John’s closing thoughts
When asked for his closing thoughts to wrap up the webinar, John thanked the DeepCrawl team for taking the time to take notes on all of the Google Office Hour Hangouts. He said that the team at Google really appreciates this because our work helps to make sure that the most important points don’t get lost for website owners.
You’re very welcome, John! We’re more than happy to help out and make sure as many people can make use of the points from your hangouts as possible.
More insights from John Mueller in our Webmaster Hangout notes
If you want to hear more Google insights straight from John Mueller himself, then make sure you subscribe to DeepCrawl’s newsletter where we’ll send you the key takeaways of the latest Google Webmaster Hangouts whenever they happen, straight to your inbox.