
SEO Office Hours: Late January Session Recaps & Key Takeaways

Key takeaways from recent Google Search Central "Office Hours" sessions, covering rich snippets, FAQ schema, geolocation, disavow files, AMP pages, and more.


Keeping up with the latest in SEO best practices and SEO news is important in our ever-changing discipline. Read the top takeaways from the latest SEO Office Hours sessions from Google Search Central that took place in January 2022. (You can also browse our full Office Hours library to get tips relating to the topics that interest you most.)


Where internal links are placed on a page makes little difference to how Google values them

There’s no quantifiable difference in how Google views internal links located on different parts of a page. For example, Google doesn’t give significantly more value to internal links placed high up in your content than to links within the page footer.

It’s different when it comes to the overall placement of content on a page, because Google is trying to figure out which parts of a page’s content are uniquely important, but the location of links on a page does not make much of a difference in terms of how that internal linking is valued.


It’s not possible to control which images appear in rich snippets in the SERPs

It seems that images are increasingly being used in rich snippets shown in the SERPs, but there’s currently no way to tell Google which images are preferred for this purpose. The only option is to use the ‘noimageindex’ robots meta tag on pages containing images that you definitely don’t want to appear in snippets, but note that this directive will prevent those images from being indexed entirely.
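To illustrate, ‘noimageindex’ is applied at the page level rather than to individual images. A minimal sketch of the tag in a page’s <head>:

<!-- Tells Google not to index any images that appear on this page -->
<meta name="robots" content="noimageindex">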


How to use 503 HTTP status codes to prevent pages from being dropped from Google’s index if your site is temporarily down

One user described seeing a loss of pages from the index after a technical issue caused their website to be down for around 14 hours.

John suggests that the best way to safeguard your site against outages like this is to set up a 503 rule that’s ready for when things go wrong. That way, Google will see that the issue is temporary and will come back later to check whether it’s been resolved. Returning a 404 or another error status code instead means that Google could interpret the outage as pages being removed permanently, which is why some pages drop out of the index so quickly when a site is temporarily down.
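As a rough sketch, a temporary outage response could look like the following (the Retry-After value, in seconds, is illustrative; it hints to crawlers when to come back):

HTTP/1.1 503 Service Unavailable
Retry-After: 3600
Content-Type: text/html

<html><body>We are temporarily down for maintenance. Please check back soon.</body></html>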


Removing AMP pages requires caution but shouldn’t have a direct impact on SEO

John confirmed that, as AMP is not a ranking factor, removing AMP pages shouldn’t have a direct impact on SEO. However, users should consider the potential knock-on effects of a drop in page speed (AMP pages tend to be faster than non-AMP ones, though that’s not guaranteed, and there’s nothing to stop you from making regular pages just as fast). AMP pages also tend to be crawled a little more heavily than non-AMP pages, so after removing them you might see a rise in crawl activity across the rest of the site.
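For context on what removal involves, an AMP version is typically paired with its canonical page via link elements like these (the URLs here are hypothetical); cleanly removing AMP usually means removing the amphtml reference and redirecting the old AMP URLs to their canonical equivalents:

<!-- On the regular (canonical) page, pointing to its AMP version -->
<link rel="amphtml" href="https://www.example.com/article/amp/">

<!-- On the AMP page, pointing back to the canonical version -->
<link rel="canonical" href="https://www.example.com/article/">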


Noindexing pages with geolocation IP redirection is not ideal for SEO

One user asked about using geolocation-based IP redirection in conjunction with noindex. The example involved separate pages targeted at users in multiple locations, with noindex tags used to ensure that just one of them is indexed.

John raised the point that Google typically crawls from one location (mostly using a Californian IP address). If IP-based redirection sends Googlebot to one of the URLs you’ve set to noindex, those pages might not be indexed at all, so this approach isn’t recommended. Instead, you should focus on making location-specific content easy to find once the user has landed on the site.
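To illustrate the risk (the URLs and markup here are hypothetical): if visitors from US IP addresses are redirected to an /en-us/ page variant, and that variant carries a noindex directive, then Googlebot, crawling mostly from California, may only ever see the noindexed variant:

<!-- On the /en-us/ variant that US IPs, including Googlebot’s, are redirected to -->
<meta name="robots" content="noindex">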


A disavow file isn’t necessary for most websites

One user asked about managing the size of their site’s disavow file. Only links that could make a user (or Google) think they’ve been paid for belong in a disavow file, so not every spammy or low-quality link pointing to your site needs to be included. John suggested that a disavow file isn’t necessary for most websites, and that having one could cause more problems than it solves.
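For reference, a disavow file is a plain text file uploaded via Google’s disavow links tool, with one URL or domain per line and comments prefixed with # (the domains below are hypothetical):

# Links that look paid-for
domain:spammy-link-network.example
https://another-site.example/paid-links-page.html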


The size of your robots.txt file has no direct impact on SEO

John confirmed that the size of a website’s robots.txt file has no direct impact on SEO. He did, however, point out that larger files can be more difficult to maintain, which may in turn make it harder to spot errors when they arise.

Keeping your robots.txt file to a manageable size is therefore recommended where possible. John also stated that there’s no SEO benefit to linking to sitemaps from robots.txt. As long as Google can find them, it’s perfectly fine to just submit your sitemaps in GSC (although we should caveat that linking to sitemaps from robots.txt is a good way to ensure that other search engines and crawlers can find them).
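For illustration, referencing a sitemap from robots.txt takes a single Sitemap directive (the URL and rules below are hypothetical):

User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml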


FAQ schema can be used on select questions

It’s possible to pick and choose which elements on a page to mark up with structured data/schema for rich snippets in the SERPs. For example, not every question on an FAQ page needs to be marked up with FAQ schema if you see no value in doing so. (John did mention later on that, for FAQ schema to be valid, the question must be visible on the page.)
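As a sketch, marking up a single visible question with FAQPage structured data might look like this (the question and answer text are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "How long does delivery take?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Orders typically arrive within 3 to 5 working days."
    }
  }]
}
</script>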


About this blog series: Our team here at Lumar (formerly Deepcrawl) regularly attends Google Search Central’s SEO Office Hours events, and we share our notes here to help SEOs and marketers stay informed on the latest developments in organic search and find up-to-date, straight-from-the-source advice from Google’s search experts.

Natalie Stubbs

Senior Technical SEO at Lumar

Natalie is a Senior Technical SEO at Lumar and part of our Professional Services team. A fan of all things content-related, she has a passion for helping clients improve their technical SEO by making complex concepts more accessible. Outside of work, you'll usually find her spending quality time with her cat.
