SEO Office Hours: Recent Session Recaps & Key Takeaways
We’re kicking off the new year with a new set of SEO Office Hours recaps! Read on for our latest top takeaways from recent Google Search Central SEO Office Hours sessions, or browse our full SEO Office Hours library for the topics that interest you most.
About this blog series: Our team here at Deepcrawl regularly attends Google Search Central’s SEO Office Hours events. To help SEOs and marketers stay informed on the latest developments in organic search, and to share up-to-date, straight-from-the-source advice from Google’s search experts, we’re publishing our notes.
Server location is not used for geotargeting
John confirmed that Google doesn’t use server location for geotargeting. Asked whether server type and location matter, John said there may be a small speed benefit to hosting your server closer to your users, but it’s unlikely to be much of an issue if you have a Content Delivery Network (CDN) in place.
Language is evaluated on a per-page basis for SEO
Does an entire website need to be translated to rank well in an alternate language? John responded to a question about whether it would be OK to translate only some pages of a website rather than the entire site. He answered that language is evaluated on a per-page basis rather than across whole sections of a website, so this approach is fine. He recommended making sure internal links point to the translated pages so that they can be found.
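Beyond plain internal links, a common complement (not mentioned in the session) is marking up each translated page with hreflang alternates. A minimal sketch, assuming hypothetical URLs and language codes:

```python
def hreflang_tags(alternates):
    """Generate <link rel="alternate"> tags for a page's translations.

    alternates: dict mapping a language code to that version's
    absolute URL. All values here are hypothetical examples.
    """
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(alternates.items())
    ]

tags = hreflang_tags({
    "en": "https://example.com/pricing",
    "de": "https://example.com/de/preise",
})
```

Each translated page would carry the full set of tags (including a self-reference), so crawlers can discover every alternate from any one of them.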
Blocking Googlebot in robots.txt does not affect AdsBot
A participant found that Googlebot was crawling their ad landing pages more than their normal pages, and asked whether they could block Googlebot via robots.txt and whether doing so would impact their ad pages. John responded that blocking the ad landing pages for Googlebot is fine, but warned not to block AdsBot, as it performs quality checks on those landing pages. He clarified that AdsBot doesn’t follow the generic robots.txt directives: to block it, its specific user-agent must be named explicitly in the robots.txt file. So blocking only Googlebot, as suggested, would leave AdsBot with access to those landing pages.
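This can be illustrated with Python’s standard robots.txt parser. The paths below are hypothetical; note that `urllib.robotparser` only models standard user-agent matching, not Google’s extra AdsBot behavior of also ignoring the `User-agent: *` group, but it does show that a Googlebot-only rule never applies to AdsBot-Google:

```python
import urllib.robotparser

# Hypothetical robots.txt that blocks only Googlebot from the
# ad landing pages. AdsBot-Google is not named, so it keeps access.
rules = """\
User-agent: Googlebot
Disallow: /landing/
"""
rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/landing/offer"))      # False
print(rp.can_fetch("AdsBot-Google", "https://example.com/landing/offer"))  # True

# To actually block AdsBot, its user-agent must be named explicitly:
strict = """\
User-agent: AdsBot-Google
Disallow: /landing/
"""
rp2 = urllib.robotparser.RobotFileParser()
rp2.parse(strict.splitlines())
print(rp2.can_fetch("AdsBot-Google", "https://example.com/landing/offer"))  # False
```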
Crawl rate is not affected by a large number of 304 responses
A question was asked about whether a large number of 304 responses could affect crawling. John replied that a 304 tells Googlebot a page hasn’t changed, so it can reuse its cached copy and spend that request crawling something else on the website; it does not affect the crawl budget. If most pages on a website return a 304, the crawl rate would not be reduced; the focus would simply shift to the pages of the website where Google sees updates happening.
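For context, a 304 is the server’s answer to a conditional request. A minimal sketch of the server-side decision, assuming ETag-based validation (Last-Modified works analogously):

```python
def conditional_status(request_headers, current_etag):
    """Return 304 when the client's cached copy is still fresh.

    A crawler revalidating a page sends its stored validator in
    If-None-Match; if it still matches the page's current ETag,
    the server answers 304 with no body, and the crawler can reuse
    its cached copy and spend the saved effort elsewhere.
    """
    if request_headers.get("If-None-Match") == current_etag:
        return 304
    return 200

# Unchanged page: the crawler revalidates and gets a body-less 304.
print(conditional_status({"If-None-Match": '"abc123"'}, '"abc123"'))  # 304
# Updated page: the ETag no longer matches, so serve a full 200.
print(conditional_status({"If-None-Match": '"abc123"'}, '"def456"'))  # 200
```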
Internal URL changes can cause organic search fluctuations
A participant was seeing organic search fluctuations after a URL structure change on their website, despite adding 301 redirects. They asked whether that is expected and how long the process should take. John responded that changing internal URLs means Google has to essentially reprocess the entire website and first re-understand the context of all its pages, which can take a significant amount of time.
You are likely to see fluctuations in organic search for at least a month, or longer for a bigger change. Fluctuations can also occur if other changes happened at the same time, such as internal linking, content, or page structure updates, which could have weakened the pages. If this is the case, John recommended comparing the pages before and after to understand these differences and identify what needs clearing up.
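One way to keep such a migration clean is a one-to-one redirect map from every old URL to its new counterpart. A minimal sketch with hypothetical paths (in practice this would live in your server or CDN config):

```python
# Hypothetical one-to-one map: every old URL path -> its new path.
REDIRECTS = {
    "/products/blue-widget.html": "/shop/widgets/blue",
    "/products/red-widget.html": "/shop/widgets/red",
}

def resolve(path):
    """Return (status, location) for an incoming request path.

    Old paths get a permanent 301 to the new URL, which passes the
    old URL's signals along; anything else is served in place.
    """
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path
```

Auditing that every old URL in your crawl logs resolves to exactly one 301 (no chains, no 404s) helps shorten the reprocessing period John describes.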
Images should also be redirected during a website migration
John answered a question about organic search fluctuations after a migration. As well as checking page differences before and after with regard to aspects like internal linking, content, or structure, it’s also important to consider embedded content like images.
If you don’t redirect your old image URLs, Google has to reprocess them and will treat them as new, because it has no connection between the old and new URLs. He clarified that this can have a big effect if you get a lot of image search traffic, so it makes sense to set up those redirects even if the migration happened a month or more ago.
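When old and new image paths follow a consistent pattern, the image redirects can be generated by a rewrite rule instead of a per-file map. A minimal sketch, assuming a hypothetical move from `/img/` to `/assets/images/`:

```python
import re

def redirect_image(path):
    """Map an old image URL path to its new location.

    Hypothetical pattern: everything under /img/ moved, with the
    same sub-path, to /assets/images/. Returns None for paths that
    are not old-style image URLs (no redirect needed).
    """
    m = re.match(r"^/img/(.+)$", path)
    if m:
        return "/assets/images/" + m.group(1)
    return None
```

Each non-None result would then be served as a 301, preserving the old image URL’s standing in Image Search.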
Regularly changing image URLs can impact Image Search
A question was asked about whether query strings used for cache validation at the end of image URLs would impact SEO. John replied that this wouldn’t affect SEO, but explained that it’s not ideal to regularly change image URLs, as images are recrawled and reprocessed less frequently than normal HTML pages.
Regularly changing image URLs means it takes Google longer to re-find them and add them to the image index. He specifically advised against changing image URLs very frequently, for example by appending a session ID or today’s date: in those cases the URLs would likely change more often than Google reprocesses them, so the images would never be indexed. If Image Search is important for your website, avoid regular image URL changes wherever possible.
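A common way to get cache busting without churning URLs (an approach we’re adding here, not one named in the session) is to fingerprint the query string with a hash of the file’s contents, so the URL only changes when the image itself does. A minimal sketch with hypothetical paths:

```python
import hashlib

def fingerprinted_url(path, content):
    """Append a short content hash as the cache-busting query string.

    Unlike a session ID or today's date, the hash is derived from the
    image bytes, so the URL stays stable until the file really changes.
    """
    digest = hashlib.md5(content).hexdigest()[:8]
    return f"{path}?v={digest}"

a = fingerprinted_url("/images/hero.jpg", b"same bytes")
b = fingerprinted_url("/images/hero.jpg", b"same bytes")
c = fingerprinted_url("/images/hero.jpg", b"new bytes")
# a == b (stable across deploys), while c differs only because
# the underlying image content changed.
```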