Do Not Rely on Third-Party Cookies to Render Content
Because Chrome is going to block third-party cookies, and Google uses Chrome to render pages, any content on your site that depends on third-party cookies to render will not be seen by Google.
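One way to audit this is to capture the same page rendered with and without cookies and diff the two snapshots: anything that only appears in the cookie-enabled render is content Googlebot may never see. A minimal sketch with Python's standard library, using hypothetical snapshot strings (the page content and variable names are illustrative, not from any real site):

```python
from difflib import unified_diff

def cookie_dependent_lines(html_with_cookies: str, html_without_cookies: str) -> list[str]:
    """Return lines that only render when cookies are available.

    These are the lines a cookie-blocking renderer such as Googlebot
    would never see, so none of them should carry indexable content.
    """
    diff = unified_diff(
        html_without_cookies.splitlines(),
        html_with_cookies.splitlines(),
        lineterm="",
    )
    # Lines prefixed with "+" exist only in the cookie-enabled render
    # (skip the "+++" file header that unified_diff emits).
    return [line[1:] for line in diff
            if line.startswith("+") and not line.startswith("+++")]

# Hypothetical snapshots of the same page rendered with and without cookies:
with_cookies = "<main>\n<p>Product description</p>\n<p>Personalised offer</p>\n</main>"
without_cookies = "<main>\n<p>Product description</p>\n</main>"

print(cookie_dependent_lines(with_cookies, without_cookies))
# → ['<p>Personalised offer</p>']
```

If the diff contains anything you need indexed, that content should be moved out of the cookie-gated code path.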
Different Rendering Processes are Used When Rendering a Page For Indexing & for Users
Googlebot doesn’t have a fixed point at which it takes the rendered DOM snapshot used for indexing. This is because Google uses different rendering processes for indexing than for users accessing a page, so elements on the site can be processed differently, and rendering a page for indexing purposes may take longer.
Check Cached Page to See if Redirect Has Been Picked up by Google
To check whether Google has switched the canonical version after a redirect, see if the cached version of the old page is now the target page. You can also use the URL Inspection Tool in Google Search Console to check the Google-selected canonical.
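Alongside those manual checks, you can verify that the redirect target itself sends a consistent signal by confirming its rel="canonical" tag points at the new URL. This is a minimal sketch using only Python's standard library; it inspects the on-page canonical tag of a supplied HTML string, it does not query Google's cache or Search Console (the example URL is hypothetical):

```python
from html.parser import HTMLParser
from typing import Optional

class CanonicalParser(HTMLParser):
    """Collect the href of the first <link rel="canonical"> tag."""
    def __init__(self) -> None:
        super().__init__()
        self.canonical: Optional[str] = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

def canonical_matches_target(rendered_html: str, redirect_target: str) -> bool:
    """True if the page's canonical tag points at the redirect target,
    i.e. the on-page signal agrees with the redirect."""
    parser = CanonicalParser()
    parser.feed(rendered_html)
    return parser.canonical == redirect_target

html = '<head><link rel="canonical" href="https://example.com/new-page"></head>'
print(canonical_matches_target(html, "https://example.com/new-page"))
# → True
```

A mismatch here (redirect to one URL, canonical pointing at another) sends Google conflicting signals and can delay the canonical switch.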
Use Chrome DevTools and Google Testing Tools to Review a Page’s Shadow DOM
There are two ways to inspect a page’s shadow DOM in order to compare it to what Googlebot sees. The easiest is Chrome DevTools: in the Elements panel you will see a #shadow-root node, which you can expand to display the shadow DOM’s contents. Alternatively, use any of Google’s testing tools and review the rendered DOM, which should contain what was originally in the shadow DOM.
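The comparison can be scripted once you have both snapshots: shadow DOM content is absent from the raw HTML source but should appear in the rendered DOM that Google's testing tools report. A minimal sketch, assuming you have already saved both snapshots as strings (the custom element and fragment below are hypothetical examples):

```python
def shadow_content_rendered(raw_html: str, rendered_html: str,
                            shadow_fragments: list[str]) -> dict[str, bool]:
    """For each fragment expected to live inside a shadow root, report
    whether it shows up in the rendered DOM snapshot despite being
    absent from the raw HTML source."""
    return {
        frag: (frag not in raw_html) and (frag in rendered_html)
        for frag in shadow_fragments
    }

# Raw HTML: shadow DOM content is not serialised in the page source.
raw = "<my-widget></my-widget>"
# Rendered DOM snapshot: the testing tool reports the flattened content.
rendered = "<my-widget><p>Shipping details</p></my-widget>"

print(shadow_content_rendered(raw, rendered, ["<p>Shipping details</p>"]))
# → {'<p>Shipping details</p>': True}
```

Any fragment reported as False is shadow DOM content that did not make it into the rendered DOM, and therefore may not be indexed.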
When Changing Frameworks on a Site Ensure You Incrementally Test to Reduce SEO Impact