Parameter Handling Improves Crawl Efficiency
Parameter handling prevents parameterised URLs from being crawled at all, which makes it better for crawl efficiency than canonical tags, since a canonicalised URL still has to be crawled before the canonical tag is seen.
Google Learns Which URL Parameters Return Irrelevant Pages
Google learns which parameters are returning irrelevant pages partly based on canonicalised URLs.
The ‘Parameter Doesn’t Change Content’ Setting is Similar to Canonical
The ‘Parameter Doesn’t Change Content’ setting is similar to canonical as it will aggregate signals to the cleaned URL.
Excessive URL Parameters and Rewrites Can Cause Problems
Google can have problems crawling your site if your URL structure has an excessive number of URL parameters and rewrites which redirect to a few pages.
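The scale of the problem can be illustrated with a short sketch: even a handful of parameters that don't change content multiplies into many distinct crawlable URLs for one underlying page (the parameter names below are hypothetical examples).

```python
from itertools import permutations

# Hypothetical tracking/sort parameters that don't change page content.
params = ["sort=price", "sessionid=abc", "ref=home"]

# Every subset and ordering of these parameters is a distinct URL to a
# crawler, even though they all resolve to the same page.
urls = {
    "/products?" + "&".join(combo)
    for r in range(1, len(params) + 1)
    for combo in permutations(params, r)
}
print(len(urls))  # 15 distinct URLs for one underlying page
```

Three harmless-looking parameters already produce 15 URL variants of a single page, which is crawl budget spent rediscovering the same content.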
Increased Crawl Rates Can Be Caused by Authority, Server Performance or Duplicate Content
A temporarily high crawl rate can be caused by an increase in authority, by Google deciding the server can handle an increased load, or by Google finding a lot of duplicate content caused by things like URL parameters.
Robots.txt Overrides Parameter Settings
URL Parameter settings in Search Console are a hint for Google, and they will validate them periodically. The Robots.txt disallow overrides the parameter removal, so it’s better to use the parameter tool to consolidate duplicate pages instead of disallow.
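The override can be demonstrated with Python's standard-library robots.txt parser, assuming a hypothetical site whose robots.txt disallows a `/search` path carrying a `sessionid` parameter: once the URL is disallowed, Google never fetches it, so any parameter setting for `sessionid` can't consolidate signals to the clean URL.

```python
from urllib import robotparser

# Hypothetical robots.txt that disallows the parameterised search path.
rules = """
User-agent: *
Disallow: /search
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The disallow blocks crawling of the parameterised URL entirely,
# so no parameter handling or signal consolidation can take place.
print(rp.can_fetch("*", "https://example.com/search?sessionid=abc"))  # False
print(rp.can_fetch("*", "https://example.com/page"))                  # True
```

With the parameter tool (and no disallow), Google can still crawl the URL, recognise the parameter as irrelevant, and fold its signals into the clean version.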
Use GL Parameter to See Google SERPs for a Different Country
If you want to see SERPs from a different geographic location, change the 'gl' parameter in the URL to that country's code.
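As a quick sketch, the search URL can be built like this (the helper name is ours; the query values are hypothetical):

```python
from urllib.parse import urlencode

def google_serp_url(query: str, country_code: str) -> str:
    """Build a Google search URL with the 'gl' parameter set to a country code."""
    params = urlencode({"q": query, "gl": country_code})
    return f"https://www.google.com/search?{params}"

# 'de' requests German results: https://www.google.com/search?q=coffee+shops&gl=de
print(google_serp_url("coffee shops", "de"))
```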
Googlebot Can Be Redirected to Canonical URLs
John says that although it's technically cloaking, it's OK to redirect Googlebot from URLs with tracking parameters to the canonical URLs, while leaving users unredirected so they can still be tracked in analytics.
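A minimal sketch of that logic, assuming a hypothetical set of tracking parameters and a simple user-agent check (a production setup would verify Googlebot by reverse DNS, not just the user-agent string):

```python
from typing import Optional
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical tracking parameters to strip; adjust for your site.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "gclid"}

def canonical_target(url: str, user_agent: str) -> Optional[str]:
    """Return a redirect target for Googlebot, or None to serve the page as-is.

    Users keep the tracking parameters (so analytics still records them);
    Googlebot is redirected to the clean canonical URL.
    """
    if "Googlebot" not in user_agent:
        return None  # serve normally; analytics can record the parameters
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    clean = urlunparse(parts._replace(query=urlencode(kept)))
    return clean if clean != url else None
```

The server would issue a 301 to the returned URL when it is not None.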
URLs with Hashbangs Won’t be Indexed
URLs with hashbangs won’t be indexed, so if they are required to produce unique pages, you’ll need to migrate to a traditional URL structure without the hashbang.