Recovering from a Panda penalty with DeepCrawl
Making Sure That Your Low Quality Pages Are Removed or Improved
An SEO’s work is never done: just when you thought you were finished with Penguin audits, backlink audits,
and endless rounds of testing, it’s time to catch up with Panda and make sure no thin or poor-quality
content has appeared on your sites.
Pair Your Analytics Data with Your Crawl Data to Establish the Reason for Low Engagement
By pairing your analytics data with your crawl data, you can use the site explorer (tree diagram)
functionality to browse your pages and determine whether low engagement is caused by thin, or even empty, content.
Advanced Duplicate Detection Methods
Unlike conventional crawlers, DeepCrawl provides you with advanced duplicate detection methods, looking at
your duplicate meta titles and descriptions as well as your similar body content. Based on DeepRank, our internal
ranking system, we highlight primary vs duplicate pages: in other words, the pages you want indexed vs
the pages that should canonicalize to them.
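The primary-vs-duplicate split can be illustrated with a simple grouping sketch. The page data and the internal-link score (a stand-in for a crawler's authority metric; DeepRank itself is proprietary) are hypothetical:

```python
from collections import defaultdict

# Hypothetical crawl rows: URL, meta title, and a link-based score used
# to pick the primary page within each duplicate group.
pages = [
    {"url": "/red-shoes", "title": "Red Shoes | Shop", "score": 9.1},
    {"url": "/red-shoes?sort=price", "title": "Red Shoes | Shop", "score": 2.3},
    {"url": "/red-shoes?page=2", "title": "Red Shoes | Shop", "score": 1.7},
    {"url": "/blue-hats", "title": "Blue Hats | Shop", "score": 6.4},
]

def group_duplicates(pages):
    """Group pages by identical title; within each group, the highest-scoring
    page is the primary and the rest should canonicalize to it."""
    groups = defaultdict(list)
    for page in pages:
        groups[page["title"]].append(page)
    result = {}
    for title, members in groups.items():
        members.sort(key=lambda p: p["score"], reverse=True)
        result[members[0]["url"]] = [p["url"] for p in members[1:]]
    return result

print(group_duplicates(pages))
```

Here `/red-shoes` becomes the primary, and the sorted and paginated variants are the duplicates whose rel="canonical" tags should point at it.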
For a high-level view of all the Panda-related challenges an SEO may experience, we built a dashboard
in which you can monitor all (1) title and description; (2) body content; and (3) social tagging issues.
Determine which pages are indexable and which are not. Evaluate the URLs found in the primary pages
report. Identify your bad primary pages, e.g. those with high bounce rates or otherwise limited relevant content,
so you can decide whether those pages should be made non-indexable or removed altogether.
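The triage at the end of this process can be sketched as a simple decision rule. The thresholds and sample pages below are illustrative assumptions, not fixed recommendations:

```python
def recommend_action(page):
    """Rough triage for a primary page: empty pages are removal candidates,
    thin high-bounce pages are noindex candidates, everything else is kept.
    Thresholds (250 words, 85% bounce) are chosen purely for illustration."""
    if page["word_count"] == 0:
        return "remove"
    if page["word_count"] < 250 and page["bounce_rate"] > 0.85:
        return "noindex"
    return "keep"

# Hypothetical primary pages pulled from a crawl + analytics join.
primaries = [
    {"url": "/guide", "word_count": 1800, "bounce_rate": 0.35},
    {"url": "/stub", "word_count": 90, "bounce_rate": 0.91},
    {"url": "/placeholder", "word_count": 0, "bounce_rate": 0.99},
]
for p in primaries:
    print(p["url"], recommend_action(p))
```

In practice you would review each "noindex" or "remove" candidate by hand before acting; the rule only narrows the list down to pages worth a look.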