Disallow and Google: An Intermediate SEO Guide

Following on from our beginner’s guide to implementing noindex, disallow and nofollow directives, we’re now taking a look at some more advanced uses of the Disallow directive in robots.txt.

In this guide for intermediate and advanced SEOs, we’ll cover how Disallow interacts with PageRank, JS/CSS files, indexation and URL parameters, plus pattern matching and how search engines resolve conflicting Allow/Disallow rules.
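As a taste of the pattern matching and rule conflicts covered in the full guide, here is an illustrative robots.txt fragment (the paths are hypothetical). Google resolves conflicts by applying the most specific rule, i.e. the one with the longest matching path, so the longer Allow line below overrides the broader Disallow for paginated category URLs:

```
User-agent: Googlebot
# Block any URL containing a query string
Disallow: /*?
# Longer (more specific) rule wins: paginated category pages stay crawlable
Allow: /category/*?page=
```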


Noindex, Disallow & Nofollow

The three words above might sound like SEO gobbledegook, but they’re words worth knowing, since understanding how to use them means you can order Googlebot around. Which is fun.

So let’s start with the basics: there are three directives you can use to control how search engines crawl and index your site:

  1. Noindex: tells search engines not to include your page(s) in search results.
  2. Disallow: tells them not to crawl your page(s).
  3. Nofollow: tells them not to follow the links on your page.
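Of the three, only Disallow lives in robots.txt; the other two are implemented in the page itself. A minimal HTML sketch (the URLs are hypothetical):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Noindex: ask search engines to keep this page out of search results -->
  <meta name="robots" content="noindex">
</head>
<body>
  <!-- Nofollow: ask them not to follow this specific link -->
  <a href="/members-only" rel="nofollow">Members area</a>
</body>
</html>
```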


Managing robots.txt Changes

Writing and making changes to a robots.txt file can make even the most hardened SEOs a little bit nervous. Just one erroneous character could have a major impact on performance, or even remove your entire site from search results.
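One way to reduce that risk is to spot-check draft rules before deploying them. A minimal sketch using Python’s standard-library `urllib.robotparser` (the rules and URLs are hypothetical; note that this parser handles basic Allow/Disallow matching but not Google’s wildcard extensions):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical draft robots.txt, checked locally before deployment
draft = """\
User-agent: *
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(draft.splitlines())

# Spot-check representative URLs against the draft rules
assert not rp.can_fetch("*", "https://example.com/admin/settings")  # blocked
assert rp.can_fetch("*", "https://example.com/blog/post-1")         # crawlable
print("All robots.txt spot-checks passed")
```

Running checks like these in a pre-deploy script turns a nerve-wracking edit into a verifiable one.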

