How To Disallow Multiple Directories In Your robots.txt File

July 2nd, 2013 - Posted by Steve Marks to Web Development.

I recently needed to hide a couple of directories on a site I was working on from the search engines, to prevent them from spidering the files within. To do this I would use a robots.txt file and, after looking around for a quick example, I could see that to disallow a single directory you would do something like this:

User-agent: *
Disallow: /directory1/

This worked fine; however, in my case I needed to hide multiple directories from being spidered. My initial thought was that this could be done by separating the directories with spaces on a single line:

User-agent: *
Disallow: /directory1/ /directory2/ /directory3/

This didn’t work. Time to do a bit more research…

The Solution

It turned out that the correct way to prevent multiple directories from being spidered is to put each folder reference on a separate line, like so:

User-agent: *
Disallow: /directory1/
Disallow: /directory2/
Disallow: /directory3/
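
If you want to sanity-check rules like these before deploying them, Python's standard library includes a robots.txt parser. The snippet below is a minimal sketch using urllib.robotparser; the file paths it tests (/directory1/page.html and so on) are hypothetical examples, not anything from the original post:

from urllib.robotparser import RobotFileParser

# The rules from the solution above, exactly as they would appear in robots.txt
rules = """
User-agent: *
Disallow: /directory1/
Disallow: /directory2/
Disallow: /directory3/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Each Disallow line applies independently, so all three directories are blocked...
print(parser.can_fetch("*", "/directory1/page.html"))  # False
print(parser.can_fetch("*", "/directory2/page.html"))  # False

# ...while paths outside those directories remain crawlable
print(parser.can_fetch("*", "/somewhere-else/"))       # True

Worth remembering that robots.txt is purely advisory: well-behaved crawlers will honour these rules, but they don't actually restrict access to the files.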
