Old 03-02-2013, 03:14 AM   #1
Join Date: Aug 2009
Posts: 88
Robots.txt problems

I have updated a website, and in doing so I removed (or want to have removed)
folders/directories and pages/URLs.

The tutorials on this forum's pages say one thing; the examples and Google seem at odds!

One example is:
Disallow: /XXXXXX (a folder!)
Disallow: /XXXXX.html

The next is:
Disallow: /XXXXXX/
Disallow: /~XXXXX.html

which adds a trailing / and a ~.

As it is, after 4 hours of rewriting different txt files and running the Google check in Webmaster Tools, each time it reports: "Syntax not understood."
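One way to sanity-check the rules locally, without waiting on the Google check each time, is Python's standard-library robots.txt parser. This is just a sketch; the folder name "private", the file name "old-page.html", and the domain example.com are placeholders, not anything from the real site.

```python
# Quick local check of robots.txt behaviour using Python's standard library.
# "private", "old-page.html", and example.com are placeholder names.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /private/
Disallow: /old-page.html
""".strip().splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The trailing slash blocks the folder and everything under it...
print(rp.can_fetch("*", "http://example.com/private/anything.html"))  # False
# ...and the bare path blocks the single page.
print(rp.can_fetch("*", "http://example.com/old-page.html"))          # False
# Paths not listed remain crawlable.
print(rp.can_fetch("*", "http://example.com/index.html"))             # True
```

If this parser accepts the file but Webmaster Tools still complains, the problem is more likely a stray character or encoding issue than the Disallow syntax itself.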

Also, I have seen both User-Agent* and User-Agent: in examples. Am I missing something??

What do I need to do to block folders entirely, and individual pages within folders?
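Based on the examples above, the version I would expect to work looks like this (XXXXXX stands in for the real names, and this is my reading of the convention rather than anything Google has confirmed): a trailing slash for a whole folder, a bare path for a single page, and a colon after User-agent.

```
User-agent: *
Disallow: /XXXXXX/
Disallow: /XXXXXX/page.html
Disallow: /XXXXX.html
```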

vinnyvangogh