Old 03-02-2013, 03:14 AM   #1
vinnyvangogh
 
Join Date: Aug 2009
Posts: 84
Default Robots.txt problems

I have updated a website and, in doing so, removed some folders/directories and pages/URLs that I now want blocked from search engines.

The tutorials on this forum's pages say one thing, but other examples and Google's own documentation seem to be at odds!

One example is:

Sitemap: http://www.xxxxx.com/sitemap.xml
User-Agent:*
Disallow: /XXXXXX (a folder!)
Disallow: /XXXXX.html


The next example is:

User-Agent:*
Disallow: /XXXXXX/
Disallow: /~XXXXX.html

This one adds a trailing / to the folder and a ~ before the page name.


As it is, after 4 hours of rewriting different .txt files and running the check in Google Webmaster Tools, each attempt reports "Syntax not understood".

Also, I have seen both User-Agent* and User-Agent: (with and without the colon). Am I missing something?

What do I need to do to block folders entirely, and also individual pages within folders?
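To be concrete, this is roughly what I have been trying, with placeholder names (the folder and page names below are just stand-ins, not my real ones):

Sitemap: http://www.xxxxx.com/sitemap.xml

User-agent: *
# block an entire folder and everything under it
Disallow: /oldfolder/
# block a single page inside a folder
Disallow: /oldfolder/oldpage.html

I have tried it with and without the space after "User-agent:", and with the different capitalisations from the examples above, but the checker still complains, so I may be misreading which part it actually objects to.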

TIA