Robots.txt



To see which of your pages Google is not crawling, try Google Webmaster Tools (now Google Search Console).

The examples below are from The Web Robots Pages (http://www.robotstxt.org/).

Exclude parts of the server from all robots:

# robots.txt for http://www.example.com/

User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space
Disallow: /tmp/ # these will soon disappear
Disallow: /foo.html
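
Disallow values are path prefixes: a rule blocks every URL whose path starts with that string. A minimal sketch of checking the file above with Python's standard-library urllib.robotparser (the bot name "SomeBot" and the test URLs are made up for illustration):

from urllib import robotparser

rules = """\
User-agent: *
Disallow: /cyberworld/map/
Disallow: /tmp/
Disallow: /foo.html
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Disallow matches by path prefix.
print(rp.can_fetch("SomeBot", "http://www.example.com/tmp/scratch.txt"))  # False
print(rp.can_fetch("SomeBot", "http://www.example.com/foo.html"))         # False
print(rp.can_fetch("SomeBot", "http://www.example.com/bar.html"))         # True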

Exclude parts of the server from all robots except one:

# robots.txt for http://www.example.com/

User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space

# Cybermapper knows where to go.
User-agent: cybermapper
Disallow:

Exclude all robots from the entire server:

# go away
User-agent: *
Disallow: /

Allow all robots complete access:

# allow all
User-agent: *
Disallow:
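
Records are selected per robot: a crawler obeys the most specific User-agent record that matches its name, and an empty Disallow permits everything. A sketch of the cybermapper example above, again with urllib.robotparser (the bot name "SomeBot" is hypothetical):

from urllib import robotparser

rules = """\
User-agent: *
Disallow: /cyberworld/map/

User-agent: cybermapper
Disallow:
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

url = "http://www.example.com/cyberworld/map/index.html"
print(rp.can_fetch("SomeBot", url))      # False: falls under the catch-all record
print(rp.can_fetch("cybermapper", url))  # True: empty Disallow means no restriction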
