Hi Simon ...
I think you mean a robots.txt file (as opposed to an .html file), yes? In which case all you need is an open "invitation", since the file's actual function is the opposite - it lists which directories/files you *don't* want crawled.
Syntax (each shown as a separate record):

# Allow all robots to spider the entire domain
User-agent: *
Disallow:

# Disallow directory /cgi-bin/
User-agent: *
Disallow: /cgi-bin/

# Disallow directory /i/
User-agent: *
Disallow: /i/
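If you want to double-check what a given robots.txt actually permits, Python's standard library can parse it for you. A minimal sketch (the example.com URLs and the paths are just placeholders, not Simon's actual site):

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt combining the disallow rules above into one record.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /i/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# An ordinary page is crawlable; a disallowed directory is not.
print(rp.can_fetch("*", "http://www.example.com/links.html"))      # True
print(rp.can_fetch("*", "http://www.example.com/cgi-bin/form.pl")) # False
```

Note that `can_fetch()` takes the crawler's user-agent string first, then the URL to test.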
Where you have text links on the pages, I think you've already ensured that they're crawlable (?!)
cfn ... Jen
Jen Worden
Web Developer
www.meadoworks.com