Hi.

I design horse racing sites for people who sell ebooks, along with allowing people to sign up to the service via webforms. I obviously need to protect certain pages from being accessed by people who haven't paid, so I currently edit the robots.txt file to exclude Google from indexing my protected pages. I have recently learned that anyone can see the list of hidden pages simply by visiting:

www.MyHorseRacingSite.com/robots.txt

This approach is obviously flawed, so I read about an alternative: use robots.txt to exclude a single folder called 'NoRobots' and put all the protected pages in there. That way robots.txt reveals only the folder name, not the individual page filenames. The only question is how to get XWDP7 to work with me on this.
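For reference, a minimal robots.txt for this scheme might look something like the following (assuming the protected folder is named 'NoRobots', as above):

```
# Block all crawlers from the NoRobots folder.
# Only the folder name is exposed here, not the filenames of the pages inside it.
User-agent: *
Disallow: /NoRobots/
```

Note that this only discourages well-behaved crawlers from indexing the folder; it is not access control, so the folder name itself should not be guessable.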

What I want is to be able to specify, in Web Properties, that specific pages should be uploaded to my 'NoRobots' sub-folder. The 'page filename' field does not allow a '/' to be entered, so this is currently not possible. It would also be extremely helpful if the navigation bars and internal linking system of XWDP7 could recognise pages and links that are in sub-folders.

Thanks,

Ian.