Upload Specific Web Pages to SubFolders
Hi.
I design horse racing sites for people who sell ebooks and let visitors sign up to the service via web forms. I obviously need to protect certain pages from being accessed by people who haven't paid, so I currently edit the robots.txt file to exclude Google from indexing my protected pages. I have recently learned that anyone can see the list of "hidden" pages simply by visiting:
www.MyHorseRacingSite.com/robots.txt
This approach is obviously flawed, so I read about another approach: use robots.txt to exclude a single folder called 'NoRobots' and then put all the protected pages in there. The only question is how to get XWDP7 to work with me on this.
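The folder approach described above would look roughly like this (a minimal sketch; the 'NoRobots' folder name is just the example used in this thread):

```
# robots.txt — served from the site root
# Ask all crawlers to skip everything under /NoRobots/
User-agent: *
Disallow: /NoRobots/
```

Note that the folder name is still visible to anyone who reads robots.txt, so this only hides the individual page filenames, not the folder itself.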
What I want is to be able to specify, in Web Properties, that specific pages are to be uploaded to my 'NoRobots' sub-folder. The 'page filename' field does not allow '/' to be entered, so this is not possible. It would also be extremely helpful if the navigation bars and internal linking system of XWDP7 could recognise when certain pages and links were in sub-folders.
Thanks,
Ian.
Re: Upload Specific Web Pages to SubFolders
First of all, robots.txt does not guarantee anything. It is advisory information which may easily be ignored by bots.
Second, there's no way to export parts of a single website to different folders. You have to create separate designs for the content of each folder and then export each one to the correct folder.
Re: Upload Specific Web Pages to SubFolders
Each directory of the site needs its own Xara design file. In the publishing options, you would designate the same website but specify a particular folder. So one website could be built from 20 different design files if you need 20 different subdirectories:
site.com/gallery
site.com/2010
site.com/2011
site.com/whatever
site.com/
Re: Upload Specific Web Pages to SubFolders
Quote:
Originally Posted by
samrc
Each directory of the site needs its own xara design file. In the publishing options, you would designate the same website, but put in a specific folder. So one website could have 20 different design files to build it if you need 20 different subdirectories.
site.com/gallery
site.com/2010
site.com/2011
site.com/whatever
site.com/
Thanks Samantha. That's a really good idea that will work.
Ian.
Re: Upload Specific Web Pages to SubFolders
There is a product I use called Encrypt Web Pro. It is not a perfect solution, but it lets you disable many of the common methods for copying material; for example, visitors cannot copy or download the images. You can set it to protect text, and you can have it prevent other sites from linking to yours.
As I say, it is not a perfect solution, because if someone really wants to copy your material he will find a way around it. But it will keep most honest people out of the content of your site, and you don't have to create a bunch of different websites.
Re: Upload Specific Web Pages to SubFolders
I used the multiple subdirectory, multiple design file concept on a very large commercial website years ago, grouping products by subdirectory.
It made editing and publishing a breeze.
The main directory had a site map showing all subdirectories, and each subdirectory had a master page with a directory of individual pages, cross-linked to the other directories.
The site was very easy to navigate and still gets #1 rankings for every product, keyword and key phrase. A customer was never more than 2 clicks from anything.
Got lots of compliments on the organization and ease of use.
Any page you do not want indexed should carry this code in its head section:
Quote:
<META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW">
or
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
That is a polite instruction to search engines not to index or copy that page.
You could consider password-protecting a subdirectory, assigning each person their own login. If you are on a Linux server, it is easily done without scripts using an .htaccess file.
Read tutorial here: http://www.javascriptkit.com/howto/htaccess.shtml
That keeps search engines OUT.
The commercial site I mentioned before used an .htaccess file to let information be stored in folders for the sales reps. Each rep had a private folder requiring a login to see stats, reports, etc.
Since those folders were never linked from within the website, they were not crawled by search engines.
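As a sketch, password-protecting a folder on an Apache server looks something like this (the folder name and the absolute path to the password file are assumptions; your host's actual path will differ):

```
# .htaccess — placed inside the folder you want to protect, e.g. /NoRobots/
AuthType Basic
AuthName "Members Area"
# Absolute server path to the password file (assumed path — ask your host)
AuthUserFile /home/yoursite/.htpasswd
Require valid-user
```

You would create the password file once on the server with Apache's htpasswd tool, e.g. `htpasswd -c /home/yoursite/.htpasswd ian`, then add further users without the `-c` flag.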
If you want a better "member" solution, you could consider a script like these:
Compare these; each has different options to lock out unregistered people and allow only registered members into the private area:
h**p://www.amember.com/p/Main/Features
h**p://www.wildapricot.com/pricing.aspx (has a free account for 50 or less members)
h**ps://www.avectra.com/eweb/DynamicPage.aspx?webcode=ProductMatrix
h**p://www.yourmembership.com/pricing/
h**p://www.comarch.com/en/industries/trade/products/cim/loyalty?gclid=COmnwPDKwZgCFQxKGgodWTf_1A
BUT anyone who wants to copy a page once they are in the private area can do it in some form or fashion: at minimum a screen print, at most a site crawler.
Anything put in cyber space has the potential for being stolen.
I agree with Gary. If someone really wants to copy your material, he will.