  1. #1

    Stopping search engines from finding pages & folders

    I need to stop *one page only* in a folder from being found by search engines, but all the other pages need to remain visible - so I'm pasting:
    <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
    into a placeholder object named <head> on the one page which I don't want to be indexed.
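
    If I've got that right, the exported page should end up with the tag sitting in its head section, something like this (a rough sketch - the filename and title are just examples):

    <!-- clientpage.htm - the one page to keep out of search results -->
    <html>
    <head>
        <title>Client Page</title>
        <!-- asks well-behaved crawlers not to index this page or follow its links -->
        <meta name="robots" content="noindex, nofollow">
    </head>
    <body>
        ...
    </body>
    </html>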

    I also need to stop *a different entire sub-folder and ALL the pages within it* from being found by search engines.

    Can anyone tell me how to do the second one and if I have the method right for the first one?

  2. #2

    Re: Stopping search engines from finding pages & folders

    The robots meta tag method will not guarantee that your page stays out of the index - it is only a request that well-behaved crawlers honour.

    The only secure method for both requirements is done server side, usually via PHP or the .htaccess file.
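
    For instance, if the host runs Apache and allows .htaccess overrides (an assumption - the file path and realm name below are just examples), putting a password prompt on the whole folder looks roughly like this:

    # .htaccess placed inside the folder you want to protect
    AuthType Basic
    AuthName "Client Area"
    AuthUserFile /full/server/path/to/.htpasswd
    Require valid-user

    Anything behind that prompt returns a 401 to crawlers, so they can never fetch the content in the first place.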

  3. #3

    Re: Stopping search engines from finding pages & folders

    Yep, I know, but I don't have access to that, so I'm implementing a different password method. I still need to make sure that the one page and the other whole folder aren't indexed, though - so could someone check my method for the first one? And I could do with some help on how to do the second one. Cheers.

  4. #4

    Re: Stopping search engines from finding pages & folders

    I'm still a little confused even after some further research. I've been reading http://www.robotstxt.org/robotstxt.html and the folder I need to disallow access to is a sub-folder of one which needs to remain accessible to the search engines.

    So my directory at the moment looks something like this:

    content
        aspnet_client
        area1
        area2
        area3
        area4
        index_htm_files

    The subfolder I need to disallow access to will be called "clientarea" and will reside within area4, *but* area4 itself still needs to be allowed - so where do I put the robots.txt just to disallow that one subfolder? Or am I better off just creating another folder at the same level as the ones above? (It really makes no difference to me as far as the functionality of the site goes, so whichever is more practical.)

    And so I create the text file in Notepad, save it as robots.txt and then upload it somewhere to the server - but I'm not sure I have the code right either...

    User-agent: *
    Disallow: /clientarea

    if I have the folder at the same level as areas 1-4 above.

    and

    User-agent: *
    Disallow: /area4/clientarea

    if I have it as a subfolder?
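
    From what I can make of that robotstxt.org page, the file itself always sits in the root of the site either way - so it gets fetched as http://www.mysite.com/robots.txt (domain made up) - and it's only the Disallow path that changes with the layout. So for the subfolder case the whole file would just be:

    # robots.txt - uploaded to the site root, not into area4
    User-agent: *
    Disallow: /area4/clientarea/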

    Aaaah, so confused... where's the emoticon for a large brandy?

  5. #5

    Re: Stopping search engines from finding pages & folders

    I've drawn my directories wrong - there's another level above. It should look like this:

    web
        content
            aspnet_client
            area1
            area2
            area3
            area4
            index_htm_files
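
    Which makes me wonder: if "web" is the folder the domain actually points at (just a guess on my part), then presumably the Disallow path has to start from there, something like:

    User-agent: *
    Disallow: /content/area4/clientarea/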

 

 
