Welcome to TalkGraphics.com

  1. #1

    Re: preventing Google from caching pages

    You can ask search engine bots not to crawl the site by placing a robots.txt file in the root folder.

    You can create a robots.txt using one of the many free generators on the web,
    e.g. http://tools.seobook.com/robots-txt/generator/
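
    For a test area you want crawlers to skip, a minimal robots.txt might look like this (the folder name below is just an example -- substitute your own):

    ```
    # Ask all well-behaved crawlers to stay out of the testing folder only
    User-agent: *
    Disallow: /testing/
    ```

    Note that robots.txt is a request, not access control -- compliant bots like Googlebot will honour it, but it doesn't actually block anyone from viewing the files.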

  2. #2

    Re: preventing Google from caching pages

    I hope I am not hijacking the thread.

    I have a 'cache' toggle switch (add-on) for Firefox Mac version, but I cannot seem to find this 'cache toggle' add-on for Firefox Windows version?

    Featured Artist on Xara Xone . May 2011
    . A Shield . My First Tutorial
    . Bottle Cap . My Second Tutorial on Xara Xone

  3. #3

    Re: preventing Google from caching pages

    That would not be effective for preventing bots from indexing (and caching) your website files on your host, Rik.
    The cache toggle is simply a local cache manager for FF -- it only affects your own computer.

  4. #4

    Re: preventing Google from caching pages

    Thanks Steve.
    I get it.


  5. #5

    Re: preventing Google from caching pages

    Megg81, do you have to publish the site to your server to test it?

    Most sites created with vanilla Xara software should be testable on your hard drive.

  6. #6

    Re: preventing Google from caching pages

    Yeah, the site build isn't for me, so it has to be uploaded to the server. It will go in a separate folder for testing, which already has a robots.txt with it. What I'm concerned about is that I currently have an 'under construction' message displayed at the URL -- I don't want the site to go live with Google still showing that as a cached page, taking months to fall out of the cache.

    On the other hand, I don't want a robots.txt file to harm the site's ranking potential once it's removed after the site goes live.

  7. #7

    Re: preventing Google from caching pages

    Remove the robots.txt when you're ready to go live.

    Alternatively, password protect the folder to allow only yourself and your client to view it. Google won't be able to get to it then.
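
    As a sketch of the password-protection approach on an Apache host (the file paths and realm name here are just examples -- adjust them for your own server):

    ```
    # .htaccess placed inside the staging folder:
    # require a valid login before serving anything from it
    AuthType Basic
    AuthName "Staging area"
    # Full server path to the password file
    # (example path; keep it outside the web root)
    AuthUserFile /home/example/.htpasswd
    Require valid-user
    ```

    You'd create the password file with Apache's htpasswd tool, e.g. `htpasswd -c /home/example/.htpasswd yourname`, then give your client the login. Unlike robots.txt, this actually blocks Googlebot rather than just asking it to stay away.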
