Welcome to TalkGraphics.com
  1. #1

    Default preventing Google from caching pages

    When a site is being worked on and you have a page up just to say "under construction", is there a way to prevent Google from caching that page? I know Google says that changed pages eventually drop out of the cache when it next crawls the site, but that often takes weeks, and I really don't want this page to remain available after the site is published. Any tips? robots.txt? Or will that potentially harm the site's ranking once it's published?

  2. #2

    Default Re: preventing Google from caching pages

    You can prevent access using a robots.txt placed in the root folder.

    You can create a robots.txt using one of the many free generators on the web,
    e.g. http://tools.seobook.com/robots-txt/generator/
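
    If you'd rather write it by hand, a minimal robots.txt that asks all well-behaved crawlers to stay out of the whole site looks like this (standard robots.txt syntax, nothing generator-specific):

    ```
    # robots.txt - place this file in the site's root folder
    # "*" applies to every crawler; "Disallow: /" blocks the whole site
    User-agent: *
    Disallow: /
    ```

    One caveat: robots.txt only stops crawling. If the page is already in Google's cache, the usual route to get it removed is to add a robots meta tag such as `<meta name="robots" content="noindex">` to the page's head and let Google recrawl it.
    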

  3. #3
    Join Date
    Jun 2009
    Location
    Reading, UK
    Posts
    6,970

    Default Re: preventing Google from caching pages

    I hope I am not hijacking the thread.

    I have a 'cache' toggle switch (add-on) for the Mac version of Firefox, but I cannot seem to find a similar 'cache toggle' add-on for the Windows version?

    Featured Artist on Xara Xone . May 2011
    . A Shield . My First Tutorial
    . Bottle Cap . My Second Tutorial on Xara Xone

  4. #4

    Default Re: preventing Google from caching pages

    That would not be effective for preventing bots from indexing (and caching) your website files on your host, Rik.
    The cache toggle is simply a local (on your computer) cache manager for FF.

  5. #5
    Join Date
    Jun 2009
    Location
    Reading, UK
    Posts
    6,970

    Default Re: preventing Google from caching pages

    Thanks Steve.
    I get it.


  6. #6
    Join Date
    Sep 2000
    Location
    Bracknell, UK
    Posts
    8,659

    Default Re: preventing Google from caching pages

    Megg81, do you have to publish the site to your server to test it?

    Most sites created with vanilla Xara software should be testable on your hard drive.

  7. #7

    Default Re: preventing Google from caching pages

    Yeah, the site build isn't for me, so it has to be uploaded to the server. It will go in a separate folder for testing, which already has a robots.txt with it. What I'm concerned about is that I currently have an "under construction" message displayed at the URL; I don't want the site to go live and have Google still showing that as a cached page that takes months to fall out of the cache.

    On the other hand, I don't want a robots.txt file to harm the site's ranking potential once it's removed after the site goes live.

  8. #8

    Default Re: preventing Google from caching pages

    Remove the robots.txt when you're ready to go live.

    Alternatively, password protect the folder to allow only yourself and your client to view it. Google won't be able to get to it then.
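
    For example, on a typical Apache host you can password-protect the test folder with an .htaccess file. The paths below are just examples; adjust them for your own account:

    ```
    # .htaccess inside the test folder - Basic Auth for the preview
    AuthType Basic
    AuthName "Site preview"
    # Full server path to the password file (example path, outside the web root)
    AuthUserFile /home/youraccount/.htpasswd
    Require valid-user
    ```

    Create the password file with `htpasswd -c /home/youraccount/.htpasswd client`, or use your host's control panel, which usually has a "password protect directory" option that does all of this for you.
    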

  9. #9
    Join Date
    Sep 2000
    Location
    Bracknell, UK
    Posts
    8,659

    Default Re: preventing Google from caching pages

    Steve's password suggestion is good.

    I usually show clients their projects by adding a directory to my own domain and uploading their stuff there.

    If there is an existing site in place, it takes only one slip to overwrite the live one accidentally.

 

 
