Re: robots.txt for subdomain
Megg - I am not certain, but I think when you fill in the URL for the site you are publishing, the robots.txt is created for that domain.
There is one sure way to find out. Publish to this sub-domain, then check the published files for the robots.txt file.
Re: robots.txt for subdomain
Megg, I think you need to add <meta name="robots" content="nofollow" /> in the sub-domain's Website HTML Code (head) textbox if you are using a Xara product.
Search Google for "nofollow"; you include it in the HEAD tag of each page of your sub-domain.
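For reference, the robots meta tag goes inside each page's &lt;head&gt;. One detail worth checking in Google's documentation: "nofollow" only tells crawlers not to follow links on the page, while "noindex" is the value that keeps the page itself out of search results. A minimal sketch combining both (the noindex variant is my assumption about the goal of not being indexed):

```html
<!-- Placed in the <head> of each page on the sub-domain -->
<!-- "noindex" keeps the page out of search results; "nofollow" stops link-following -->
<meta name="robots" content="noindex, nofollow" />
```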
Acorn
Re: robots.txt for subdomain
@Acorn - Does this mean that Megg's site will or will not be indexed?
Re: robots.txt for subdomain
Sorry, I should have said: the .com domain will have a Xara site published to it; the subdomain is a blog only, so there is no page code to edit. It's a redirect from another address to the subdomain.
I've always created my robots.txt files manually after publishing the site, but I'm not sure how to stop just the subdomain from being indexed without blocking the whole site from the search engines.
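One point that may help here: crawlers fetch robots.txt separately per hostname, so a file served at the subdomain root only governs that subdomain and leaves the main .com site's own robots.txt untouched. A minimal sketch (hostname is hypothetical):

```txt
# Served at http://blog.example.com/robots.txt -- affects only this subdomain,
# not http://example.com/, which has its own separate robots.txt
User-agent: *
Disallow: /
```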
Re: robots.txt for subdomain
OK that clears things up. Somewhat. :rolleyes:
Then you could add the code that Acorn provided to the pages that you do not want indexed.
Re: robots.txt for subdomain
Thank you for your advice. I have found that the blog has a section for a custom robots.txt, so I don't have to upload it to the server separately. I have inserted the following:
User-agent: *
Disallow: /
As I want the entire blog not to be indexed, hopefully this will work.
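If you want to sanity-check those two lines before relying on them, Python's standard-library robots.txt parser applies the same rules crawlers do (the blog hostname below is hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Parse the rules directly rather than fetching them over the network
rules = "User-agent: *\nDisallow: /"
rp = RobotFileParser()
rp.parse(rules.splitlines())

# With "Disallow: /" under "User-agent: *", no agent may fetch any path
print(rp.can_fetch("Googlebot", "http://blog.example.com/"))        # → False
print(rp.can_fetch("Googlebot", "http://blog.example.com/post/1"))  # → False
```

Note that robots.txt only asks well-behaved crawlers to stay away; it does not password-protect anything.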
Re: robots.txt for subdomain
Check the index status in Google Webmaster Tools, and also verify whether there are any alerts related to this robots.txt.