I added a robots.txt file to our preview server to prevent Googlebot, Bingbot, and other search engine crawlers from indexing our business's preview sites.

Original robots.txt:

    User-agent: *
    Disallow: /

This effectively stops all well-behaved bots from indexing the preview sites. However, it also stopped SharePoint 2010's search crawler from crawling them: every full crawl returned 0 results. The fix was to add an explicit Allow group for the SharePoint 2010 crawler's user-agent string ahead of the catch-all Disallow.

Updated robots.txt:

    User-agent: Mozilla/4.0 (compatible; MSIE 4.01; Windows NT; MS Search 6.0 Robot)
    Allow: /

    User-agent: *
    Disallow: /
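Before kicking off another full crawl, it can be worth confirming that the preview server is actually serving the updated file. Below is a minimal Python sketch that fetches robots.txt and checks for the crawler group; the preview host URL is a placeholder for your own environment.

    # Minimal sanity check: fetch robots.txt from the preview server and confirm
    # the SharePoint 2010 crawler group is present alongside the catch-all disallow.
    from urllib.request import urlopen

    ROBOTS_URL = "http://preview.contoso.com/robots.txt"  # hypothetical preview host
    CRAWLER_UA = "MS Search 6.0 Robot"                     # SharePoint 2010 crawler user-agent token

    with urlopen(ROBOTS_URL) as response:
        robots_txt = response.read().decode("utf-8", errors="replace")

    print(robots_txt)

    if CRAWLER_UA in robots_txt:
        print("SharePoint crawler group found in robots.txt")
    else:
        print("WARNING: SharePoint crawler group missing - full crawls will likely return 0 results")

This only checks what the server serves; the crawl log in Central Administration is still the place to confirm the crawler actually picked the pages up.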