Robots.txt Generator


Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)

Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch

Restricted Directories: the path is relative to root and must contain a trailing slash "/"



Then create a 'robots.txt' file in your site's root directory, copy the generated text, and paste it into that file.


About Robots.txt Generator


robots.txt is a file that contains instructions on how to crawl a website. It is also known as the robots exclusion protocol, and websites use this standard to tell bots which parts of the site should be indexed. You can also specify areas you don't want crawlers to process, such as sections with duplicate content or pages still under development. Be aware that bots like malware detectors and email harvesters do not follow this standard; they scan for weaknesses in your security, and there is a real chance they will start inspecting your site from exactly the areas you don't want indexed.
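For example, a site might publish a robots.txt like the following (the sitemap URL and paths are placeholders, not values from this tool):

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```

Here every crawler ("User-agent: *") may index the site except the /admin/ and /tmp/ directories, and well-behaved bots are asked to wait 10 seconds between requests.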

A complete robots.txt file starts with a "User-agent" line, and below it you can write directives like "Allow", "Disallow", "Crawl-delay", and so on. Written by hand this can take a lot of time, since a single file may contain many lines of directives. If you want to exclude a page, you write "Disallow:" followed by the path you don't want bots to visit; the same pattern applies to the "Allow" directive. And if you think that is all there is to a robots.txt file, it isn't that simple: one wrong line can remove your page from the indexation queue. So it is better to leave the task to the pros; let our Robots.txt generator take care of the file for you.
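To illustrate the kind of logic behind such a generator, here is a minimal sketch that assembles the directives described above into a robots.txt string. The function name and its options are illustrative assumptions, not this tool's actual code:

```python
def generate_robots_txt(default_allow=True, crawl_delay=None,
                        sitemap=None, disallowed_dirs=()):
    """Build a robots.txt string from a few common options (sketch)."""
    lines = ["User-agent: *"]          # rules apply to all crawlers
    if not default_allow:
        lines.append("Disallow: /")    # refuse everything by default
    for path in disallowed_dirs:
        # restricted paths are relative to root and end with a "/"
        lines.append(f"Disallow: {path}")
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    if sitemap:                        # omit the line if no sitemap exists
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"

print(generate_robots_txt(crawl_delay=10,
                          sitemap="https://example.com/sitemap.xml",
                          disallowed_dirs=["/cgi-bin/", "/tmp/"]))
```

Running it prints a complete file ready to be saved as robots.txt in the site root.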