Robots.txt Generator


Default - All Robots are: Allowed
Crawl-Delay: No Delay (optionally 5 to 120 seconds)
Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
Restricted Directories: (the path is relative to root and must contain a trailing slash "/")

Now create a 'robots.txt' file in your root directory, copy the generated text above, and paste it into that file.
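
For example, if your site were served from the hypothetical domain www.example.com, the finished file would need to be reachable at:

    https://www.example.com/robots.txt

Crawlers only request the file from this exact root location; a robots.txt placed in a subdirectory is ignored.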


About Robots.txt Generator

This is an easy-to-use robots.txt file generator with instructions for beginners: generate the file contents, then copy the text into a 'robots.txt' file in your site's root directory.

Robots.txt Generator produces a file that works in much the opposite way to a sitemap: a sitemap indicates the pages to be included, while robots.txt indicates the pages to be excluded, which is why correct robots.txt syntax is significant for any website. Whenever a search engine crawls a website, it first looks for the robots.txt file at the domain root. Once found, the crawler reads the file and identifies the files and directories that are blocked.
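
As a rough sketch, the file a crawler finds at the root might look like this (the directory names and domain here are only placeholders):

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/
    Sitemap: https://www.example.com/sitemap.xml

A compliant bot matches itself against the User-agent line and skips any path listed under a Disallow directive.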

The robots.txt generator tool lets popular search engines like Google, Bing, and Yahoo crawl every part of your website. If there are areas you wish to exclude, simply add them to this file and upload it to your root directory.
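
For instance, an empty Disallow value grants access to everything, so excluding areas is just a matter of listing their paths (shown here with placeholder directories):

    User-agent: *
    Disallow: /admin/
    Disallow: /drafts/

All other parts of the site remain open to every crawler.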

Why is the Robots.txt generator tool important?

The robots.txt generator is an important and highly practical tool for improving your site's ranking and visibility. Before anything else, you should understand what a robots.txt file does. With this free tool, you can create a brand-new robots.txt file or edit the one already on your website.

It is a useful tool that has made the lives of many webmasters easier by helping them make their websites Googlebot-friendly. This robots.txt file generator performs the otherwise tedious task in no time, and it is absolutely free.

Create custom user agent directives

In our robots.txt generator, you can specify directives for Google and several other search engines. To specify alternative directives for one crawler, click the User Agent list box (showing * by default) and select the bot.

When you click Add directive, a custom section is added to the list, containing all of the generic directives plus the new custom directive. To change a generic Disallow directive into an Allow directive for the custom user agent, create a new Allow directive for that specific user agent and content; the matching Disallow directive is then removed for the custom user agent.
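
As an illustration of this behavior (the bot name and path are only examples), the resulting file might pair a generic block with a custom one:

    User-agent: *
    Disallow: /media/

    User-agent: Googlebot-Image
    Allow: /media/

Here every crawler is kept out of /media/ except Googlebot-Image, whose custom Allow directive replaces the generic Disallow.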

How to Use Our Robots.txt Generator Tool?

Using our tool, you can generate a robots.txt file for your website by following a few simple steps:

  • By default, all robots are allowed to access your site’s files; you can choose which robots to allow and which to refuse.
  • Choose a crawl-delay, which tells crawlers how long to wait between requests; you can pick a preferred delay of 5 to 120 seconds. It is set to ‘No Delay’ by default.
  • If you already have a sitemap for your website, paste its URL into the text box; otherwise, leave it blank.
  • From the list of search robots, select the ones you want to crawl your site and refuse the ones you don’t want accessing your files.
  • The last step is to restrict directories. The path must contain a trailing slash "/", as the path is relative to the root.
  • Finally, once you have generated a Googlebot-friendly robots.txt file with our Robots.txt Generator tool, upload it to the root directory of your website (see the sample file after this list).
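
Putting the steps together, a file generated with a 10-second crawl-delay, a sitemap, one refused robot, and one restricted directory might look like the following (the domain and paths are placeholders; note that Google ignores Crawl-delay, while some engines such as Bing have honored it):

    User-agent: *
    Crawl-delay: 10
    Disallow: /cgi-bin/

    User-agent: Baiduspider
    Disallow: /

    Sitemap: https://www.example.com/sitemap.xml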

If you wish to explore our friendly tool before relying on it, feel free to play with it and generate a sample robots.txt file.