Free online robots.txt file generator for your website

Free SEO tools

Robots.txt Generator


Default - All Robots are:
Crawl-Delay:
Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
Restricted Directories: The path is relative to root and must contain a trailing slash "/"



Now, create a 'robots.txt' file in your site's root directory. Copy the text above and paste it into the file.
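For example, with all robots allowed, a 10-second crawl delay, one restricted directory, and a sitemap, the generated file might look like this sketch (the domain and directory are placeholders):

    User-agent: *
    Disallow: /cgi-bin/
    Crawl-delay: 10
    Sitemap: https://www.example.com/sitemap.xml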


About Robots.txt Generator

When search engines crawl a site, they first look for a robots.txt file at the domain's root. If it is found, they read its list of directives to see which directories and files, if any, are blocked from crawling. This file can be created with our robots.txt file generator. When you use a robots.txt file, Google and other search engines can determine which pages of your site should be excluded and which should not. In other words, the file created by the robots.txt generator is the opposite of a sitemap, which indicates which pages to include.

 

The robots.txt file tells search engines which subpages of your site to index and which not to.

 

  • Batch settings for all commands in the generator
  • You can create robots.txt both as text and as a downloadable file
  • All major search robots are available
  • Individual parameter settings in the generator

The generator simplifies your work by requesting all the critical information. Below are step-by-step instructions for using the robots.txt generator:

In the first field, spiders can be wholly excluded or allowed (recommended). If you prohibit all search robots from indexing your site, all other settings in the generator become superfluous.
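As a sketch, the two extremes translate into the following directives (an empty Disallow allows everything, while "/" blocks the whole site):

    # Allow all robots (recommended)
    User-agent: *
    Disallow:

    # Exclude all robots
    User-agent: *
    Disallow: /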

 

Select whether there should be a crawl delay. If so, the corresponding crawler may only visit your site every 5, 10, 20, 60, or 120 seconds.
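The delay is written as a single directive, for example a 10-second delay as shown below. Note that Crawl-delay is non-standard: some crawlers such as Bing honor it, while Google ignores it.

    User-agent: *
    Crawl-delay: 10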

Place the URL of your XML sitemap in the third box. Leave this field empty if you don't have a sitemap, or use our free XML sitemap generator to create one.
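The sitemap reference ends up as a single line in robots.txt; the URL below is a placeholder for your own sitemap address:

    Sitemap: https://www.example.com/sitemap.xml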

 

The generator provides a list of all standard search engines. If you are only interested in particular search engines, you can allow or deny access for each of them individually.
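Per-robot rules are expressed with separate User-agent groups; for instance, the following sketch allows Google's crawler (Googlebot) while blocking Baidu's (Baiduspider):

    # Allow Google
    User-agent: Googlebot
    Disallow:

    # Block Baidu
    User-agent: Baiduspider
    Disallow: /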

 

Finally, specify all the pages and folders that should not be indexed. Make sure each path has its own field and ends with a slash. Should you make a mistake or decide to follow a different course of action, you can modify your robots.txt entries at any moment, or start anew by clicking the "Reset" button at the very bottom of the screen.
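Each restricted directory becomes its own Disallow line; the directory names below are hypothetical examples:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/
    Disallow: /private/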

Create a file for free

After you enter all the information into the tool, the generator will create your robots.txt file. You can choose one of two options:

 

After clicking the green button labeled "Create robots.txt," the file is generated in the white field below the robots.txt controls. You can review and copy it there.

 

The red button labeled "Create and save as robots.txt" leads to the same result. However, you will also receive a ready-made file that you can save directly to your hard disk.

Storing robots.txt in the root directory via FTP

After creating the robots.txt file, it needs to be uploaded to your server. This is usually straightforward: save the file as robots.txt and upload it via FTP or your hosting control panel to the domain's root directory.
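Once uploaded, the file must be reachable directly under the domain root and not in a subdirectory; with a placeholder domain, that means:

    https://www.example.com/robots.txt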

 

Once this is done, each crawler will read this file first; it tells the crawler whether it is allowed to visit and index your site or individual subpages. If you have any questions or problems storing robots.txt, please do not hesitate to contact us.

Frequently Asked Questions about the robots.txt file:

 

What is robots.txt for?

The robots.txt file is used to control the behavior of search engine robots. If, for example, you do not want bots to have access to a particular subpage, you block that subpage for robots in robots.txt with the "Disallow" directive.
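For example, blocking a single hypothetical subpage for all robots looks like this:

    User-agent: *
    Disallow: /internal-page.html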

 

Where can I find the robots.txt file?

The robots.txt file sits at the first level of your site's tree. You can call it up by typing www.yourdomain.com/robots.txt into the address bar of your browser. You can also find the file via FTP access from your hosting provider.

 

Is it possible to create a robots.txt file on your own?

Yes. Using our free robots.txt generator, you can easily create and configure the file.

 

How do I check if the file is correct?

You can check this by reviewing all the pages mentioned in the robots.txt file and deciding, for each one, whether it should be included in the index or not. Pages that may be indexed belong in the "Allow" rules, and only those areas of your site that should stay out of the index should appear in the "Disallow" rules.
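A combined rule set using both directives might look like the following sketch (the paths are placeholders); Allow is typically used to re-open a specific path inside an otherwise blocked directory and is supported by major crawlers such as Google and Bing:

    User-agent: *
    Disallow: /private/
    Allow: /private/public-report.html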

 

Is the robots.txt file mandatory?

No. The robots.txt file is optional, but it offers some advantages for search engine optimization.

 

Why should the XML sitemap be referenced in the robots.txt file?

When visiting your site, a search engine always comes across the robots.txt file first. If you have referenced your sitemap.xml there, there is a high probability that your pages will be found and crawled without errors.