What is Robots.txt?

Robots.txt is a text file that is uploaded to the root of a website. Its main task is to instruct search engine crawlers which parts of the site they may visit and which they may not.
 
Robots.txt is a plain text file that helps the top search engines crawl and index your pages more efficiently by telling their bots where they should and should not go.
 
It is great when search engines frequently visit your site and index your content, but there are often cases when indexing parts of your online content is not what you want. For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling, otherwise you risk a duplicate content penalty.
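As an illustration, the printing versions could be blocked with a couple of lines in robots.txt (the /print/ folder here is just a placeholder; use whatever path your own print pages actually live under):

User-agent: *
Disallow: /print/

The asterisk means the rule applies to every crawler, and the Disallow line tells them not to fetch anything under that folder.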
 
Hi everyone,
A file which sits in the root of a site and tells search engines which files not to crawl. Some search engines will still list your URLs as URL-only listings even if you block them using a robots.txt file. :)

Technically perfect answer, amisha. It says a lot in just two lines.
 
We make use of the robots.txt file to tell search engine bots which pages we don't want them to crawl.
 
The robots.txt file is set up by programmers as part of a site's configuration to control how Google's bots and other search engine crawlers access the content.
 
Robots.txt is a text file which we upload to the root of our website to tell the various search engines which parts of the site we don't want them to crawl or index.
It takes the following format:
User-agent: [name of the search engine spider]
Disallow: [path on our website]
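For example, a robots.txt that keeps every bot out of an admin area while letting Googlebot crawl everything else could look like this (the /admin/ folder and the sitemap URL are only placeholders for your own site):

User-agent: *
Disallow: /admin/

User-agent: Googlebot
Disallow:

Sitemap: https://www.example.com/sitemap.xml

An empty Disallow line means nothing is blocked for that bot, and the optional Sitemap line points crawlers to your XML sitemap.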
 
What is robots.txt?
>It is a plain text file placed in the root directory of a website (not an HTML tag in the page source) which directs search engine spiders which files to crawl and which to skip.
 