Preeti Tripathi

Digital Marketing
2 years ago

57. What Is robots.txt & Why Do We Use It?

Abhishek Mishra

2 years ago

A robots.txt file tells search engine crawlers which URLs the crawler can access on your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, block indexing with noindex or password-protect the page.
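As a sketch of how a well-behaved crawler applies these rules, Python's standard `urllib.robotparser` module can evaluate a robots.txt file and answer "may I fetch this URL?". The domain and paths below are made up for illustration.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: example.com and the paths are illustrative.
rules = """User-agent: *
Disallow: /admin/
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A compliant crawler checks each URL against the rules before requesting it.
print(rp.can_fetch("*", "https://example.com/admin/secret.html"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post.html"))     # True
```

Note that this is purely advisory: the file only asks crawlers not to fetch certain paths, which is why it cannot reliably keep a page out of Google's index.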
