Preeti Tripathi

Digital Marketing
2 years ago

72. Why Is A Robots.txt File Used?

Abhishek Mishra

2 years ago

A robots.txt file tells search engine crawlers which URLs they can access on your site. It is used mainly to manage crawler traffic and avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a page out of search results, use a noindex directive or password-protect the page instead.
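As an illustration, a minimal robots.txt (placed at the site root, e.g. https://example.com/robots.txt) might look like this; the paths shown are hypothetical examples, not required values:

```
# Apply these rules to all crawlers
User-agent: *
# Ask crawlers not to fetch URLs under /admin/
Disallow: /admin/
# Allow everything else (explicit, though it is the default)
Allow: /

# Optional: point crawlers to the sitemap
Sitemap: https://example.com/sitemap.xml
```

Note that Disallow only discourages crawling; a disallowed URL can still appear in search results if other pages link to it, which is why noindex is the right tool for de-indexing.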
