
robots.txt

File that tells search engine crawlers not to crawl a specified page or directory.

Disallow all search engine crawlers from the specified directories, /cgi-bin and /images:

User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
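
The same rules work for a single page, or for the whole site. A minimal sketch, using a hypothetical page path (the # line is just a comment):

User-agent: *
Disallow: /secret-page.html
# "Disallow: /" would ask crawlers to skip the entire site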

by KFM December 24, 2004



robots.txt

It's a file that tells Google whether or not to crawl a page and show it in the search results.

Did you edit the robots.txt file?
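
A minimal sketch of the kind of file the definition describes, assuming a hypothetical /drafts/ directory: Googlebot is asked to stay out of it, while every other crawler is allowed everywhere.

User-agent: Googlebot
Disallow: /drafts/

User-agent: *
Disallow: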

by Mr.Crybik December 4, 2019