Which file lists the directories and files to be hidden from web crawlers?

Prepare for the Certified Ethical Hacker Version 11 Exam. Study with comprehensive questions and explanations. Equip yourself with the skills needed for success!

Multiple Choice

Which file lists the directories and files to be hidden from web crawlers?

A. robots.txt
B. NCollector Studio
C. Acunetix WVS
D. Hashcat

Explanation:
The concept being tested is how web crawlers are guided about what to crawl or ignore. The file that lists directories and files to be hidden from web crawlers is robots.txt. It sits in the root of a website and uses directives like User-agent and Disallow to tell compliant crawlers which paths should not be fetched. Remember, this file is publicly accessible and only serves as a hint to crawlers; it doesn't provide real security. In fact, attackers often read robots.txt during reconnaissance precisely because it can reveal paths the site owner wanted hidden. For true protection, rely on proper authentication and access controls. The other options are tools unrelated to instructing crawlers: NCollector Studio is a website-copying and data collection tool, Acunetix WVS is a web vulnerability scanner, and Hashcat is a password-cracking tool.
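The User-agent and Disallow directives described above can be checked programmatically. The sketch below, using Python's standard-library `urllib.robotparser`, parses a hypothetical robots.txt (the paths `/admin/` and `/private/` are illustrative, not from any real site) and shows how a compliant crawler would decide whether a path may be fetched:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
# "User-agent: *" applies the rules to all crawlers;
# each "Disallow" names a path prefix that should not be fetched.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A compliant crawler consults the parsed rules before fetching a path.
print(parser.can_fetch("*", "/admin/secret.html"))  # False - disallowed prefix
print(parser.can_fetch("*", "/index.html"))         # True - no rule blocks it
```

Note that `can_fetch` only reports what the file *requests*; nothing stops a non-compliant client from fetching `/admin/secret.html` anyway, which is exactly why robots.txt is not a security control.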

