RE: WebVulnCrawl searching excluded directories for hackable web servers
From: Michael Scheidell (scheidell@secnap.net)
Date: Wed Mar 29 2006 - 06:51:19 CST
Just a quick follow-up and clarification:
> -----Original Message-----
> From: Michael Scheidell
> Sent: Wednesday, March 15, 2006 8:38 AM
> To: bugtraq@securityfocus.com
> Subject: WebVulnCrawl searching excluded directories for
> hackable web servers
> What he is doing is a violation of the RFC's (governing
> robots.txt.. Yes, hackers do that also)
To be precise: an RFC for robots.txt was proposed and reviewed in 1996, but it was never adopted.
> The robots.txt file is NOT AN ACCESS CONTROL LIST, and SHOULD
> NOT BE USED TO 'HIDE' DIRECTORIES. ALL DIRECTORIES SHOULD BE
> PROTECTED AGAINST Directory listing.
Someone mentioned that sometimes you do want directory listings.
My point should have been: turn off directory listing for any
directory you don't want listed.
(I don't know why you would put such directories in robots.txt in the
first place.)
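For example, on Apache httpd, directory listings can be disabled with a one-line option (a sketch only; the path below is a placeholder, and other servers have their own equivalents):

```apache
# Disable automatic directory index pages (Apache httpd).
# A request for a directory with no index file then returns
# 403 Forbidden instead of a listing of its contents.
<Directory "/var/www/html">
    Options -Indexes
</Directory>
```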
The WebVuln blog stated that he was only hitting .com sites.
I have evidence that he has moved on to .org sites and, in fact, has
hit a US government site as well.
I would hope that the US government's IT security folks know not to use
robots.txt as an ACL, but the web folks aren't always security folks
(web applications themselves are sometimes prone to SQL injection, XSS
attacks, and PHP coding errors), and since there is a large gap between
applications and web development, the chance of accidentally gathering
information that should not be gathered is huge.
Every security person should review the robots.txt file on their own web
site for what it reveals to an attacker.
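A review like that can be as simple as listing every Disallow entry, since each one is a path the site's owner has effectively advertised. Here is a quick sketch (not an official tool; the site URL is a placeholder):

```python
# Sketch: audit the paths a site's robots.txt exposes.
# Each Disallow entry is a hint to an attacker about what exists.
from urllib.parse import urljoin
from urllib.request import urlopen

def disallowed_paths(robots_txt: str) -> list[str]:
    """Return the paths listed in Disallow directives."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split('#', 1)[0].strip()   # drop trailing comments
        if line.lower().startswith('disallow:'):
            path = line.split(':', 1)[1].strip()
            if path:                           # 'Disallow:' alone allows everything
                paths.append(path)
    return paths

if __name__ == '__main__':
    site = 'http://www.example.com/'           # placeholder site
    with urlopen(urljoin(site, '/robots.txt')) as resp:
        for path in disallowed_paths(resp.read().decode('utf-8', 'replace')):
            print(path)
```

If any path printed by this script would be damaging in an attacker's hands, it needs real access control, not a robots.txt entry.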
> Further, dshield shows them portscanning the net also,
> looking for unpublished information on unpublished servers.
So does mynetwatchman:
Michael Scheidell, CTO
561-999-5000, ext 1131
SECNAP Network Security Corporation