Re: Preventing cross site scripting
From: Tim Greer (chatmastercharter.net)
Date: Fri Jun 20 2003 - 11:08:18 CDT
----- Original Message -----
From: "Laurian Gridinoc" <laurgrapefruitdesign.com>
To: "Wojciech Purczynski" <cliphisec.pl>
Cc: "Tim Greer" <chatmastercharter.net>; <webappsecsecurityfocus.com>
Sent: Friday, June 20, 2003 9:21 AM
Subject: Re: Preventing cross site scripting
> The most elegant way to control HTML input is to parse it into a DOM
> tree and control it from there; I use Tidy extensively to `correct' the
> input into XHTML, and then a simple XSL transformation lets me filter or
> alter whatever elements I need.
> If you want to drop the `object' element, just write `<xsl:template
> match="object" />'; if you want to ignore it (preserving the content it
> wraps), write `<xsl:template match="object"><xsl:apply-templates
> /></xsl:template>'; and to copy everything else:
> <xsl:template match="*|@*|text()|comment()">
> <xsl:apply-templates select="*|@*|text()|comment()" />
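The drop/unwrap/copy policy described in the quote can also be sketched outside XSLT, directly against a DOM tree. Below is a minimal Python sketch using the standard-library minidom; the tag sets and the event-handler attribute deny-list are illustrative assumptions, not anything from the original post:

```python
from xml.dom.minidom import parseString

DROP = {"script"}      # remove the element and everything it contains
UNWRAP = {"object"}    # remove the element but keep its children

def filter_node(node, doc):
    """Recursively copy a DOM node, applying the drop/unwrap policy."""
    if node.nodeType == node.TEXT_NODE:
        return doc.createTextNode(node.data)
    if node.nodeType != node.ELEMENT_NODE:
        return None  # this sketch discards comments, PIs, etc.
    if node.tagName in DROP:
        return None
    if node.tagName in UNWRAP:
        # Replace the element with a fragment holding its filtered children.
        frag = doc.createDocumentFragment()
        for child in node.childNodes:
            out = filter_node(child, doc)
            if out is not None:
                frag.appendChild(out)
        return frag
    clone = doc.createElement(node.tagName)
    # Hypothetical attribute policy: drop a few inline event handlers.
    for name, value in node.attributes.items():
        if name.lower() not in ("onclick", "onload", "onerror"):
            clone.setAttribute(name, value)
    for child in node.childNodes:
        out = filter_node(child, doc)
        if out is not None:
            clone.appendChild(out)
    return clone

def sanitize(xhtml):
    """Filter a well-formed XHTML fragment (Tidy would produce one)."""
    doc = parseString(xhtml)
    return filter_node(doc.documentElement, doc).toxml()

print(sanitize('<div><object data="x">kept text</object>'
               '<a href="/ok" onclick="evil()">link</a></div>'))
# → <div>kept text<a href="/ok">link</a></div>
```

As in the XSLT version, the policy lives in small per-element rules rather than in one fragile pattern over the raw string.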
Can you give a real-world example of a URL link/anchor tag, showing how you
would allow or disallow it from becoming active based on specific variables
in order to prevent an attack, and why that would be a superior method to a
regex example such as the one I offered?
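For concreteness, a regex check of the kind being referred to might test an anchor's href value against a scheme allow-list. This is a hedged sketch, not the poster's actual filter: the allowed-scheme set and the pattern are assumptions, and a real filter would have to normalise entity- and URL-encoding before matching:

```python
import re

# Allow http(s) URLs and site-relative paths; reject everything else
# (javascript:, data:, vbscript:, and similar scheme tricks).
SAFE_HREF = re.compile(r'^(?:https?://[\w.-]+(?:/\S*)?|/[^\s]*)$',
                       re.IGNORECASE)

def href_allowed(href):
    # NOTE: decoding steps (HTML entities, %-escapes) are omitted here;
    # the value must be normalised before this check in real use.
    return bool(SAFE_HREF.match(href.strip()))

print(href_allowed("http://example.com/page"))  # → True
print(href_allowed("/local/path"))              # → True
print(href_allowed("javascript:alert(1)"))      # → False
```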
> I consider filtering HTML as if it were a mere string (i.e., using regexps
> or simple replace methods) pretty uncertain in its results, and not quite
> programming :) -- it's a language, it has a grammar, so use a parser.
I don't see how anything would be better than a regex, but everyone has
their preferences. TIMTOWTDI, I'm sure. You think regexes aren't quite
programming? :)
Tim Greer chatmastercharter.net
Server administration, security, programming, consulting.