From: Talisker (Talisker@NETWORKINTRUSION.CO.UK)
Date: Wed Jan 10 2001 - 12:36:24 CST


    Carv

    Yep, found it :o) Thanks ;o)

    This kind of evaluation is just what we need to see how products compare
    in a realistic environment. All too often we see magazines (I won't name
    them) that evaluate products by comparing host IDSs, network IDSs,
    vulnerability scanners, etc. simultaneously. Because these products have
    different roles, they will obviously produce vastly different results.

    It was good to see Marc from eEye put across his reply to the eval. Are
    there any other vendors wanting to do the same without belittling their
    competitors? What I'm looking for is whether they felt the eval was fair,
    in what ways they have updated their products since the eval, and what
    their plans are for the near future.

    I would also agree with Marc about listing the version number of the
    evaluated products, including the update number where applicable.

    Thanks Again Greg

    Andy
    http://www.networkintrusion.co.uk
    Talisker's Network Security Tools List
                        '''
                     (0 0)
      ----oOO----(_)----------
      | The geek shall |
      | Inherit the earth |
      -----------------oOO----
                   |__|__|
                      || ||
                  ooO Ooo
    talisker@networkintrusion.co.uk

    The opinions contained within this transmission are entirely my own, and do
    not necessarily reflect those of my employer.

    ----- Original Message -----
    From: "Marc Maiffret" <marc@eeye.com>
    To: <FOCUS-MS@SECURITYFOCUS.COM>
    Sent: Tuesday, January 09, 2001 5:36 PM
    Subject: Re: NetworkComputing testing of vulnerability scanners

    > When Network Computing set out to write their security scanner review, the
    > current released version of Retina was 2.0. We did have the Retina 2.5 beta
    > on our site for about two months before their test date; however, when we
    > received the invitation, our marketing department shipped them Retina 2.0
    > because it was the current CD-burned release version.
    >
    > Since their test date we have released Retina 3.0. Retina 3.0 does many
    > more checks than Retina 2.0 (including a lot of remote Unix checks) and is
    > much, much faster. As it ships now, Retina 3.0 covers 16 out of the 17
    > vulnerabilities listed in the Network Computing article; the last one we
    > are adding by the end of the week. While we do think that those 17
    > vulnerabilities were important ones, we definitely do not agree that a
    > product review should be mainly (70%) based on 17 vulnerabilities.
    >
    > All in all, there is not much to say about the review... the review was
    > dead on about how Retina 2.0 rated. However, Retina 3.0 is a very different
    > product and would do much better. Next time, though, we wish that Network
    > Computing would actually put version numbers on their product reviews.
    >
    > To download Retina 3.0 go to:
    > http://www.eEye.com/retina/
    >
    > Please feel free to send any feedback to the Retina product development
    > team at retina@eeye.com.
    >
    > Signed,
    > Marc Maiffret
    > Chief Hacking Officer
    > eCompany / eEye
    > T.949.349.9062
    > F.949.349.9538
    > http://eEye.com
    >
    >
    > | -----Original Message-----
    > | From: Focus on Microsoft Mailing List
    > | [mailto:FOCUS-MS@SECURITYFOCUS.COM] On Behalf Of H Carvey
    > | Sent: Tuesday, January 09, 2001 5:46 PM
    > | To: FOCUS-MS@SECURITYFOCUS.COM
    > | Subject: NetworkComputing testing of vulnerability scanners
    > |
    > |
    > | I found this article on the PacketStorm site,
    > | thought others might be interested...
    > |
    > | http://www.nwc.com/1201/1201f1b1.html
    > |
    > | The article is fairly thorough, explicitly
    > | detailing the testing methodology, how the
    > | scanners were set up, etc.
    > |
    > | One thing was really concerning... not only
    > | did Retina score the lowest of all (the testing
    > | methodology was explicit: 5 hosts were set
    > | up with 17 vulnerabilities, most of which came
    > | from the SANS list), but here's a quote:
    > |
    > | "Ironically, we discovered that Retina didn't
    > | deliver on the IIS vulnerabilities that eEye's
    > | own research team discovered. If eEye
    > | focused more on vulnerability scanning than
    > | the GUI and CHAM, Retina might be better
    > | equipped to match some of the other
    > | products."
    > |
    > | Does anyone out there have any experiences
    > | that directly counter NWC's report? I'd
    > | like to think that Retina would have done
    > | better... the only positive comments from NWC
    > | regard the GUI, while they are really negative
    > | with regard to CHAM and vulnerability scanning.
    > |
    >