@Stake Security Study: .NET 1.1 vs. WebSphere 5.0
I like competitive studies. I'm usually more interested in the methodology than the outcome. The methodology acts as a blueprint for what's important in a particular problem space.
One of my favorite studies was the original @Stake study comparing .NET 1.1 and IBM WebSphere security, not just because our body of guidance made a direct and substantial difference in the outcome, but because @Stake used a comprehensive set of categories and an evaluation criteria matrix that demonstrated a lot of depth.
Because the information from the original report can be difficult to find and distill, I'm summarizing it below:
Overview of Report
In June 2003, @Stake, Inc., an independent security consulting firm, released the results of a Microsoft-commissioned study that found Microsoft's .NET platform to be superior to IBM's WebSphere for secure application development and deployment. @stake performed an extensive analysis comparing security in the .NET Framework 1.1, running on Windows Server 2003, to IBM WebSphere 5.0, running on both Red Hat Linux Advanced Server 2.1 and a leading commercial distribution of Unix.
Findings
Overall, @stake found that:
- Both platforms provide infrastructure and effective tools for creating and deploying secure applications
- The .NET Framework 1.1 running on Windows Server 2003 scored slightly better with respect to conformance to security best practices
- The Microsoft solution scored even higher with respect to the ease with which developers and administrators can implement secure solutions
Approach
@stake evaluated the level of effort required for developers and system administrators to create and deploy solutions that implement security best practices, and to reduce or eliminate most common attack surfaces.
Evaluation Criteria
- Best practice compliance. For a given analysis topic, to what degree did the platform permit implementation of best practices?
- Implementation complexity. How difficult was it for the developer to implement the desired feature?
- Documentation and examples. How appropriate was the documentation?
- Implementor competence. How skilled did the developer need to be in order to implement the security feature?
- Time to implement. How long did it take to implement the desired security feature or behavior?
Ratings for the Evaluation Criteria
- Best Practice Compliance Ratings
- Not possible
- Developer implement
- Developer extend
- Wizard
- Transparent
- Implementation Complexity Ratings
- Large amount of code
- Medium amount of code
- Small amount of code
- Wizard +
- Wizard
- Quality of Documentation and Sample Code Ratings
- Incorrect or Insecure
- Vague or Incomplete
- Adequate
- Suitable
- Best Practice Documentation
- Developer/Administrator Competence Ratings
- Expert (5+ years of experience)
- Expert/intermediate (3-5 years of experience)
- Intermediate
- Intermediate/novice
- Novice (0-1 years of experience)
- Time to Implement
- High (more than 4 hours)
- Medium to High (1-4 hours)
- Medium (16-60 minutes)
- Low to Medium (6-15 minutes)
- Low (5 minutes or less)
Scorecard Categories
The scorecard was organized into application server, host and operating system, and Web server categories. Each category was divided into smaller subcategories, and each subcategory was scored against the evaluation criteria (best practice compliance, implementation complexity, quality of documentation, developer competence, and time to implement).
Application Server Categories
- Application Logging Services
- Exception Management
- Logging Privileges
- Log Management
- Authentication and Access Control
- Login Management
- Role Based Access Control
- Web Server Integration
- Communications
- Communication Security
- Network Accessible Services
- Cryptography
- Cryptographic Hashing
- Encryption Algorithms
- Key Generation
- Random Number Generation
- Secrets Storage
- XML Cryptography
- Database Access
- Database Pool Connection Encryption
- Data Query Safety (see the sketch after this list)
- Data Validation
- Common Validators
- Data Sanitization
- Negative Data Validation
- Output Filtering
- Positive Data Validation
- Type Checking
- Information Disclosure
- Error Handling
- Stack Traces and Debugging
- Runtime Container Security
- Code Security
- Runtime Account Privileges
- Web Services
- Credentials Mapping
- SOAP Router Data Validation
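To make the ratings concrete, here is a minimal C# sketch of the kind of application server code @stake was rating in categories such as Role Based Access Control, Data Query Safety, and Exception Management. It is illustrative only: the table, role name, connection string, and event source are hypothetical rather than taken from the report, and it targets the .NET Framework 1.1 APIs of the time.

```csharp
// Illustrative only: table, role, connection string, and event source are hypothetical.
using System;
using System.Data;
using System.Data.SqlClient;
using System.Diagnostics;
using System.Security.Permissions;

public class OrderData
{
    // Role Based Access Control: demand a role before the method runs.
    [PrincipalPermission(SecurityAction.Demand, Role = "OrderManager")]
    public DataSet GetOrdersForCustomer(string connectionString, int customerId)
    {
        // Data Query Safety: a parameterized command instead of string
        // concatenation, so user input cannot change the SQL statement.
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            SqlCommand command = new SqlCommand(
                "SELECT OrderId, OrderDate FROM Orders WHERE CustomerId = @CustomerId",
                connection);
            command.Parameters.Add("@CustomerId", SqlDbType.Int).Value = customerId;

            DataSet orders = new DataSet();
            try
            {
                SqlDataAdapter adapter = new SqlDataAdapter(command);
                adapter.Fill(orders); // Fill opens and closes the connection as needed
            }
            catch (SqlException ex)
            {
                // Exception Management: log the details privately and return a
                // generic error, rather than echoing stack traces to the caller.
                EventLog.WriteEntry("OrderApp", ex.ToString(), EventLogEntryType.Error);
                throw new ApplicationException("Order lookup failed.");
            }
            return orders;
        }
    }
}
```

The interesting part of the study was not whether such code is possible on either platform (it is), but how much code, documentation, skill, and time each platform required to get there.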
Host and Operating System Categories
- IP Stack Hardening
- Protocol Settings
- Service Minimization
- Installed Packages
- Network Services
Web Server Categories
- Architecture
- Security Partitioning
- Authentication
- Authentication Input Validation
- Authentication Methods
- Credential Handling
- Digital Certificates
- External Authentication
- Platform Integrated Authentication
- Communication Security
- Session Encryption
- Information Disclosure
- Error Messages and Exception Handling
- Logging
- URL Content Protection
- Session Management (see the sketch after this list)
- Cookie Handling
- Session Identifier
- Session Lifetime
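As a rough illustration of the Web server categories, here is a short ASP.NET sketch for Cookie Handling and Session Lifetime. Again, this is hypothetical rather than from the report: the cookie name, value, and timeout are assumptions, and it relies on the HttpOnly cookie property that was added in ASP.NET 1.1.

```csharp
// Illustrative only: the cookie name, value, and timeout are hypothetical.
using System;
using System.Web;
using System.Web.UI;

public class AccountPage : Page
{
    private void IssuePreferenceCookie()
    {
        // Cookie Handling: send the cookie only over HTTPS and keep it
        // out of reach of client-side script (HttpOnly is new in ASP.NET 1.1).
        HttpCookie cookie = new HttpCookie("Preferences", "compact");
        cookie.Secure = true;
        cookie.HttpOnly = true;
        cookie.Expires = DateTime.Now.AddMinutes(20);
        Response.Cookies.Add(cookie);
    }

    private void SignOut()
    {
        // Session Lifetime: keep the idle timeout short and end the
        // session explicitly when the user signs out.
        Session.Timeout = 20; // minutes
        Session.Abandon();
    }
}
```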
More Information
For more information on the original @stake report, see the eWeek.com article ".Net, WebSphere Security Tested."
Comments
Anonymous
April 01, 2006
It was gratifying to see that the information model we used for vulnerability categories matched closely with @Stake’s frame of evaluation. I still remember the innumerable iterations we went through before we arrived at that frame. Another important learning that came out of that exercise was capturing the security principles and application scenarios. Their technology-agnostic, not-bound-to-a-feature nature is worth tattooing – speaking of tattoos … we did create & distribute tattoos of the security vulnerability categories as part of our Ship Party ;-)
Anonymous
December 24, 2007
Book building is art and science. I've built a few books over the years at patterns & practices.