Java Pet Store is an example application in the "J2EE Blueprints" series. It documents best practices, design patterns, and architectural ideas for J2EE applications. Recently, gotdotnet.com, a Microsoft-sponsored Web site, implemented the same application in C# and .NET. Microsoft claims that, compared to Java Pet Store, the .NET version requires one-third the lines of code (LOCs), delivers 28-times-faster average response times (for 450 concurrent users), uses one-sixth the CPU, and scales much better as the number of users increases. Microsoft also argues that the architecture of .NET is superior. (Sun has provided a partial response.)
I did a project this year that started with the Pet Store as a springboard, so I decided to examine these claims. When done properly, implementing the same application with two different technologies can be a good way to compare the technologies. In this case, however, there are a number of problems with Microsoft's claims.
In this article, I show that experimental flaws invalidate the performance results and that the code comparisons don't address developer productivity or framework superiority. I also discuss how the presentation-tier technology in .NET has some advantages over the J2EE technology, which the Java community should address.
This catfight started when Oracle published a comparison of their Oracle9iAS application server against an unnamed J2EE "Application Server X." Oracle used Java Pet Store as a test application and claimed superior performance for Oracle9iAS.
Oracle's specific results aren't important for our discussion here. Note, however, that Oracle used Sun SPARC servers running Solaris in a three-tier deployment: a client tier running an Oracle proprietary client simulator, a middle tier for the business logic and page construction, and a database tier.
Also, Oracle fixed some performance bugs and modified the Pet Store code and schema to improve scalability for a large number of users. (Microsoft labels the modified application "highly tuned.") Oracle also limited the tests to those pages with better scalability characteristics.
Most importantly, Oracle admitted that the Pet Store is not designed as a performance benchmark. They used it because ECperf benchmarks were not yet available. In fact, Pet Store was designed as a learning tool, not as a performance tool; to keep the code clear, most optimizations were omitted. Furthermore, Oracle observed that their tests didn't exercise the Enterprise JavaBeans very heavily. Most of the work happens in the database and in the Data Access Objects (DAOs), Java objects that provide a database-agnostic interface to persistence (one of the design patterns emphasized in Java Pet Store). Since the EJBs are the meat of a J2EE application, useful tests must give them appropriate weight.
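To make the DAO pattern concrete, here is a minimal sketch. The names (`ItemDao`, `findItemName`, the in-memory implementation) are illustrative, not taken from the Pet Store source: callers code only against the interface, and each database vendor gets its own implementation behind it.

```java
import java.util.HashMap;
import java.util.Map;

// Database-agnostic interface: this is all the business tier sees.
interface ItemDao {
    String findItemName(String itemId);
}

// One concrete implementation per database vendor. This in-memory
// stand-in substitutes for a class that would issue vendor-specific
// JDBC calls; swapping databases means swapping this class only.
class InMemoryItemDao implements ItemDao {
    private final Map<String, String> items = new HashMap<>();

    InMemoryItemDao() {
        items.put("EST-1", "Angelfish");  // sample data
    }

    public String findItemName(String itemId) {
        return items.get(itemId);
    }
}
```

The cost of this indirection is exactly what Oracle observed: the interesting work migrates out of the EJBs and into the DAOs and the database.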
Microsoft implemented the Pet Store in C# and .NET. They did not implement the Administrator or Mail applications that are part of Java Pet Store. They were careful, however, to exclude those parts of Java Pet Store in their comparisons.
Microsoft reported the dramatic performance numbers mentioned in the introduction. Unfortunately, they made a serious mistake: rather than re-run the Oracle tests in their environment, they simply quoted Oracle's results.
They used a three-tier test bed with Intel servers, Windows 2000, and SQL Server, which is very different from the Sun/Oracle test bed. As a result, the numbers contain unknown, potentially large systematic differences in hardware, operating system, database, and tuning.
Hence, meaningful performance comparisons between J2EE and .NET can't be made here. The differences might be real, but we just can't tell. Furthermore, since J2EE and .NET are designed for large-scale, highly concurrent, high-availability applications, Pet Store tests say little about how well these frameworks support those applications. Finally, performance should be put into perspective. The real question isn't "which is faster," but "if approach X satisfies my higher-priority criteria, will it be fast enough?"
Microsoft counted the lines of code in the two applications as a rough measure of developer productivity. They argue that .NET is a more full-featured and productive environment, because the developer needs to write (and test!) less code.
Microsoft wrote a basic parsing tool that skips comments and blank lines. It looks for specific types of code in a file, such as lines of Java code in a JSP file. I used their LOC tool to reproduce their numbers. In most cases, my counts showed only small, insignificant differences.
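The counting rule is simple enough to sketch. This is my own minimal reconstruction of the idea, not Microsoft's actual tool: blank lines and comment lines are skipped, and everything else counts. (A real tool also applies per-file-type rules, such as counting only Java code inside JSPs; that is omitted here.)

```java
import java.util.List;

// Minimal LOC counter: skips blank lines, // line comments, and
// /* ... */ block comments. Lines mixing code and comments are
// handled crudely, as a basic tool would.
class LocCounter {
    static int count(List<String> lines) {
        int n = 0;
        boolean inBlock = false;
        for (String raw : lines) {
            String line = raw.trim();
            if (inBlock) {
                if (line.contains("*/")) inBlock = false;  // block comment ends
                continue;
            }
            if (line.isEmpty() || line.startsWith("//")) continue;
            if (line.startsWith("/*")) {
                if (!line.contains("*/")) inBlock = true;  // block comment opens
                continue;
            }
            n++;  // a countable line of code
        }
        return n;
    }
}
```

Even this toy version shows why LOC comparisons are fragile: every rule choice (what counts as a comment, how mixed lines are treated) shifts the totals.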
Microsoft skipped the HTML files and HTML tags in JSP and ASP files. Since designer tools are often used to generate HTML, counting this type of source only measures arbitrary differences in how tools output tags. These files are still hand-coded by many developers, however, and one of the advertised strengths of both ASP and JSP is the ability to streamline page development with server-side directives, custom tag libraries, JavaBeans, C# code, etc. Therefore, counting lines with tags or perhaps the number of tags seems appropriate, as long as the two visual designs are close.
Curiously, Microsoft did not apply the tool-generation argument to other parts of the implementations. For example, they point out that XML descriptor files for J2EE applications are often bigger and more numerous than configuration files for .NET deployments. But deployment tools generate most of this content (not to mention the fact that J2EE descriptor files can contain far more information than comparable .NET files). Why not exclude this code from the counts?
As usual, LOCs say little about productivity; the questions that actually matter lie elsewhere.
In comparing the two code bases, Microsoft argues that the .NET architecture is superior. In fact, while the two feature sets are the same, they are really comparing implementations that satisfy two different sets of "nonfunctional" requirements.
Java Pet Store, like Java and J2EE themselves, emphasizes hardware, OS, and database portability. Performance is secondary. The .NET implementation, like .NET itself, promotes a highly integrated, single-vendor solution, where performance is emphasized. In fact, these contrasting sets of "requirements" could be implemented using either framework.
This divide is best illustrated by the discussion of the middle tier. The .NET middle tier is quite small, about 700 LOCs in 10 files, while the Java Pet Store has about 5400 LOCs in 120 files.
Microsoft used SQL Server and exploited stored procedures for basic queries, while keeping business logic in the middle tier. This approach reduced the amount of code in the middle tier and increased performance. Also, Microsoft returned XML from the database, rather than rowsets, which streamlined working with the data in the middle tier. The .NET middle tier is small and fast, but coupled to SQL Server.
Java Pet Store demonstrates the worst-case scenario: managing persistence manually using Bean-Managed Persistence (BMP) and making the application support multiple databases transparently. This results in many of the Data Access Objects mentioned previously and forgoes all vendor-specific database optimizations. Code piles up, performance suffers, and the design is more complex. Many projects don't face these constraints; in my project that started with Java Pet Store, we simplified aspects of the design that were overkill for our less stringent requirements.
So, Microsoft's analysis of the middle tier is an "apples and oranges" comparison. How would Java Pet Store compare if it used Container Managed Persistence (CMP) or supported only one vendor's database and exploited available optimizations? How would .NET Pet Store compare if it supported multiple databases and was database-agnostic?
The most informative discussion in the Microsoft analysis concerns the presentation tier. .NET has a straightforward component model that connects Web page construction, form handling, and the middle tier in intuitive ways. Component packaging is intuitive, because ASP files are bundled with the corresponding C# objects. ASP.NET provides powerful Web Forms and User Controls to accelerate client development, but at the expense of client portability. Microsoft also lavishes attention on developer tools, making it easier to construct and connect these pieces.
The organization of Servlets, JSPs, and JavaBeans more closely resembles the traditional separation between static HTML pages, which go in one place, and server-side executable code, which goes in other places. The organization is less component-oriented and intuitive, although bundling complete applications into WAR and EAR files makes deployment straightforward. Also, Java Pet Store implements design patterns for event handling and flow control, because J2EE doesn't manage these things natively. With J2EE, you get more flexibility, but more work is required and the learning curve is a bit longer.
In my project, I replaced the Pet Store presentation tier with an Apache Struts approach. Besides providing useful taglibs, Struts provides a framework for managing flow control and event handling. The Struts framework provides many of the conveniences found in ASP.NET.
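The core idea Struts contributes is a front controller: a single dispatcher maps logical request paths to handler objects, instead of scattering flow control across JSPs. Here is that idea reduced to plain Java, with all names illustrative rather than actual Struts API (the real framework adds form beans, XML-configured mappings, and view forwarding on top):

```java
import java.util.HashMap;
import java.util.Map;

// A handler for one logical request; returns the name of the next view.
// (In Struts this role is played by org.apache.struts.action.Action.)
interface Action {
    String execute();
}

// Toy front controller: one dispatch point for all requests, with
// flow-control decisions centralized in the registered Actions.
class FrontController {
    private final Map<String, Action> actions = new HashMap<>();

    void register(String path, Action action) {
        actions.put(path, action);
    }

    String dispatch(String path) {
        Action action = actions.get(path);
        return (action == null) ? "error" : action.execute();
    }
}
```

Centralizing dispatch this way is what gives Struts (and ASP.NET's page model) their convenience: event handling and navigation live in one predictable place.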
So, the presentation developer can choose ASP.NET for its well-designed framework and excellent tools, but with limitations on client and server portability. Or, the developer can choose JSP and Servlets (or a Java/XML/XSLT alternative) with less tool support and a slightly longer learning curve, but with more power, flexibility, and third-party options, like Struts.
Finally, .NET has extensive caching directives that make it easier for developers to optimize page delivery. J2EE caching optimizations have largely been left up to developers and vendors. The J2EE community should provide better standard performance capabilities.
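To illustrate what "left up to developers" means in practice, here is the kind of hand-rolled fragment cache a J2EE developer might write where ASP.NET offers a declarative caching directive. This is a sketch in modern Java with invented names, not code from either Pet Store:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;

// Hand-rolled page-fragment cache with a fixed time-to-live.
// ASP.NET expresses the same intent with a one-line page directive.
class FragmentCache {
    private static final class Entry {
        final String html;
        final long expiresAt;
        Entry(String html, long expiresAt) {
            this.html = html;
            this.expiresAt = expiresAt;
        }
    }

    private final Map<String, Entry> cache = new ConcurrentHashMap<>();
    private final long ttlMillis;

    FragmentCache(long ttlMillis) {
        this.ttlMillis = ttlMillis;
    }

    // Return the cached fragment if still fresh; otherwise re-render
    // via the supplier and cache the result.
    String get(String key, Supplier<String> render) {
        long now = System.currentTimeMillis();
        Entry entry = cache.get(key);
        if (entry != null && entry.expiresAt > now) return entry.html;
        String html = render.get();
        cache.put(key, new Entry(html, now + ttlMillis));
        return html;
    }
}
```

Every J2EE shop writing some variant of this by hand is exactly the standardization gap the paragraph above describes.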
The experimental flaws of the Microsoft tests render the performance comparisons unusable. All tests must be run on the same test bed, and a more suitable application must be chosen. J2EE and .NET are most appropriate for large-scale, high-availability applications, and the documented tests say little about how well these frameworks support such applications.
The .NET and Java Pet Stores support the same features, but they implement different "nonfunctional" requirements. The .NET version assumes a single hardware/OS/database combination and makes performance paramount. The Java version supports multiple hardware/OS/database combinations and ranks performance as less important. In fact, both frameworks can support either emphasis. Hence, comparing the two code bases is misleading.
For developers who are comfortable with limited choices, .NET is a well-designed framework with good tools. J2EE provides greater freedom, but the J2EE community can't ignore the need for tools that create powerful and efficient applications in a timely manner.
Dean Wampler is a Software Engineer with DRW Trading. He was formerly a Consultant, Trainer, and Mentor with Object Mentor, Inc.
Copyright © 2009 O'Reilly Media, Inc.