Friday, September 29, 2006

Open source commercialism or: how I learned to stress test and prove the app

For any prospective software you may use to run a company, you will need some assurance that it can handle what you throw at it. In the case of a bleeding-edge, open source project that shows a lot of potential, you really want to make that software work. Besides the benefits of open source, such as the code being downloadable, editable, and compilable, the software is free! Pair that with the fact that if it is a hot project, many eyes have been over the codebase, and many users have filed bug reports and feature requests on the issues they have hit. Open source projects can be very commercially viable: just look at Tomcat.

I am working with an ambitious piece of software called ServiceMix (Smix). If things go well, it could really help me and my company out. While we have been using it in our production system for over three months, I have not had time to really stretch it out, by which I mean testing more of the 'out of the box' functionality under stress. Smix is an ESB that implements the Java Business Integration (JBI) specification. This specification is much like Sun's servlet spec, JDBC spec, etc. Those Sun guys love their specs!

To really test out this software I have created some basic service assemblies (SAs). A service assembly, which is also part of the JBI spec, allows you to package together the functionality that comprises any number of processes. An example of an SA would be a consumer for a JMS queue, which passes all messages to an XSLT transformer, which in turn passes each message to an app that processes it and inserts the data into a database. This would all be packaged into one file for deployment. Those familiar with J2EE can think of it as a .ear file. If an SA is analogous to a .ear, then a service unit (SU) is analogous to a .war. In the previous example the SUs would be the JMS routing, the XSLT service, and the database service. Each would be packaged as its own service unit in Smix.
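Stripped of the JBI packaging, the middle of that example pipeline boils down to something like the following plain JMS-plus-JAXP sketch. To be clear, this is just an illustration: the queue name, stylesheet file, and broker URL are made up, and I'm assuming the ActiveMQ broker that ServiceMix ships with.

import java.io.File;
import java.io.StringReader;
import java.io.StringWriter;
import javax.jms.*;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

// Sketch of the SA's flow outside of JBI: take a message off a queue,
// run it through an XSLT stylesheet, and hand the result onward.
public class PipelineSketch {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory =
            new org.apache.activemq.ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection conn = factory.createConnection();
        conn.start();
        Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
        MessageConsumer consumer =
            session.createConsumer(session.createQueue("test.in")); // made-up queue

        Transformer transformer = TransformerFactory.newInstance()
            .newTransformer(new StreamSource(new File("transform.xsl"))); // made-up file

        TextMessage msg = (TextMessage) consumer.receive();
        StringWriter out = new StringWriter();
        transformer.transform(new StreamSource(new StringReader(msg.getText())),
            new StreamResult(out));
        // In the real SA the transformed message would go on to the app
        // that inserts the data into the database.
        System.out.println(out);
        conn.close();
    }
}

The point of the SA is that ServiceMix wires all of this up for you from the packaged service units; none of that plumbing code lives in your components.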

I also created a configurable JMS provider, which can be tasked with many different things, and a configurable JMS consumer, which can read from a queue. Last night I sent 75,000 small messages through an SA I created that does two separate XSLT transforms. The transformed results go onto two separate queues, and the contents of those queues are consumed by two of the consumers I created. Everything went off without a hitch! Very nice for my first real foray into stress testing Smix.
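The provider itself is nothing fancy. A simplified sketch of the idea is below; the real one is configurable, the queue name and broker URL are placeholders, and again I'm assuming the bundled ActiveMQ broker.

import java.text.SimpleDateFormat;
import java.util.Date;
import javax.jms.*;

// Simplified sketch of the provider: pump N small XML messages onto a
// queue, each stamped with a count and the date-time it was inserted.
public class ProviderSketch {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory =
            new org.apache.activemq.ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection conn = factory.createConnection();
        Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
        MessageProducer producer =
            session.createProducer(session.createQueue("test.in")); // made-up queue

        // This pattern matches the timestamp format in the results below.
        SimpleDateFormat fmt =
            new SimpleDateFormat("yyyy.MM.dd G 'at' HH:mm:ss:SSS z");
        for (int i = 1; i <= 75000; i++) {
            String payload = "<message><count>" + i + "</count><datetime>"
                + fmt.format(new Date()) + "</datetime></message>";
            producer.send(session.createTextMessage(payload));
        }
        conn.close();
    }
}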

Here are some approximate results:

Started message providing: ~9:40 pm

Message provider rate: ~6-7 messages per second
Message consumer rate: 7.8125 messages per second

Message consumer 1 finished: 2006.09.28 AD at 22:38:17:108 PDT
Message consumer 2 finished: 2006.09.28 AD at 22:38:17:108 PDT
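(For what it's worth, that consumer rate is exact arithmetic: 75,000 messages in 9,600 seconds, which is 2 hours 40 minutes, comes out to precisely 7.8125 messages per second.)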

The message payload:

<message>
  <count>(message #)</count>
  <datetime>(date-time inserted)</datetime>
</message>

An example of a transformed message:
<ecometry>2 ::: 2006.09.29 AD at 07:47:41:996 PDT</ecometry>
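For completeness, here is roughly what each consumer does: drain its queue, count, and report throughput once things go quiet. Same caveats as above; this is a simplified sketch with a made-up queue name, not the configurable consumer itself.

import javax.jms.*;

// Simplified sketch of a consumer: drain a queue, count the messages,
// and report the observed throughput once the queue goes quiet.
public class ConsumerSketch {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory =
            new org.apache.activemq.ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection conn = factory.createConnection();
        conn.start();
        Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
        MessageConsumer consumer =
            session.createConsumer(session.createQueue("test.out.1")); // made-up queue

        long start = System.currentTimeMillis();
        int count = 0;
        // Treat 30 seconds of silence as the end of the run.
        while (consumer.receive(30000) != null) {
            count++;
        }
        long elapsedMillis = System.currentTimeMillis() - start;
        System.out.println(count + " messages at "
            + (count * 1000.0 / elapsedMillis) + " messages per second");
        conn.close();
    }
}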

2 comments:

James McGovern said...

Would your employer be willing to do a case study on a ServiceMix implementation?

robottaway said...

I could ask. What types of metrics are you looking for? What types of SOA patterns? So far I've done very little: just feeding documents in through SOAP, transforming them with XSLT, and feeding them out.