Sunday, November 2, 2008

Lessons learned from the iPOJO testing process

Recently, I have been asked several times whether iPOJO is tied to the Felix runtime (i.e. works only on Felix). The answer is simple:

iPOJO relies only on the OSGi 4.1 specification, and so can work on any compliant implementation.

This post explains how iPOJO is currently tested, and shares some reflections on OSGi application testing.

Test, Test and Test...

So, why testing?
Just because I'm very clumsy, and I can't guarantee that a patch doesn't have side effects breaking a feature. So, my tests aim to check that the features still work and that I haven't broken everything... That's why I developed junit4osgi at the same time as iPOJO. Before this improvement, testing iPOJO took something like a working day! (Now, it's close to 3 minutes...)

The iPOJO testing process is executed weekly on three different OSGi implementations.
"Tested" means that the test suite runs successfully on each implementation. This is possible thanks to the junit4osgi framework, which is compatible with those implementations.
The test suite is also executed on different VMs such as Mika, JamVM and JRockit. These tests have two goals:
  1. Check that the iPOJO framework can be executed on different VMs.
  2. Check that the iPOJO manipulation process generates consistent bytecode.
The second point is important, as VMs are more or less tolerant of certain mistakes ☺. In particular, Sun VMs are very tolerant!

So, let's get back to the testing process. The iPOJO test suite checks most of iPOJO's features. The iPOJO trunk contains a lot of test cases executed with the junit4osgi framework. Executing the test suite is quite simple: launch the OSGi implementation, launch the junit4osgi framework, deploy the tests, and finally use a front end (command line, GUI) to run them.
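To give an idea of what such a test looks like, here is a minimal sketch in the junit4osgi style. It assumes the OSGiTestCase base class exposes the bundle context as a protected 'context' field; the MyService interface, its sayHello() method and the expected return value are purely illustrative:

    import org.apache.felix.ipojo.junit4osgi.OSGiTestCase;
    import org.osgi.framework.ServiceReference;

    public class MyServiceTest extends OSGiTestCase {

        /** Illustrative service interface, assumed to be provided by a bundle under test. */
        public interface MyService {
            String sayHello();
        }

        public void testMyServiceIsProvided() {
            // 'context' is the bundle context made available by the OSGiTestCase base class.
            ServiceReference ref = context.getServiceReference(MyService.class.getName());
            assertNotNull("MyService should be published in the service registry", ref);

            // Illustrative assertion on the (hypothetical) service behavior.
            MyService svc = (MyService) context.getService(ref);
            assertEquals("hello", svc.sayHello());

            context.ungetService(ref);
        }
    }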

Simple, isn’t it? However, during the development of the test suite, several issues appeared:
- Handling asynchronous interactions
- Integrating the tests into the build process

Issues when testing OSGi applications

Testing services and their behavior is really easy with junit4osgi. However, it becomes trickier when services perform asynchronous actions (i.e. actions executed in a different thread, and so not sequentially).

OSGi provides an execution platform where a lot of actions happen asynchronously. Testing this kind of interaction is difficult, and generally requires magic waiting times.

For example, imagine a test pushing a configuration to the Configuration Admin. The test creates a new configuration, pushes it to the Configuration Admin, and checks that the configuration is correctly applied. However, the Configuration Admin applies the configuration in another thread, so the test must wait before checking that the configuration was correctly applied. Waiting is acceptable, but for how long? Choosing a default time is rarely a good choice because it greatly depends on your configuration… Setting a long duration implies a long execution time…
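To make the problem concrete, here is a minimal sketch of such a naive test, written in the junit4osgi style used above. Only the standard Configuration Admin API is used; the PID and the properties are purely illustrative:

    import java.util.Dictionary;
    import java.util.Hashtable;

    import org.apache.felix.ipojo.junit4osgi.OSGiTestCase;
    import org.osgi.framework.ServiceReference;
    import org.osgi.service.cm.Configuration;
    import org.osgi.service.cm.ConfigurationAdmin;

    public class ConfigurationTest extends OSGiTestCase {

        public void testConfigurationIsApplied() throws Exception {
            // Get the Configuration Admin service (standard OSGi CM API).
            ServiceReference ref = context.getServiceReference(ConfigurationAdmin.class.getName());
            assertNotNull("Configuration Admin should be available", ref);
            ConfigurationAdmin admin = (ConfigurationAdmin) context.getService(ref);

            // Create or retrieve a configuration; the PID is purely illustrative.
            Configuration conf = admin.getConfiguration("my.managed.service.pid", null);
            Dictionary props = new Hashtable();
            props.put("message", "hello");
            conf.update(props); // the update is delivered asynchronously, in another thread

            // The "magic waiting time": too short and the test is flaky,
            // too long and the whole suite becomes slow.
            Thread.sleep(2000);

            // ...then check (for example through the reconfigured service)
            // that the new properties were actually taken into account.
        }
    }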

So, you could argue that the Configuration Admin provides mechanisms to notify you when configurations have been correctly processed. That's great, and the OSGi specification defines similar mechanisms for most asynchronous interactions. So it should be possible to wait for this notification (or give up after a timeout).
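For instance, one way to wait for that notification instead of sleeping is to register a ConfigurationListener (a standard Configuration Admin interface) and block on it with a timeout. This is only a sketch of the idea:

    import java.util.concurrent.CountDownLatch;
    import java.util.concurrent.TimeUnit;

    import org.osgi.service.cm.ConfigurationEvent;
    import org.osgi.service.cm.ConfigurationListener;

    /**
     * Waits until the Configuration Admin reports that a given PID was updated.
     * The instance must be registered as a ConfigurationListener service
     * before the configuration is pushed.
     */
    public class UpdateWaiter implements ConfigurationListener {

        private final String pid;
        private final CountDownLatch latch = new CountDownLatch(1);

        public UpdateWaiter(String pid) {
            this.pid = pid;
        }

        public void configurationEvent(ConfigurationEvent event) {
            if (event.getType() == ConfigurationEvent.CM_UPDATED
                    && pid.equals(event.getPid())) {
                latch.countDown();
            }
        }

        /** Returns true if the update notification arrived before the timeout. */
        public boolean await(long timeoutMillis) throws InterruptedException {
            return latch.await(timeoutMillis, TimeUnit.MILLISECONDS);
        }
    }

In the test, the waiter would be registered with context.registerService(ConfigurationListener.class.getName(), waiter, null) before calling update(), and the Thread.sleep() above would become a call to waiter.await(timeout) that fails the test on timeout.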


However, if the ManagedService or ManagedServiceFactory to update is not available, the notification is fired immediately, and it is not fired again once the service arrives (at least with the Felix Configuration Admin). So, in this case, you still have to wait for an arbitrary time ☹. I'm looking for a better solution, but right now it is still unclear to me how to handle these interactions cleanly.

The second issue with junit4osgi is very different. Executing tests is easy, but I'm very (very) lazy. So, integrating those tests into my build process would be great. An Ant task or a Maven plug-in could execute the tests automatically and react according to the results (if a failure or an error is detected, the build process fails). This automation can be implemented easily: it just requires starting an OSGi implementation, deploying the junit4osgi framework, starting the required bundles as well as the bundles containing the tests, and launching the test execution. Testing frameworks that do this exist, but they are generally usable only inside the build process and so lack flexibility: it must remain possible to execute the tests manually on non-standard configurations. I'm going to implement a simple Ant task doing this. I'll keep you posted…
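As a first idea, here is a very rough sketch of what such a task could do internally. It is only a sketch under several assumptions: it uses the Felix embedding constructor taking a configuration map and a list of activators (framework launch APIs differ between implementations and versions), the bundle locations are placeholders, and the actual invocation of the junit4osgi runner and the collection of its results are left out:

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    import org.apache.felix.framework.Felix;
    import org.osgi.framework.Bundle;
    import org.osgi.framework.BundleActivator;
    import org.osgi.framework.BundleContext;

    public class TestLauncher {

        /** Installs and starts the required bundles once the framework is up. */
        static class DeployActivator implements BundleActivator {
            public void start(BundleContext context) throws Exception {
                // Placeholder locations; a real Ant task would get them from the build.
                String[] bundles = {
                    "file:bundles/org.apache.felix.ipojo.jar",
                    "file:bundles/org.apache.felix.ipojo.junit4osgi.jar",
                    "file:bundles/tests.jar"
                };
                for (int i = 0; i < bundles.length; i++) {
                    Bundle b = context.installBundle(bundles[i]);
                    b.start();
                }
                // Here the task would look up the junit4osgi runner service,
                // launch the tests and record the results to decide whether
                // the build should fail.
            }

            public void stop(BundleContext context) {
            }
        }

        public static void main(String[] args) throws Exception {
            Map config = new HashMap();      // framework configuration properties would go here
            List activators = new ArrayList();
            activators.add(new DeployActivator());

            Felix felix = new Felix(config, activators);
            felix.start();
            // ...wait for the test execution to finish, then felix.stop();
        }
    }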

1 comment:

Redhuan D. Oon said...

Hi, I came across iPOJO by way of JUnit in Action, 2nd Edition, which talks about junit4osgi. I am building testing suites for our ERP project, which we can put up on a Jenkins server at http://berliserv.net:8080 (http://sf.net/projects/red1), and need to look into OSGi as the next quantum jump for our project.

I have found few other references on the web that give an idea of how best to test OSGi components, so I will be reading your blog posts intently to see if this works. I will be glad to submit write-ups on the experience.

By the way, I was in Berlin earlier this month.

Cheers!