Tuesday, December 9, 2008

Code coverage of OSGi applications

The question that has been bothering me since last night is about test quantity/quality: when can I say, « OK, I have enough tests »? For sure, this question remains open.

Freddy Mallet posted a comment on my last post about Sonar. Sonar (http://sonar.codehaus.org/) is an all-in-one code quality tracker: based on Maven (and a set of plug-ins), it collects data about your code, tests… and assembles reports on one web site with useful metrics (efficiency, usability…).

I tested it today and was really impressed. But my code coverage metric was desperately stuck at 0 ☹
After some web searches, I found the Cobertura Maven plug-in (http://mojo.codehaus.org/cobertura-maven-plugin/), which collects code coverage during test execution thanks to the Cobertura framework (http://cobertura.sourceforge.net/). However, as my systems and applications are tested in an OSGi container with junit4osgi, I investigated how to collect the code coverage of tests executed with junit4osgi.

So, first of all: it is possible, after a very small hack of junit4osgi, to manage the Cobertura packages.
Collecting the code coverage is a three-step process:

A] Creating a bundle with instrumented classes
First you need to select ONE project under test. You will collect metrics on this project; if you want several projects, you need to execute the process several times (once per project).
Once selected, edit the POM file to ignore the ‘net.sourceforge.cobertura.*’ packages, such as in:


org.osgi.framework;version=1.3,
org.osgi.service.cm,
org.osgi.service.log,
!net.sourceforge.cobertura.*

Then, execute the following command to create the instrumented bundle:
mvn clean install cobertura:instrument package install
This command creates the bundle with the Cobertura-instrumented code and installs it in the local Maven repository. Moreover, it creates a file containing code metadata in the target/cobertura folder (cobertura.ser).

B] Test execution
Once the bundle is created, you need to execute the tests. The code coverage is computed during this execution. To launch the execution, go into your integration-test project and execute the following command:
mvn clean integration-test
-Dnet.sourceforge.cobertura.datafile
=/path/to/your/tested/project/target/cobertura/cobertura.ser
The ‘net.sourceforge.cobertura.datafile’ property specifies the location of the cobertura.ser file (created during the Cobertura code instrumentation).

Tests are executed normally… But when Maven stops, two messages appear:
Cobertura: Loaded information on 81 classes.
Cobertura: Saved information on 81 classes.
This means that the metrics were correctly collected.

C] Generation of the report and analysis of the code coverage
Once the metrics are collected, go back to your tested project. To generate the report, just launch the following command:
mvn cobertura:cobertura
The report is generated in the target/site/cobertura/ folder. Here is an example of a generated report. You can also check your code coverage with:
mvn cobertura:check
This doesn’t create the report but just gives an idea of the code coverage. You can use the Cobertura plug-in configuration to adapt the checking or the report to your requirements.

That’s it.

Saturday, December 6, 2008

Reducing the pain of a release process

Who has never cracked during a release process? Despite several tools easing the process, it’s generally a nightmare.

Why? Because of the number of tasks to execute during the process: the product has to be deeply tested before the release; license/notice files have to be checked, as well as license headers; the release candidate artifacts have to be created, deployed and signed; and don’t forget the release notes and updating the documentation…

So, I recently discovered two Maven plug-ins reducing (a little) the pain. They automate two checks: license headers and code style consistency.

First, the Checkstyle plug-in (http://maven.apache.org/plugins/maven-checkstyle-plugin/) checks that your code is consistently formatted. It is, in my opinion, a critical aspect: diving into unknown code that is consistently formatted is much easier. Using the plug-in is quite simple. Add the following plug-in configuration in the build section of your POM file:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-checkstyle-plugin</artifactId>
  <configuration>
    <enableRulesSummary>false</enableRulesSummary>
    <violationSeverity>warning</violationSeverity>
    <configLocation>http://felix.apache.org/ipojo/dev/checkstyle_ipojo.xml</configLocation>
  </configuration>
</plugin>

The ‘violationSeverity’ attribute sets the severity level from which the build fails: if the code format breaks a rule with a level greater than or equal to the set value, the build process fails. (Check levels can be set in the Checkstyle file.) The ‘configLocation’ attribute specifies the Checkstyle file to use (containing the format rules).

So, once your project has the correct configuration, just launch the following command:
mvn checkstyle:check

to check that your project doesn’t break your format rules. If executed on a multi-module project, each module is checked (each module needs to be configured).

So, thanks to this plug-in, checking code format consistency is quite simple and can be automated during your project build process.

The second plug-in is the RAT plug-in (http://mojo.codehaus.org/rat-maven-plugin/). RAT (http://incubator.apache.org/rat/) is a tool to improve accuracy and efficiency when checking releases. It is heuristic in nature, making guesses about possible problems: it will produce false positives and cannot find every possible issue with a release. Its reports require interpretation, but it can also be automated…

In fact, RAT checks license headers and tracks missing licenses. RAT provides a Maven plug-in which can automate the discovery of missing licenses during your project build. To use this plug-in, just add the following plug-in configuration in the build section of your POM file.

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>rat-maven-plugin</artifactId>
  <configuration>
    <excludeSubProjects>false</excludeSubProjects>
    <useEclipseDefaultExcludes>true</useEclipseDefaultExcludes>
    <useMavenDefaultExcludes>true</useMavenDefaultExcludes>
    <excludes>
      <param>doc/**/*</param>
      <param>maven-eclipse.xml</param>
      <param>.checkstyle</param>
      <param>.externalToolBuilders/*</param>
      <param>LICENSE.*</param>
      <param>.fbprefs</param>
    </excludes>
  </configuration>
</plugin>

This configuration checks that non-excluded files have a correct license header. It excludes Maven files (the target folder) and Eclipse files (.project, .classpath); those files don’t require a license, as they should not be in the source code repository. Then, other files can be excluded, such as the doc folder (containing HTML files and release notes), the used license files (LICENSE.*)… Extend the list (or modify it) according to your project.

To check that you have no file without a license header, just execute the following command:
mvn rat:check

The command fails if a non-compliant file is detected. Executed on a multi-module project (each module needs to be configured), it checks every module and fails as soon as a module contains an invalid file.

So, by using these plug-ins, you can continuously check the consistency of your code format and of your license headers. This is not wonderful, but at least they do their share of the work.

Monday, November 24, 2008

22.7 MB!

Today, everything began perfectly.
I moved the junit4osgi plug-in to the iPOJO trunk and decided with junit4osgi users to move the junit4osgi projects to an ‘iPOJO top level’ project. And, icing on the cake, I finished my PhD defense slides.

Moreover, I was pretty proud of my new script checking the current iPOJO trunk:

mkdir tmp
cd tmp
svn co https://svn.apache.org/repos/asf/felix/trunk/ipojo
cd ipojo
mvn clean install -Pexamples,tests
mvn rat:check
mvn checkstyle:check
cd tests
cd integration-tests
mvn clean integration-test
mvn org.apache.maven.plugins:maven-surefire-report-plugin:report
/usr/bin/osascript -e 'tell application "Safari" to activate open location "file:/Users/clement/tmp/ipojo/tests/integration-tests/target/site/surefire-report.html"'

However, I executed it from scratch (with an empty Maven repository), and here comes the issue…
22.7 MB is the size of my Maven repository after the execution of the script! It just downloads and compiles the iPOJO trunk (not the whole Felix project) and executes the tests.

After a quick analysis, it appears that some artifacts were downloaded in multiple versions. So, it’s time for me to track some of these artifacts and try to reduce this size. I’m pretty sure that I can’t drastically reduce this mess; anyway, I’ll try.

Here are the duplicated artifacts:

commons-collections-2.1.jar
commons-collections-3.2.jar
commons-logging-1.0.3.jar
commons-logging-1.0.4.jar
commons-validator-1.1.4.jar
commons-validator-1.2.0.jar
org.osgi.compendium-1.0.0.jar
org.osgi.compendium-1.2.0.jar
org.osgi.core-1.0.0.jar
org.osgi.core-1.0.1.jar
org.osgi.core-1.2.0.jar
org.osgi.foundation-1.0.0.jar
org.osgi.foundation-1.2.0.jar
doxia-core-1.0-alpha-10.jar
doxia-core-1.0-alpha-8.jar
doxia-decoration-model-1.0-alpha-10.jar
doxia-decoration-model-1.0-alpha-11.jar
doxia-decoration-model-1.0-alpha-8.jar
doxia-site-renderer-1.0-alpha-10.jar
doxia-site-renderer-1.0-alpha-8.jar
maven-archiver-2.0.jar
maven-archiver-2.2.jar
maven-archiver-2.3.jar
maven-artifact-2.0.jar
maven-artifact-2.0.7.jar
maven-artifact-manager-2.0.jar
maven-artifact-manager-2.0.7.jar
maven-model-2.0.jar
maven-model-2.0.7.jar
maven-plugin-api-2.0.jar
maven-plugin-api-2.0.7.jar
maven-profile-2.0.jar
maven-profile-2.0.7.jar
maven-project-2.0.jar
maven-project-2.0.7.jar
maven-repository-metadata-2.0.jar
maven-repository-metadata-2.0.7.jar
maven-jar-plugin-2.1.jar
maven-jar-plugin-2.2.jar
maven-plugin-plugin-2.3.jar
maven-plugin-plugin-2.4.1.jar
maven-surefire-plugin-2.3.jar
maven-surefire-plugin-2.4.2.jar
maven-reporting-impl-2.0.jar
maven-reporting-impl-2.0.4.jar
maven-reporting-impl-2.0.4.1.jar
surefire-api-2.3.jar
surefire-api-2.4.2.jar
surefire-booter-2.3.jar
surefire-booter-2.4.2.jar
wagon-provider-api-1.0-alpha-5.jar
wagon-provider-api-1.0-beta-2.jar
plexus-archiver-1.0-alpha-3.jar
plexus-archiver-1.0-alpha-7.jar
plexus-archiver-1.0-alpha-9.jar
plexus-container-default-1.0-alpha-8.jar
plexus-container-default-1.0-alpha-9-stable-1.jar
plexus-i18n-1.0-beta-6.jar
plexus-i18n-1.0-beta-7.jar
plexus-utils-1.0.4.jar
plexus-utils-1.1.jar
plexus-utils-1.4.jar
plexus-utils-1.4.1.jar
plexus-utils-1.4.4.jar
plexus-utils-1.4.5.jar
plexus-utils-1.4.7.jar
plexus-utils-1.4.9.jar
plexus-utils-1.5.1.jar
plexus-velocity-1.1.2.jar
plexus-velocity-1.1.3.jar
plexus-velocity-1.1.7.jar
oro-2.0.7.jar
oro-2.0.8.jar
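Spotting these duplicates by hand is tedious. A small, hypothetical helper (not part of any project mentioned here; names are invented for illustration) can list artifacts present in several versions by grouping jar file names on the part that precedes the version:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical duplicate-version finder: groups jar names by artifact id
// (everything before the first "-<digit...>" token) and keeps only the
// artifacts that appear in more than one version.
public class DuplicateFinder {

    private static final Pattern VERSION = Pattern.compile("-(\\d.*)\\.jar$");

    /** Maps artifact id to the list of versions found, duplicates only. */
    public static Map<String, List<String>> duplicates(List<String> jarNames) {
        Map<String, List<String>> byArtifact = new TreeMap<>();
        for (String name : jarNames) {
            Matcher m = VERSION.matcher(name);
            if (m.find()) {
                String artifact = name.substring(0, m.start());
                byArtifact.computeIfAbsent(artifact, k -> new ArrayList<>())
                          .add(m.group(1));
            }
        }
        byArtifact.values().removeIf(versions -> versions.size() < 2);
        return byArtifact;
    }

    public static void main(String[] args) {
        List<String> jars = Arrays.asList(
            "plexus-utils-1.1.jar", "plexus-utils-1.4.jar", "oro-2.0.8.jar");
        System.out.println(duplicates(jars));
        // {plexus-utils=[1.1, 1.4]}
    }
}
```

Pointing such a tool at the local repository folder (and feeding it the jar file names) gives the duplicated-artifact list above in seconds.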

Tuesday, November 11, 2008

maven-junit4osgi-plugin

Update: the junit4osgi artifact has recently changed. The groupId became org.apache.felix. Moreover, the plug-in is now in the iPOJO trunk.

In my last post, I explained the advantages of integrating integration tests in the build process. However, such a tool was nonexistent until… now ☺

As I explained, the junit4osgi framework allows testing OSGi applications with a "junit++" framework (i.e. an adapted JUnit distribution providing useful methods to easily handle OSGi-specific features such as services).

But the junit4osgi framework is a “standalone” framework, not integrated in a build tool such as Ant or Maven. This post describes a Maven plug-in, maven-junit4osgi-plugin, executing tests in a Maven-based build process.

Constraints of such a tool
Providing such a front end is useful, but it should meet some requirements:
  • Tests are executed “in container”: tested bundles are really deployed on an OSGi framework (I chose Apache Felix), so tests run in an execution environment close to the final (production) one. The goal of integration tests is to test the application in a production-like environment.
  • The project under construction is not a functional bundle; it can either be empty or contain integration tests. Integration tests are not placed in the same project as the project under test: they are packaged inside other bundles deployed on the same framework as the application under test.
  • To recreate the “final” execution environment, several bundles can be required (technical services, the application under test…). So the plug-in must support the deployment of required bundles.
  • Test results must be reported to the user. Maven provides an infrastructure to create web sites. Moreover, Surefire (the “regular” Maven test plug-in) provides a plug-in generating a web page with test results. The provided plug-in should provide the same features and reuse the same format as Surefire.
The good news is that the provided plug-in, maven-junit4osgi-plugin, matches these requirements!

What does the maven-junit4osgi-plugin provide?
  • It allows testing OSGi applications
  • It is integrated in a Maven-based build process
  • It provides the same output as Surefire
  • It supports Maven site generation
Using the plug-in

Downloading and building the plug-in
The plug-in sources are available in the iPOJO trunk.
However, the junit4osgi and iPOJO runtimes are also required. So, download the sources of iPOJO:
svn co http://svn.apache.org/repos/asf/felix/trunk/ipojo/
To compile it, run the following commands:
cd ipojo
mvn clean install -Pexamples
(The -Pexamples profile compiles the iPOJO examples, and so the junit4osgi framework and the plug-in.) Now you can use the plug-in in your project.

Simple configuration
So, first, the project using the plug-in is not the project under test. It is another project that either contains only integration tests packaged in a bundle, or is empty (and depends on other bundles containing integration tests).
Tests contained in the project are developed with junit4osgi and packaged in a bundle with the maven-bundle-plugin.
In the POM file, add the following plug-in configuration to use the maven-junit4osgi-plugin:

<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-junit4osgi-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <goal>test</goal>
      </goals>
      <configuration>
        <deployProjectArtifact>true</deployProjectArtifact>
      </configuration>
    </execution>
  </executions>
</plugin>

Plug-in parameter
The plug-in has only one parameter. The 'deployProjectArtifact' parameter enables or disables the deployment of the current artifact. If the current project contains tests, the plug-in can deploy the built artifact (as illustrated in this pom). Otherwise, the current project artifact is not deployed; this can be useful if the project just depends on other test bundles and only sets the test configuration (as in this pom).

Configuring the set of bundles to deploy

There are two different ways to configure the plug-in to deploy other bundles. If the bundle to deploy is a Maven artifact, just add it as a Maven project dependency with the dependency scope set to ‘test’. Here is an example:

<dependency>
  <artifactId>tests.manipulation.metadata</artifactId>
  <groupId>ipojo.tests</groupId>
  <version>1.1.0-SNAPSHOT</version>
  <scope>test</scope>
</dependency>

If your bundle is not a Maven artifact, you can configure the plug-in with the bundle URL (from which the bundle will be deployed):

<configuration>
  <deployProjectArtifact>true</deployProjectArtifact>
  <bundles>
    <param>file:/Users/clement/bundles/test-metadata.jar</param>
  </bundles>
</configuration>


The specified bundles are installed and started. You can depend on bundles that do not contain tests as well as on bundles containing tests.

Executing the plug-in

To execute the tests, just launch the ‘mvn clean integration-test’ command.

[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Building iPOJO Primitive Manipulation Test Suite
[INFO] task-segment: [integration-test]
[INFO] ------------------------------------------------------------------------
[INFO] [resources:resources]
[INFO] Using default encoding to copy filtered resources.
[INFO] [compiler:compile]
[INFO] Nothing to compile - all classes are up to date
[INFO] [resources:testResources]
[INFO] Using default encoding to copy filtered resources.
[INFO] [compiler:testCompile]
[INFO] No sources to compile
[INFO] [surefire:test]
[INFO] No tests to run.
[INFO] [bundle:bundle]
[INFO] [ipojo:ipojo-bundle {execution: default}]
[INFO] Start bundle manipulation
[INFO] Metadata file : /Users/clement/Documents/workspaces/felix-trunk/ipojo/tests/manipulator/primitives/target/classes/metadata.xml
[INFO] Input Bundle File : /Users/clement/Documents/workspaces/felix-trunk/ipojo/tests/manipulator/primitives/target/tests.manipulation.primitives-1.1.0-SNAPSHOT.jar
[INFO] Bundle manipulation - SUCCESS
[INFO] [junit4osgi:test {execution: default}]
Analyzing org.apache.felix.ipojo - compile
Analyzing org.apache.felix.ipojo.metadata - compile
Analyzing org.osgi.core - compile
Analyzing junit - compile
Analyzing org.apache.felix.ipojo.junit4osgi - compile
Analyzing tests.manipulation.metadata - test

-------------------------------------------------------
T E S T S
-------------------------------------------------------
Deploy : /Users/clement/Documents/workspaces/felix-trunk/ipojo/tests/manipulator/primitives/target/tests.manipulation.primitives-1.1.0-SNAPSHOT.jar
Loading org.apache.felix.ipojo.test.scenarios.manipulation.ManipulationTestSuite
Loading org.apache.felix.ipojo.test.scenarios.manipulation.ManipulationTestSuite
Junit Extender starting ...
Running Manipulation Metadata Test Suite
Tests run: 16, Failures: 0, Errors: 0, Time elapsed: 0 sec
Running Primitive Manipulation Test Suite
Tests run: 17, Failures: 0, Errors: 0, Time elapsed: 0 sec

Results :

Tests run: 33, Failures: 0, Errors:0

Unload test suites [class org.apache.felix.ipojo.test.scenarios.manipulation.ManipulationTestSuite]
Unload test suites [class org.apache.felix.ipojo.test.scenarios.manipulation.ManipulationTestSuite]
Cleaning test suites ...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESSFUL
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 6 seconds
[INFO] Finished at: Mon Nov 10 21:30:21 CET 2008
[INFO] Final Memory: 9M/18M
[INFO] ------------------------------------------------------------------------

Failures and errors are reported in the plugin output.

Generating the report web page
When tests are executed, the plug-in generates XML reports (in the target/junit4osgi-reports directory) using the same convention as Surefire. So, it is possible to configure Surefire to generate the web page with the test results.
To do this, add the following report configuration to the project executing the tests:

<reporting>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-report-plugin</artifactId>
      <version>2.4.3</version>
      <configuration>
        <showSuccess>true</showSuccess>
        <reportsDirectories>
          <param>target/junit4osgi-reports</param>
        </reportsDirectories>
      </configuration>
    </plugin>
  </plugins>
</reporting>
This snippet configures the maven-surefire-report-plugin to collect results from the ‘target/junit4osgi-reports’ directory.
Then execute the plugin with the following command:
mvn org.apache.maven.plugins:maven-surefire-report-plugin:2.4.3:report

This command generates the web page with the test results in ‘target/site’. This page shows an example of a page generated with this command.

Plug-in design
The plug-in is quite simple: it just starts an embedded Felix with a special activator installing and starting the junit4osgi framework and the specified bundles.

Then, before executing tests, the plug-in waits for “stability”. Indeed, as bundle activation can be asynchronous, the plug-in needs to wait until the configuration is stable. Stability is reached when all bundles are activated and no services appear or disappear over a 500 ms period. If stability cannot be reached after several seconds, the plug-in stops.
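The stability heuristic can be sketched in plain Java. This is a simplified, hypothetical version that only watches a service count supplied by the caller (the real plug-in also checks bundle states):

```java
import java.util.function.IntSupplier;

// Hypothetical sketch of the stability heuristic: the configuration is
// considered stable once the observed service count stays unchanged for a
// whole observation window (500 ms in the plug-in described above).
public class StabilityWaiter {

    /**
     * Polls serviceCount every periodMs milliseconds and returns true once
     * the count has not changed during a full period, or false after
     * maxAttempts periods (stability could not be reached).
     */
    public static boolean waitForStability(IntSupplier serviceCount,
                                           long periodMs, int maxAttempts)
            throws InterruptedException {
        int previous = serviceCount.getAsInt();
        for (int attempt = 0; attempt < maxAttempts; attempt++) {
            Thread.sleep(periodMs);
            int current = serviceCount.getAsInt();
            if (current == previous) {
                return true;  // no service arrival/departure during the window
            }
            previous = current;
        }
        return false;         // still fluctuating: give up, as the plug-in does
    }

    public static void main(String[] args) throws InterruptedException {
        // A constant service count is immediately considered stable.
        System.out.println(waitForStability(() -> 3, 50, 10));
    }
}
```

Bounding the number of attempts is what makes the plug-in stop instead of hanging forever when the framework never settles.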

Once stability is reached, the junit4osgi runner service is used to execute the tests. Then results are collected and reports are generated.

Conclusion
This post has presented a front end automating the execution of junit4osgi tests. Now it is possible to integrate OSGi application tests in a build process. The presented Maven plug-in provides the following features:
  • Easy integration in a Maven-based build process
  • Enough flexibility to reproduce production execution environments in which to test the application
  • The same test result output as Surefire
  • The ability to generate Surefire-like report pages

Sunday, November 2, 2008

Lessons learned from iPOJO testing process

Recently, I was asked several times whether iPOJO is tied to the Felix runtime (i.e. works only on Felix). The answer is simple:

iPOJO relies only on the OSGi 4.1 specification, and so can work on any compliant implementation.

This post explains how iPOJO is currently tested, and shares some reflections on OSGi application testing.

Test, Test and Test...

So, why test?
Just because I'm very clumsy, and I can't guarantee that a patch doesn't have side effects breaking a feature. So, my tests aim to check that features still work and that I didn't break everything... That's why I developed junit4osgi at the same time as iPOJO. Before this improvement, testing iPOJO took something like a working day! (Now, it's close to 3 minutes...)

The iPOJO testing process is executed weekly on three different OSGi implementations.
"Tested" means that the test suite is executed successfully on each implementation. This is possible thanks to the junit4osgi framework being compatible with those implementations.
The test suite is also executed on different VMs such as Mika, JamVM and JRockit. Those tests are twofold:
  1. Check that the iPOJO framework can be executed on different VMs.
  2. Check that the iPOJO manipulation process generates consistent bytecode.
The second point is important, as VMs are more or less tolerant of some mistakes ☺. Sun's VMs, especially, are very tolerant!

So, let’s get back to the testing process. The iPOJO test suite covers most of iPOJO’s features. The iPOJO trunk contains a lot of test cases executed with the junit4osgi framework. Executing the test suite is quite simple: launch the OSGi implementation, launch the junit4osgi framework, deploy the tests and finally use a front end (command line, GUI) to run them.

Simple, isn’t it? However, during the development of the test suite, several issues appeared:
- Handling asynchronous interactions
- Integrating test in the building process

Issues to test OSGi Applications

Testing services and their behavior is really easy with junit4osgi. However, it becomes trickier when services perform asynchronous actions (i.e. actions executed in a different thread, and so not sequentially).

OSGi proposes an execution platform where a lot of actions are performed asynchronously. Testing this kind of interaction is difficult, and generally requires magic waiting times.

For example, imagine a test pushing a configuration to the Configuration Admin. The test creates a new configuration, pushes it to the Configuration Admin, and checks that the configuration is correctly applied. However, the Configuration Admin applies the configuration in another thread, so the test must wait before checking that the configuration was correctly applied. Waiting is acceptable, but for how long? Choosing a default time is rarely a good choice because it greatly depends on your configuration… and setting a long duration implies a long execution time…

You could argue that the Configuration Admin provides mechanisms to notify when configurations are correctly processed. That’s great, and the OSGi specification defines similar mechanisms for most asynchronous interactions. So it should be possible to wait for this notification (or eventually throw a timeout).
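Concretely, instead of a magic sleep, the test can block on a latch that the notification releases, with a timeout as a safety net. The sketch below is illustrative only: it does not use the real Configuration Admin API, and the executor merely stands in for the thread that applies the configuration:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executor;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Illustrative only (NOT the Configuration Admin API): replace an
// arbitrary sleep with notification + timeout.
public class AsyncConfigTest {

    public static boolean pushAndWait(Executor configAdminThread, long timeoutMs)
            throws InterruptedException {
        CountDownLatch applied = new CountDownLatch(1);
        // The asynchronous work signals the latch once the configuration
        // has been "applied" (in a real test, a listener would do this).
        configAdminThread.execute(applied::countDown);
        // Block until notified, or fail fast when the timeout expires.
        return applied.await(timeoutMs, TimeUnit.MILLISECONDS);
    }

    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        System.out.println(pushAndWait(pool, 1000));
        pool.shutdown();
    }
}
```

The test then takes only as long as the asynchronous action itself, while still failing deterministically when the notification never comes.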


However, if the ManagedService or ManagedServiceFactory to update is not available, the notification is fired immediately, and is not fired again once the service arrives (at least with the Felix Configuration Admin). So, in this case you have to wait for an arbitrary time ☹. I’m looking for a better solution, but it is still unclear to me how to handle those interactions cleanly.

The second issue with junit4osgi is very different. Executing tests is easy, but I’m very (very) lazy, so integrating those tests into my build process would be great. An Ant task or a Maven plug-in could execute those tests automatically and react according to the results (if a failure or an error is detected, the build process fails). This automation can be easily implemented: it just requires starting an OSGi implementation, deploying the junit4osgi framework, starting the required bundles as well as the bundles containing tests, and launching the test execution. Testing frameworks doing this exist, but they are generally usable only inside the build process and so lack flexibility: it must be possible to execute tests manually on non-standard configurations. I’m going to implement a simple Ant task doing this. I’ll keep you posted…

Saturday, October 25, 2008

iPOJO 1.0.0 and the future

I recently released the version 1.0.0 of the iPOJO framework. This release is a major milestone for iPOJO.

iPOJO was created two years ago as a continuation of the Service Binder effort. The initial goal was to provide an easy way to create dynamic service-based applications on top of OSGi without losing the OSGi philosophy (small, universal middleware).
At the same time, I was involved in several projects where I realized that iPOJO had to focus on different goals:
  • Must be simple… (Avoids redundancies, provides annotations)
  • Must be extensible to tackle specific requirements
  • Must provide a way to design applications
The first requirement comes from the OSGi development model. OSGi is very powerful, but the learning curve is long and steep. A lot of mechanisms (mostly about class loading and threads) must be understood. Hiding or simplifying those mechanisms was a stringent requirement.

The first project using iPOJO was the design and implementation of a residential gateway. However, we quickly understood that developing this kind of application requires specific technical services, such as a MOM allowing event-based interactions, a scheduler (i.e. cron) to automate periodic task triggering, a way to administrate applications remotely… Providing such technical services is not too difficult; automating the interaction with those technical services was more challenging. From this observation, we decided to provide an extension mechanism allowing such features to be added without modifying the core of the project. So, iPOJO can stay small while providing a lot of functionality.

Finally, we realized that we needed a way to design applications. Traditional ways to design applications are somewhat limited: generally they only address static applications (with no possible evolution at runtime), and the architecture description is lost just after the application is deployed. After a long collaboration with Richard S. Hall, we proposed and implemented a new way to design applications on top of OSGi. iPOJO composites allow isolating services, support dynamism… This is the most innovative part of the iPOJO project.
The iPOJO 1.0.0 release is the result of these two years of development, research, interrogation and experimentation. The iPOJO framework is composed of:
  • A core system providing the basic functionality (requiring/providing services, lifecycle, instance introspection…)
  • A full integration with the Configuration Admin
  • A way to design and execute dynamic applications
  • Several external handlers extending the core capabilities with JMX administration, Event Admin interactions, temporal service dependencies, whiteboard and extender patterns…
The future of iPOJO is also exciting. With the support of my ‘future former’ group, extensions such as distribution and deployment are under development. The distribution framework allows iPOJO instances to interact with remote services (using any protocol) and to be exposed remotely. The deployment support computes OBR descriptions from metadata, providing features that ease deployment. Improvements such as a control and creation API are also being investigated. Future versions will see the arrival of new ‘handlers’ handling persistence and scheduling, as well as a better integration with the junit4osgi framework.

All this stuff is really promising… isn’t it?

Tuesday, October 21, 2008

iPOJO on Android

Android is the Google OS for mobile phones (and the future GPhone). Android provides a Java-like virtual machine: Dalvik.

So, why not try to execute an iPOJO-based application on top of Android?
The idea was to embed an OSGi/iPOJO framework on Android and then to deploy an application on top of this framework.

You can download the application in two parts:
the Android application embedding Felix/iPOJO : here
the SpellChecker application : here

The application is designed as follows:
  • The Android application creates and starts an OSGi/iPOJO framework (Felix + iPOJO + FileInstall)
  • The Android application tracks a ViewFactory service. This service is used to display the application GUI (the service allows creating the main GUI component).
  • The SpellChecker application (from the 10 minutes tutorial) is deployed on OSGi and uses iPOJO. The only difference from the tutorial version is the GUI, which uses Android GUI components (rather than Swing)
SpellChecker application bundles are deployed thanks to FileInstall: a specific folder is monitored in order to install/update/uninstall the jar files it contains.


Embedding Felix and iPOJO inside an Android Application


First, we need to create an Android application. This application creates a new Felix instance when it is created:

public synchronized void onCreate(Bundle icicle) {
    super.onCreate(icicle);
    setContentView(R.layout.main);

    PrintStream out = new PrintStream(new OutputStream() {
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        @Override
        public void write(int oneByte) throws IOException {
            output.write(oneByte);
            if (oneByte == '\n') {
                Log.v("out", new String(output.toByteArray()));
                output = new ByteArrayOutputStream();
            }
        }
    });
    System.setErr(out);
    System.setOut(out);

    m_configMap = new StringMap(false);
    m_configMap.put(FelixConstants.LOG_LEVEL_PROP, String.valueOf(Logger.LOG_DEBUG));
    m_configMap.put(DirectoryWatcher.DEBUG, "1");
    // Configure the Felix instance to be embedded.
    m_configMap.put(FelixConstants.EMBEDDED_EXECUTION_PROP, "true");

    File bundles = new File(FELIX_BUNDLES_DIR);
    if (!bundles.exists()) {
        if (!bundles.mkdirs()) {
            throw new IllegalStateException("Unable to create bundles dir");
        }
    }
    m_configMap.put(DirectoryWatcher.DIR, bundles.getAbsolutePath());

    // Add core OSGi packages to be exported from the class path
    // via the system bundle.
    m_configMap.put(Constants.FRAMEWORK_SYSTEMPACKAGES, ANDROID_FRAMEWORK_PACKAGES);

    // Explicitly specify the directory to use for caching bundles.
    try {
        m_cache = File.createTempFile("felix-cache", null);
    } catch (IOException ex) {
        throw new IllegalStateException(ex);
    }
    m_cache.delete();
    m_cache.mkdirs();
    m_configMap.put(BundleCache.CACHE_PROFILE_DIR_PROP, m_cache.getAbsolutePath());
}

When the application starts, the Felix framework is started with two activators:
  • one installing bundles from the application resources
  • one installing bundles from a folder (FileInstall)
Then a service tracker is used to track arrivals and departures of the ViewFactory service (provided by the SpellChecker GUI).

public synchronized void onStart() {
    super.onStart();
    setContentView(new View(this));
    Resources res = getResources();
    try {
        List<BundleActivator> activators = new ArrayList<BundleActivator>();

        // Plugs the bundle installer (install bundles from application resources)
        activators.add(new Installer(res));
        // Plugs the FileInstall activator
        activators.add(new FileInstall());

        m_felix = new Felix(m_configMap, activators);
        m_felix.start();
    } catch (BundleException ex) {
        throw new IllegalStateException(ex);
    }
    try {
        m_tracker = new ServiceTracker(m_felix.getBundleContext(),
            m_felix.getBundleContext().createFilter("(" + Constants.OBJECTCLASS
                + "=" + ViewFactory.class.getName() + ")"),
            new ServiceTrackerCustomizer() {

                @Override
                public Object addingService(ServiceReference ref) {
                    System.out.println("======= Service found !");
                    final ViewFactory fac =
                        (ViewFactory) m_felix.getBundleContext().getService(ref);
                    if (fac != null) {
                        runOnUiThread(new Runnable() {
                            public void run() {
                                setContentView(fac.create(ApacheFelix.this));
                            }
                        });
                    }
                    return fac;
                }

                @Override
                public void modifiedService(ServiceReference ref, Object service) {
                    removedService(ref, service);
                    addingService(ref);
                }

                @Override
                public void removedService(ServiceReference ref, Object service) {
                    m_felix.getBundleContext().ungetService(ref);
                    runOnUiThread(new Runnable() {
                        public void run() {
                            setContentView(new View(ApacheFelix.this));
                        }
                    });
                }
            });
        m_tracker.open();
    } catch (InvalidSyntaxException e) {
        e.printStackTrace();
    }
}
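FileInstall, the second activator, works by polling a directory and installing or uninstalling bundles as jar files come and go. Stripped of the actual OSGi install/uninstall calls, the core of one polling pass can be sketched like this (the class and method names are mine, not FileInstall's):

```java
import java.io.File;
import java.util.HashSet;
import java.util.Set;

// Minimal sketch of the polling idea behind FileInstall: diff a directory's
// jar files between two scans. The real FileInstall also installs, updates,
// and uninstalls the corresponding bundles; here we only compute the diff.
public class BundleDirScanner {
    private Set<String> known = new HashSet<String>();

    /** Returns the jar names that appeared since the previous scan. */
    public Set<String> scanForNewJars(File dir) {
        Set<String> current = new HashSet<String>();
        File[] files = dir.listFiles();
        if (files != null) {
            for (File f : files) {
                if (f.getName().endsWith(".jar")) {
                    current.add(f.getName());
                }
            }
        }
        Set<String> added = new HashSet<String>(current);
        added.removeAll(known); // keep only jars not seen last time
        known = current;
        return added;
    }
}
```

A watcher thread would call `scanForNewJars` periodically and install each returned jar as a bundle.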

Note that the iPOJO bundle is deployed from the application resources. The bundle was “dexed” and placed in the res/raw folder of the application project. Then, you can load the bundle with:

public void start(BundleContext arg0) throws Exception {
    InputStream is = res.openRawResource(R.raw.ipojo);
    Bundle bundle = arg0.installBundle(IPOJO_HTTP_PATH, is);
    bundle.start();
}
Before it can be installed, the application must be exported as an .apk file.
SpellChecker application
The application is basically the same as the one from the 10-minute tutorial. There are three differences:
  • Bundles are “dexed”
  • The GUI bundle is implemented with Android GUI components
  • The dependency on the SpellChecker service is now optional, and uses two bind/unbind methods:
public synchronized void bindSpellChecker(SpellChecker sp) {
    checker = sp;
    if (activity != null) {
        activity.runOnUiThread(new Runnable() {
            public void run() {
                // Enable the button
                m_button.setEnabled(true);
                m_result.setText("Enter words, and click on 'check'");
                m_main.invalidate();
                System.out.println("==> Spell checker GUI receives a new spell checker ...");
            }
        });
    }
}

public synchronized void unbindSpellChecker(SpellChecker sp) {
    checker = null;
    if (activity != null) {
        activity.runOnUiThread(new Runnable() {
            public void run() {
                // Disable the button
                m_button.setEnabled(false);
                m_result.setText("No Spellchecker available");
                m_main.invalidate();
                System.out.println("=====> Spell checker GUI was unbound from the spell checker");
            }
        });
    }
}
Note that GUI modifications MUST be done on the UI thread.
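Outside the Android specifics, the optional-dependency pattern above reduces to this: the bind/unbind callbacks swap the service in and out, and every use of it must tolerate its absence. A minimal sketch follows; the SpellChecker interface here is a hypothetical stand-in (the real one lives in the spell.services bundle), and the class and method names around it are mine:

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical stand-in for the SpellChecker service interface from the
// tutorial; the real interface is provided by the spell.services bundle.
interface SpellChecker {
    List<String> check(String passage);
}

// Sketch of an optional-dependency consumer: bind/unbind swap the service
// in and out, and callers must handle the case where none is bound.
public class SpellCheckerClient {
    private SpellChecker checker; // null when no provider is bound

    public synchronized void bindSpellChecker(SpellChecker sp) {
        checker = sp;
    }

    public synchronized void unbindSpellChecker(SpellChecker sp) {
        checker = null;
    }

    /** Returns misspelled words, or null when no checker is available. */
    public synchronized List<String> check(String passage) {
        return (checker == null) ? null : checker.check(passage);
    }

    public static void main(String[] args) {
        SpellCheckerClient client = new SpellCheckerClient();
        System.out.println(client.check("helo")); // null: no checker bound yet
        client.bindSpellChecker(new SpellChecker() {
            public List<String> check(String passage) {
                // Toy stub: report every word as misspelled.
                return Arrays.asList(passage.split(" "));
            }
        });
        System.out.println(client.check("helo wrld")); // [helo, wrld]
    }
}
```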


Starting the emulator


The application uses SD card storage, so first create an SD card image:

mksdcard size filename

Then, launch the emulator (emulator must be accessible from your path) with:

emulator -shell -sdcard dev/android/sdcard1.iso

The emulator starts. When the shell is ready, change the access permissions on /data/dalvik-cache:

chmod 777 /data/dalvik-cache

These permissions are needed to correctly load bundles.

Installing the application


To install the application (the .apk file), use either the Eclipse plugin (the .apk file can be unsigned) or the adb command (the .apk file must be signed).
Once signed, you can deploy your application with:

adb install application.apk

Once installed, the application is available in the Android emulator menu.

If you launch the application, a black screen appears … The SpellChecker application is not deployed!
To deploy it, the bundle files must be placed in the /data/felix/bundles folder.
adb push SpellCheckGui\ for\ Android.jar /data/felix/bundles/SpellcheckGui.jar
cd bundles
adb push spell.services.jar /data/felix/bundles/spell.services.jar

If you launch the application, the GUI appears, but you can’t use the check button: no spell checker service is available. Push the other jars into the same folder.
adb push spell.checker.jar \
/data/felix/bundles/spell.checker.jar
adb push spell.english.jar \
/data/felix/bundles/spell.english.jar

Once done, the check button becomes available. If you remove the English dictionary, the check button becomes disabled; pushing it back re-enables it:
adb shell rm /data/felix/bundles/spell.english.jar
adb push spell.english.jar /data/felix/bundles/spell.english.jar


Conclusion


This very simple application has shown that it is possible to create dynamic applications on top of Android with OSGi and iPOJO. It sounds really promising …



References