Sunday, November 06, 2005

Automate Your Builds and Tests

This is going to be an ice-breaking post on my blog after a long silence. I couldn't write a post for some time as I was a bit busy with a lot of work after coming back to Sri Lanka.

Over the last few days I was involved in automating the build process of a J2EE project at Eurocenter. The project basically has a Struts-based web front end, an EJB/JDBC-based db/business tier and a web service (XML over HTTP) interface for mobile devices. We have come up with an automation plan which will be implemented within the next month, and it seems to be progressing well currently.

So I thought of sharing some details of the automation implementation of our project, as it might help someone else in automating their products. As the first step we have to have a rough plan of our automation process. The plan used in our project goes as follows:

Write a one-step build script: Use Ant or Ant variants to come up with a build file which will generate the complete build in one command. You may even consider using Ruby build scripts (Rake) for this purpose. (But we should be careful in selecting an immature technology for a core part of a project like the build system.)
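A minimal sketch of such a one-step build file is shown below. The project name, directory layout and target names are illustrative, not taken from our actual project:

```xml
<!-- build.xml: complete build with a single command -- "ant dist" -->
<project name="myapp" default="dist" basedir=".">
    <property name="src.dir"   value="src"/>
    <property name="build.dir" value="build"/>
    <property name="dist.dir"  value="dist"/>

    <!-- always build from scratch so stale classes never leak in -->
    <target name="clean">
        <delete dir="${build.dir}"/>
        <delete dir="${dist.dir}"/>
    </target>

    <target name="compile" depends="clean">
        <mkdir dir="${build.dir}/classes"/>
        <javac srcdir="${src.dir}" destdir="${build.dir}/classes"/>
    </target>

    <target name="dist" depends="compile">
        <mkdir dir="${dist.dir}"/>
        <jar destfile="${dist.dir}/myapp.jar" basedir="${build.dir}/classes"/>
    </target>
</project>
```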

Add unit-test execution to the script: You should carefully select the unit tests that are properly written to run as standalone tests. If you have followed my previous posts, I have discussed some techniques explaining how to make your code testable. We can have an Ant task which runs the unit test cases/suites on the compiled code.
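Using Ant's built-in junit task, that step might look something like the following sketch (the classpath entries and the *Test naming convention are assumptions for illustration):

```xml
<!-- Run all *Test classes against the compiled code; haltonfailure
     makes the build break as soon as a test fails. -->
<target name="unit-test" depends="compile">
    <junit printsummary="yes" haltonfailure="yes">
        <classpath>
            <pathelement location="${build.dir}/classes"/>
            <pathelement location="lib/junit.jar"/>
        </classpath>
        <formatter type="xml"/>
        <batchtest todir="${build.dir}/reports">
            <fileset dir="${build.dir}/classes" includes="**/*Test.class"/>
        </batchtest>
    </junit>
</target>
```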

Come up with a push-button release: Once we have the build script we can extend it to produce single-command releases. Basically this involves checking out the code from a CVS tag and running the build task. This extended build script can be used by the build manager in producing builds for QA testing as well as producing production builds.
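One way to sketch such a release target with Ant's cvs task is shown below. The cvsroot, release tag and module name are placeholders:

```xml
<!-- Push-button release: check out a tagged snapshot from CVS
     and run the normal build on it. -->
<target name="release">
    <cvs cvsRoot=":pserver:builder@cvsserver:/cvsroot"
         command="checkout -r REL_1_0 myapp"
         dest="release-work"/>
    <ant dir="release-work/myapp" target="dist"/>
</target>
```

Building from a tag (rather than the head) means any QA or production build can be reproduced later from the exact same sources.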

Schedule builds with an automation engine: Here we go with the actual automation. First we need to decide on a build engine. In the open source world there are several good automation engines, such as ‘CruiseControl’, ‘AntHill’ and ‘BuildBot’. We have decided to go with ‘CruiseControl’ as it seems to be the most popular choice in the Java community. In this step we configure our build engine to check out all the code from CVS periodically (only if any modification has been made after the last build), build the product and run the unit tests. Once we have completed this step we can track any build failures (such as compile errors) and unit test failures. In our project we have scheduled builds to run every hour, and this helps capture introduced bugs within a period of one hour.
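A CruiseControl configuration for this kind of hourly CVS-driven build could look roughly like the fragment below (project name, paths and intervals are illustrative):

```xml
<!-- config.xml (fragment): poll CVS, and rebuild every hour
     only when modifications are found since the last build. -->
<cruisecontrol>
    <project name="myapp">
        <modificationset quietperiod="60">
            <cvs localworkingcopy="checkout/myapp"/>
        </modificationset>
        <schedule interval="3600">
            <ant buildfile="checkout/myapp/build.xml" target="unit-test"/>
        </schedule>
    </project>
</cruisecontrol>
```

The quietperiod guards against building while a developer is in the middle of committing a multi-file change.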

Establish reporting mechanisms: Throughout the process we should have basic notification mechanisms, such as email and a web console, for reporting build status and failure notifications. All the build engines support these basic reporting mechanisms and all we have to do is configure those services to suit our needs. For example, if a build fails we can configure ‘CruiseControl’ to send notification emails to the developers who have committed code to CVS after the last successful build.
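In CruiseControl this is done with a publisher. A sketch using the htmlemail publisher follows; all hostnames, addresses and paths are placeholders:

```xml
<!-- Publishers fragment: mail an HTML build report after each build.
     By default the developers who modified the code since the last
     good build are notified on failure. -->
<publishers>
    <htmlemail mailhost="smtp.example.com"
               returnaddress="build@example.com"
               buildresultsurl="http://buildserver:8080/buildresults/myapp"
               css="webapps/cruisecontrol/css/cruisecontrol.css"
               xsldir="webapps/cruisecontrol/xsl">
        <failure address="dev-team@example.com"/>
    </htmlemail>
</publishers>
```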

Automate integration testing: Here we attempt sub-component integration in a test environment. In our project we wanted to make sure that the product components integrate and are deployable on the server. The integration testing will ensure that all the components are properly deployable (e.g. EJB testing) and that the component communication channels (e.g. JMS queue testing, database connections) are established correctly. Most of the tests in this stage can be considered in-container testing, and test frameworks like ‘Cactus’ are good candidates to support the testing in this stage. In this stage we run ‘JBoss’ as our application server, then deploy our application there and run the integration test scripts.
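An Ant sketch of how this stage might be orchestrated is below. The deploy directory, wait time, context URL and test-class layout are all assumptions for illustration, not our actual setup:

```xml
<!-- Integration-test stage (sketch): hot-deploy the EAR to a running
     JBoss instance, then run the in-container (Cactus) suite. -->
<target name="integration-test" depends="dist">
    <copy file="${dist.dir}/myapp.ear"
          todir="${jboss.home}/server/default/deploy"/>
    <!-- give the server a moment to pick up the new deployment -->
    <sleep seconds="30"/>
    <junit printsummary="yes" haltonfailure="yes">
        <classpath refid="cactus.classpath"/>
        <!-- tell Cactus where the deployed web application lives -->
        <sysproperty key="cactus.contextURL"
                     value="http://localhost:8080/myapp"/>
        <batchtest>
            <fileset dir="${build.dir}/itest-classes"
                     includes="**/*IntegrationTest.class"/>
        </batchtest>
    </junit>
</target>
```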

Automate functional/acceptance testing: In this stage we add end-to-end testing to our automation. In our case this basically means performing tests on the web UI and web service interfaces. These test cases are written based on end-user actions (i.e. based on use cases). An example test case is ‘a user logs in to the system and changes his password’. In our product, ‘HttpUnit’ test cases are written to simulate and validate user actions. Wherever response time is critical we may add performance test scripts written using ‘JMeter’, ‘JUnitPerf’, etc. to ensure system response time.
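Since HttpUnit tests are ordinary JUnit tests, this stage can reuse the junit task and then drive JMeter in non-GUI mode. A sketch, with placeholder paths and test-plan names:

```xml
<!-- Acceptance-test stage (sketch): run the HttpUnit-based suites
     against the deployed application, then a JMeter test plan in
     non-GUI mode ("-n") for the response-time checks. -->
<target name="acceptance-test" depends="integration-test">
    <junit printsummary="yes" haltonfailure="yes">
        <classpath refid="httpunit.classpath"/>
        <batchtest>
            <fileset dir="${build.dir}/atest-classes"
                     includes="**/*AcceptanceTest.class"/>
        </batchtest>
    </junit>
    <exec executable="${jmeter.home}/bin/jmeter" failonerror="true">
        <arg line="-n -t tests/login-response-time.jmx -l ${build.dir}/jmeter-results.log"/>
    </exec>
</target>
```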

Add code analyzers: Here, with each build, a code analyzer will inspect the code for “smelly” code pieces. With each build the team lead will get a code analyzer report on any added/changed code fragments.
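One way to sketch this with the Checkstyle Ant task (the rules file and jar path are placeholders; a tool like PMD could be wired in the same way):

```xml
<!-- Code-analysis stage (sketch): run Checkstyle over the sources and
     write an XML report; failOnViolation="false" keeps findings from
     breaking the build -- they go to the team lead's report instead. -->
<target name="analyze">
    <taskdef resource="checkstyletask.properties"
             classpath="lib/checkstyle-all.jar"/>
    <checkstyle config="checkstyle-rules.xml" failOnViolation="false">
        <fileset dir="${src.dir}" includes="**/*.java"/>
        <formatter type="xml" toFile="${build.dir}/checkstyle-report.xml"/>
    </checkstyle>
</target>
```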

After coming up with an automation plan we can start implementing it. Once implemented, we get a set of actions which run in sequence in every scheduled build cycle. To give you a better understanding, I will summarize the action steps we have in our project's build cycle.

1. Checkout sources from CVS
2. Build the application from scratch
3. Execute build verification tests (unit tests)
4. Run code analyzer tool scripts
5. Install a ‘JBoss’ server instance
6. Configure the ‘JBoss’ instance
7. Create the test database
8. Populate the test database with test data
9. Deploy the product to the JBoss server/file system
10. Start up the server
11. Monitor log files for successful server startup
12. Execute integration test scripts
13. Execute acceptance test scripts
14. Shut down the server
15. Clean up the build resources
16. Report build/test results
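The steps above can be wired together as a single master target whose dependency chain mirrors the cycle; the build engine then only needs to invoke this one target on each scheduled run. All the target names here are illustrative:

```xml
<!-- Master target (sketch): the whole build cycle as one dependency
     chain, invoked hourly by the build engine. -->
<target name="build-cycle"
        depends="checkout, dist, unit-test, analyze,
                 prepare-server, prepare-database, deploy,
                 start-server, integration-test, acceptance-test,
                 stop-server, clean-up, report"/>
```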

Once we have got all the basics in place, it is time to play around with extreme automation experiments. For example, I have heard of some project teams having flashing red bulbs in the development environment if a build fails. Trying these kinds of extreme practices can raise team morale towards testing/automation, as well as catch the attention of the rest of the project teams (in a good sense).

