Unfortunately, one of the drawbacks of the approach described is that it requires you to write some custom JDBC code to clear out your tables after each unit test. From the article:
statement.executeUpdate("delete from Batting");
statement.executeUpdate("delete from Fielding");
statement.executeUpdate("delete from Pitching");
statement.executeUpdate("delete from Player");
connection.commit();
Another possible option for handling this scenario, if you are using Hibernate 3, is to use its bulk update/delete feature. I haven't used this feature yet, but apparently one wrinkle is that bulk deletes don't cascade to child tables. Either approach requires that you remember to update your cleanup code each time you add a new table to your application, or you end up with problems in your unit tests due to data left over from the previous test method.
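For illustration, a Hibernate 3 bulk delete is just an HQL delete statement executed through a Session. Here is a rough sketch, not code from the article: it assumes a sessionFactory variable is in scope, uses the entity names from the quoted example, and deletes the child entities before the parent because bulk deletes bypass cascade settings.

Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
// Bulk deletes don't cascade, so clear the child tables before the parent
session.createQuery("delete from Batting").executeUpdate();
session.createQuery("delete from Fielding").executeUpdate();
session.createQuery("delete from Pitching").executeUpdate();
session.createQuery("delete from Player").executeUpdate();
tx.commit();
session.close();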
But in my opinion there's a better way: leverage Hibernate's SchemaExport utility to automatically wipe out and recreate all your tables for you. The basic idea is that the setUp method of your unit test calls the SchemaExport utility to drop the existing tables and recreate them. This leaves you with the proper tables and a clean slate upon which to create exactly the objects needed to run your test. It simplifies your unit testing and avoids having to maintain both the mapping files and the bulk delete code in tandem.
To accomplish this, you need to make sure your Hibernate mapping files specify enough information for Hibernate's SchemaExport utility to generate and run database-specific DDL to recreate your tables, indexes, foreign keys, etc. For example, you might have a mapping file for a SimplePerson class that looks like this:
<?xml version="1.0"?>
<!DOCTYPE hibernate-mapping PUBLIC
    "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
    "http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd">
<hibernate-mapping package="com.acme.project">
    <class name="SimplePerson" table="PERSONS">
        <id name="id" type="int">
            <column name="CODE" not-null="true"/>
            <generator class="native"/>
        </id>
        <property name="firstName" column="FIRST_NAME" not-null="true"/>
        <property name="lastName" column="LAST_NAME" not-null="true"/>
        <property name="contactEmail" column="CONTACT_EMAIL_ADDRESS"/>
        <property name="disabled" type="boolean">
            <column name="DISABLE" default="0" not-null="true"/>
        </property>
    </class>
</hibernate-mapping>
You need to make sure you have information about the SQL types of your mapped columns so Hibernate knows how to generate the tables, like this:
<?xml version="1.0"?>
<!DOCTYPE hibernate-mapping PUBLIC
    "-//Hibernate/Hibernate Mapping DTD 3.0//EN"
    "http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd">
<hibernate-mapping package="com.acme.project">
    <class name="SimplePerson" table="PERSONS">
        <id name="id" type="int">
            <column name="CODE" sql-type="integer" not-null="true"/>
            <generator class="native"/>
        </id>
        <property name="firstName" column="FIRST_NAME" not-null="true"/>
        <property name="lastName" column="LAST_NAME" not-null="true"/>
        <property name="contactEmail" column="CONTACT_EMAIL_ADDRESS"/>
        <property name="disabled" type="boolean">
            <column name="DISABLE" sql-type="int" default="0" not-null="true"/>
        </property>
    </class>
</hibernate-mapping>
The sql-type attributes are what tell Hibernate how to generate DDL for your table. Once you have this, you can invoke the SchemaExport utility on your in-memory HSQLDB database and it will wipe out the old tables and their contents and recreate them empty.
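Outside of Spring, the invocation is straightforward. Here is a minimal sketch; the in-memory HSQLDB connection settings shown are assumptions for illustration, so adjust them to match your own configuration.

// Build a Configuration pointing at an in-memory HSQLDB database
Configuration cfg = new Configuration()
        .addResource("com/acme/project/SimplePerson.hbm.xml")
        .setProperty("hibernate.dialect", "org.hibernate.dialect.HSQLDialect")
        .setProperty("hibernate.connection.driver_class", "org.hsqldb.jdbcDriver")
        .setProperty("hibernate.connection.url", "jdbc:hsqldb:mem:testdb")
        .setProperty("hibernate.connection.username", "sa")
        .setProperty("hibernate.connection.password", "");

// Drop any existing tables and recreate them from the mappings
// (first argument echoes the DDL to stdout, second runs it against the database)
new SchemaExport(cfg).create(true, true);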
The only remaining catch is invoking SchemaExport when you're using the Spring framework. If you're using Spring outside of any other container, you might use a LocalSessionFactoryBean to do your Hibernate mapping work, like this:
<bean id="mySessionFactory" class="org.springframework.orm.hibernate3.LocalSessionFactoryBean">
    <property name="mappingResources">
        <list>
            <value>com/acme/project/SimplePerson.hbm.xml</value>
            <!-- ... -->
        </list>
    </property>
    <property name="configLocation" value="hibernateConfig.xml"/>
</bean>
This is great, except for one thing: you need access to the Hibernate Configuration object in order to invoke SchemaExport, and it is not easily available (from what I can find). There is a nice workaround for this problem, though. If you prefix a bean name with an ampersand ('&') when calling getBean, Spring gives you the factory bean itself instead of the object it creates. In this case, LocalSessionFactoryBean has a public getConfiguration method that will give us the Hibernate Configuration object we want. So we can implement a setUp method that contains code like this:
// The '&' prefix asks Spring for the LocalSessionFactoryBean itself, not the SessionFactory it produces
LocalSessionFactoryBean l = (LocalSessionFactoryBean) factory.getBean("&mySessionFactory");
Configuration cfg = l.getConfiguration();
// Drop and recreate all mapped tables; 'false' suppresses echoing the DDL, 'true' runs it against the database
SchemaExport schemaExport = new SchemaExport(cfg);
schemaExport.create(false, true);
Now, every test method in your unit test will have a clean database with all the right tables, indexes, etc.
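Putting the pieces together, a complete test class might look something like this. It's a minimal sketch, assuming JUnit 3 and a Spring context file named testContext.xml that defines the mySessionFactory bean; both of those names are assumptions, not from the original article.

import junit.framework.TestCase;
import org.hibernate.cfg.Configuration;
import org.hibernate.tool.hbm2ddl.SchemaExport;
import org.springframework.beans.factory.BeanFactory;
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.orm.hibernate3.LocalSessionFactoryBean;

public class SimplePersonMappingTest extends TestCase {

    // Hypothetical Spring context file containing the mySessionFactory bean definition
    private static final BeanFactory factory =
            new ClassPathXmlApplicationContext("testContext.xml");

    protected void setUp() throws Exception {
        super.setUp();
        // The '&' prefix returns the factory bean itself, not the SessionFactory it produces
        LocalSessionFactoryBean factoryBean =
                (LocalSessionFactoryBean) factory.getBean("&mySessionFactory");
        Configuration cfg = factoryBean.getConfiguration();
        // Drop and recreate every mapped table so each test method starts with a clean slate
        new SchemaExport(cfg).create(false, true);
    }

    // ... test methods that save and load SimplePerson objects ...
}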
One side benefit is that the tests will likely run faster than they would against an external database. More importantly, by avoiding dependencies on an external database you improve the reliability of your Hibernate mapping unit tests.
6 comments:
The ampersand is neat; I was not aware of that. Could you also just map the SchemaExport in Spring, so you just get it directly? (I suppose the downside is now your Spring mapping files would have a SchemaExport bean, which you probably only need during unit testing.)
I know far too little about Hibernate and Spring to judge the majority of this but I did want to note that dbunit has some nice utilities for loading and restoring data during unit tests.
What Alex said--we use dbunit for this. It can be a little tricky because you have to reset tables in the right order to avoid constraint errors.
Good post. Thought I'd offer another variation on this I use. Our data modeling tool (E/R studio) can produce schema drop/create scripts, so we script db creation/delete with ant sql tasks, but use hibernate to actually create test fixtures. Spring takes care of rolling back all the unit tests so the database is always clean.
--JL
First off, it's arguable whether or not a test that deals with Hibernate and the database is indeed a "unit" test. At any rate, for the type of test you want to do, I agree with Alex -- DbUnit is the way to go. Barring that, using transactions and rolling them back instead of committing them is another popular way to do this (see Spring). Dropping and recreating the tables for each test seems like overkill. Of course, that's just my opinion. I could be wrong.
Jamie
DBUnit would be the way to go, but I feel you are moving into the realm of functional testing vs. unit testing. If you are unit testing a single DAO, this is out of scope. I would suggest renaming these to functional tests.