
Source control your database with Liquibase

I recently started to use Liquibase on a project to keep track of the database changes in our Java Enterprise application, and I must say I like the way it works. It makes deploying my application (or a new release of it) to another environment easier and (more) foolproof. In the past I had to supply a database script to the DBA, which had to be executed right before or after I redeployed my EAR/WAR file, with all the issues that come with that procedure (the script fails, the DBA is not available, etc.). I won’t say there can’t be issues with this solution, but from a developer’s perspective it does make life easier.

Here is how it works for a straightforward Maven project with some web services backed by a MySQL database. Since I deploy the web services in a WAR file on JBoss, I chose to have the Liquibase scripts triggered by a ServletContextListener instance. To be able to test the database scripts without having to deploy the application, I also embedded the Maven plugin for Liquibase in my pom file. This way I can run the Liquibase scripts manually with Maven against my local development database.

First add the necessary dependencies to the Maven pom.xml:

<dependency>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-core</artifactId>
    <version>2.0.5</version>
</dependency>
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>5.1.15</version>
    <scope>provided</scope>
</dependency>

The MySQL dependency is only necessary to be able to run the Liquibase scripts with Maven; the application itself uses a datasource that is managed by the container, in this case JBoss. To run the Liquibase scripts with Maven I defined the plugin like this:

<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version>2.0.1</version>
    <configuration>
        <changeLogFile>db.changelogs/db.changelog-master.xml</changeLogFile>
        <driver>com.mysql.jdbc.Driver</driver>
        <url>jdbc:mysql://localhost:3306/MyDB</url>
        <username>admin-user</username>
        <password>*****</password>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>update</goal>
            </goals>
        </execution>
    </executions>
</plugin>
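
A small side note on the credentials in this configuration: if I am not mistaken the plugin can also read its settings from a Liquibase properties file through a propertyFile element, which keeps the password out of the pom. A sketch (the file name and location are just an example, not the setup of this project):

<configuration>
    <!-- assumption: an external properties file with the standard Liquibase keys -->
    <propertyFile>src/main/resources/liquibase.properties</propertyFile>
</configuration>

with a ‘liquibase.properties’ containing the usual keys (changeLogFile, driver, url, username, password).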

As I said, I execute the scripts by triggering Liquibase with a servlet listener. To set this up, add the following elements to the ‘web.xml’ inside the WAR file:

...
    <context-param>
        <param-name>liquibase.changelog</param-name>
        <param-value>db.changelogs/db.changelog-master.xml</param-value>
    </context-param>
    <context-param>
        <param-name>liquibase.datasource</param-name>
        <param-value>java:jboss/datasources/LiquibaseDS</param-value>
    </context-param>

    <listener>
        <listener-class>liquibase.integration.servlet.LiquibaseServletListener</listener-class>
    </listener>
...

Here we see that I defined two parameters:

    • liquibase.changelog

This parameter refers to the classpath location of the master changelog file. It is important that this path matches the one used in the Maven plugin if you are running against the same database (Liquibase stores the changelog path with each changeset in the DATABASECHANGELOG table, so a different path makes already executed changesets look new). See this issue, which I ran into when using different paths.

    • liquibase.datasource

This one refers to the JNDI name of the datasource to be used for the Liquibase scripts. Since this database user needs the rights to create database objects like tables, I suggest creating a separate datasource for the application itself. That datasource should connect with a user that only has rights to access the data and cannot create or drop objects.
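
For completeness, here is a sketch of how the ‘LiquibaseDS’ datasource could be defined in the JBoss configuration (AS 7 style standalone.xml; the connection settings and the ‘com.mysql’ driver module name are assumptions that mirror the Maven plugin configuration above):

<datasource jndi-name="java:jboss/datasources/LiquibaseDS" pool-name="LiquibaseDS" enabled="true">
    <!-- same MySQL database as in the Maven plugin configuration (assumption) -->
    <connection-url>jdbc:mysql://localhost:3306/MyDB</connection-url>
    <driver>mysql</driver>
    <security>
        <!-- user with DDL rights, used only by Liquibase -->
        <user-name>admin-user</user-name>
        <password>*****</password>
    </security>
</datasource>
<drivers>
    <!-- 'com.mysql' stands for whatever module name the MySQL driver was installed under -->
    <driver name="mysql" module="com.mysql"/>
</drivers>

The application’s own datasource would look the same but connect with the restricted user.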

The final part is the creation of the changelog files. As suggested here as a best practice, I refer to a ‘master’ changelog file in the configuration, and in this master changelog file I refer to the individual scripts. For example, I have the following ‘master’ script in my ‘resources’ folder:

<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-2.0.xsd">
    <include file="db.changelog-1.0.xml" relativeToChangelogFile="true" />
    <include file="db.changelog-2.0.xml" relativeToChangelogFile="true" />
</databaseChangeLog>
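
These changelog files live in the ‘resources’ folder (in my case something like src/main/resources/db.changelogs), so they end up on the classpath of both the WAR and the Maven plugin. For a next release I would simply append another include to this master file, for example (the file name is of course hypothetical at this point):

    <include file="db.changelog-3.0.xml" relativeToChangelogFile="true" />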

In the ‘db.changelog-1.0.xml’ script I create some tables with an XML-based changeset:

<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
        xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-2.0.xsd">
    <changeSet id="1" author="pascalalma">
        <createTable tableName="TEST_TABLE">
            <column name="ID" type="bigint(20)" autoIncrement="true">
                <constraints primaryKey="true" nullable="false"/>
            </column>
            <column name="CODE" type="varchar(20)">
                <constraints nullable="false" unique="true"/>
            </column>
            <column name="DESCRIPTION" type="varchar(200)"/>
            <column name="VALUE" type="varchar(200)"/>
        </createTable>
        <addUniqueConstraint columnNames="CODE,DESCRIPTION"
                             constraintName="CODE_UK"
                             tableName="TEST_TABLE"/>
        <createTable tableName="ANOTHER_TEST_TABLE">
            <column name="ID" type="bigint(20)" autoIncrement="true">
                <constraints primaryKey="true" nullable="false"/>
            </column>
            <column name="CODE" type="varchar(20)">
                <constraints nullable="false" unique="true"/>
            </column>
            <column name="TEST_CODE" type="bigint(20)">
            </column>
        </createTable>
        <addForeignKeyConstraint baseColumnNames="TEST_CODE"
                                 baseTableName="ANOTHER_TEST_TABLE"
                                 constraintName="TEST_OTHER_FK"
                                 referencedColumnNames="CODE"
                                 referencedTableName="TEST_TABLE"/>
    </changeSet>
</databaseChangeLog>

In this example script I show how to create tables with a unique constraint on a combination of columns and how to add foreign keys to them.
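
One thing to keep in mind: once a changeset has been executed it should not be changed anymore; later modifications go into new changesets. As an illustration (not an actual changeset from this project), a follow-up changeset that adds a column and declares an explicit rollback could look like this:

    <changeSet id="3" author="pascalalma">
        <!-- hypothetical example: add a column in a later release instead of editing changeset 1 -->
        <addColumn tableName="TEST_TABLE">
            <column name="ACTIVE" type="boolean" defaultValueBoolean="true"/>
        </addColumn>
        <!-- Liquibase can usually derive the rollback for simple changes; it can also be stated explicitly -->
        <rollback>
            <dropColumn tableName="TEST_TABLE" columnName="ACTIVE"/>
        </rollback>
    </changeSet>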

In ‘db.changelog-2.0.xml’ I show how to load data into the created tables by using a CSV file as input:

<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
        xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-2.0.xsd">
    <changeSet id="2" author="pascalalma">
    <loadData file="db.changelogs/mydata.csv"
              schemaName="MyDB"
              quotchar="'"
              tableName="TEST_TABLE"/>
    </changeSet>
</databaseChangeLog>
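
Liquibase tries to guess the column types from the CSV content. If that guess is wrong they can, as far as I know, be declared explicitly with nested column elements inside loadData; a sketch of the same change with explicit types:

    <loadData file="db.changelogs/mydata.csv"
              schemaName="MyDB"
              quotchar="'"
              tableName="TEST_TABLE">
        <!-- assumption: the generic Liquibase load types (STRING, NUMERIC, DATE, BOOLEAN) -->
        <column name="CODE" type="STRING"/>
        <column name="DESCRIPTION" type="STRING"/>
        <column name="VALUE" type="STRING"/>
    </loadData>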

In the ‘mydata.csv’ I define the rows to be uploaded into the table:

CODE,DESCRIPTION,VALUE
'T01','Not specified','FOO'
'T02','Wrongly specified','BAR'
'T03','Correct','FOO-BAR'

The first line of the CSV file lists the columns in the order in which the fields appear in the other rows. This is a very convenient way to load initial data into your database tables. When all this is in place you can run Liquibase from Maven: build the project first with ‘mvn clean install’ and then run ‘mvn liquibase:update’:

[Screenshot: console output of ‘mvn liquibase:update’]

Now if we look at the result in the database schema we see the following tables:

[Screenshot: the resulting tables in the MySQL schema]

We see that our tables have been created and loaded with the data. We also see two more tables: ‘DATABASECHANGELOG’ and ‘DATABASECHANGELOGLOCK’. These are used by Liquibase to determine the state of the database. If I deploy the WAR file with a datasource pointing at this database, Liquibase will see that the scripts have already run and won’t execute them again. If I remove the tables, however, the scripts will be executed as if it were a new database. As you can see, this can simplify the deployment of new releases of your software, at least the database part.
 

Pascal Alma

Pascal is a senior JEE developer and architect at 4Synergy in The Netherlands. He has been designing and building J2EE applications since 2001. He is particularly interested in the open source tool stack (Mule, Spring Framework, JBoss) and in technologies like web services, SOA and cloud. Specialties: JEE, SOA, Mule ESB, Maven, cloud technology, Amazon AWS.