ApacheCON NA Vancouver 2011

I’m in Vancouver for ApacheCON NA 2011, where I’ll be speaking on Friday at 14:30 about DevOps: “From Dev to DevOps”, my take on DevOps for people like me coming from the dev side, interested in DevOps and automation, and in expanding the dev lifecycle all the way to deployment to production. I have previously posted the slides from another event and will post the updated ones after the talk.

If you are at the conference, come over and say hi!

The DevOps movement aims to improve communication between developers and operations teams to solve critical issues such as fear of change and risky deployments. But just as Agile development would likely fail without continuous integration tools, the DevOps principles need tools to make them real and to provide the automation required to actually implement them. Most of the so-called DevOps tools focus on the operations side, but there should be more than that: the automation must cover the full process, from Dev to QA to Ops, and be as automated and agile as possible. Tools in each part of the workflow have evolved in their own silos, with the support of their own target teams. But a true DevOps mentality requires a seamless process from the start of development to production deployment and maintenance, and for a process to be successful there must be tools that take the burden off humans.

Apache Maven has arguably been the most successful tool for development, project standardization, and automation introduced in recent years. On the operations side we have open source tools like Puppet or Chef that are becoming increasingly popular for automating infrastructure maintenance and server provisioning.

In this presentation we will introduce an end-to-end development-to-production process that takes advantage of Maven and Puppet, each at its strong points, and of open source tools to automate the handover between them. The result is continuous build and deployment, continuous delivery, from source code to any number of application servers managed with Puppet, running either on physical hardware or in the cloud, handling new continuous integration builds and releases automatically through several stages and environments such as development, QA, and production.

From Dev to DevOps slides from Agile Spain

Updated slides from my “From Dev to DevOps” presentation at the Conferencia Agile Spain 2011 in Castellon on October 20th. Thanks to all the attendees for the questions and feedback!

Just some updates on Vagrant, VeeWee, Geppetto, and Puppet-Maven. Next stop, ApacheCON Vancouver!

UPDATE The video is also available (in Spanish) at the UJI web server as Flash and WMV.

From Dev to DevOps slides from Apache Barcamp Spain

Here are the slides from my “From Dev to DevOps” presentation at the Apache Barcamp Spain in Seville on October 8th. Not all that useful without the talking and hand waving 🙂

I’ll be presenting it again at the Agile Spain conference on Thursday, with new slides and some more info on Vagrant, VeeWee, Geppetto, and Puppet-Maven. Just ten days after the last talk, and things evolve really fast! Then, on to present at ApacheCON in Vancouver.

I’ll hopefully find the time to publish more here at some point. In the meantime, there’s a good summary of the tools, Setup your devops playground with Puppet, Vagrant & co by Arnaud Heritier.

Finding duplicate classes in your WAR files with Tattletale

Have you ever seen all sorts of weird errors when running your webapp because several of the bundled JAR files contain the same classes in different versions, and the wrong one is picked up by the application server?

Using the JBoss Tattletale tool and its Tattletale Maven plugin you can easily find out whether you have duplicate classes in your WAR’s WEB-INF/lib folder and, most importantly, fail the build automatically if that’s the case, before it’s too late and you get bitten in production.

Just add the following plugin configuration to the build/plugins section of your WAR POM. It can also be used for EAR, assembly, and other types of projects.

<plugin>
  <groupId>org.jboss.tattletale</groupId>
  <artifactId>tattletale-maven</artifactId>
  <version>1.1.0.Final</version>
  <executions>
    <execution>
      <phase>verify</phase> <!-- needs to run after WAR package has been built -->
      <goals>
        <goal>report</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <source>${project.build.directory}/${project.build.finalName}/WEB-INF/lib</source>
    <destination>${project.reporting.outputDirectory}/tattletale</destination>
    <reports>
      <report>jar</report>
      <report>multiplejars</report>
    </reports>
    <profiles>
      <profile>java6</profile>
    </profiles>
    <failOnWarn>true</failOnWarn>
    <!-- excluding some jars, if jar name contains any of these strings it won't be analyzed -->
    <excludes>
      <exclude>persistence-api-</exclude>
      <exclude>xmldsig-</exclude>
    </excludes>
  </configuration>
</plugin>

You will need to add the JBoss Maven repository to your POM’s repositories section, or to your repository manager. Make sure you use the repository that contains only JBoss artifacts, or you may experience conflicts between artifacts in that repo and the Maven Central repo.

Adding extra repositories is a common source of problems and makes builds slower (all repos are queried for artifacts). What I do is add an Apache Archiva proxy connector with a whitelist entry for org/jboss/** so the repo is only queried for org.jboss.* groupIds.
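For reference, the resulting proxy connector in Archiva’s archiva.xml looks roughly like this. This is a sketch from memory: the element names and repository ids are assumptions, and in practice you would set it up through the Archiva admin UI rather than editing the file by hand.

```xml
<proxyConnector>
  <sourceRepoId>internal</sourceRepoId>
  <targetRepoId>jboss</targetRepoId>
  <whiteListPatterns>
    <!-- only forward requests for org.jboss.* groupIds to this repo -->
    <whiteListPattern>org/jboss/**</whiteListPattern>
  </whiteListPatterns>
</proxyConnector>
```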

<repository>
  <id>jboss</id>
  <url>https://repository.jboss.org/nexus/content/repositories/releases</url>
  <releases>
    <enabled>true</enabled>
  </releases>
  <snapshots>
    <enabled>false</enabled>
  </snapshots>
</repository>

GPG, Maven and OS X

GPG on the Mac has always been a bit of an issue: several choices to install and hard to configure. Now it seems that native GPG tools for OS X are back to life with the GPGTools project, which provides a single easy-to-use installer.

GPGTools is an open source initiative to bring OpenPGP to Apple OS X in the form of a single installer package

So I installed the package, logged out and in again for the PATH changes to take effect, and got the agent up and running by executing

gpg-agent --daemon

(It will be started automatically when you restart.)

Now, to configure Maven to use this GPG2 version and the GPG agent, I added a profile to my ~/.m2/settings.xml:

    <profile>
      <id>gpg</id>
      <activation>
        <activeByDefault>true</activeByDefault>
      </activation>
      <properties>
        <gpg.useagent>true</gpg.useagent>
        <gpg.executable>gpg2</gpg.executable>
      </properties>
    </profile>

This way the agent only prompts for the GPG key passphrase once per session, and Maven uses the right gpg executable.
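The gpg.useagent and gpg.executable properties are picked up by the Maven GPG plugin, which is typically bound to the verify phase in the POM of the project being signed. A minimal sketch (the version shown is just an example):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-gpg-plugin</artifactId>
  <version>1.1</version>
  <executions>
    <execution>
      <id>sign-artifacts</id>
      <phase>verify</phase>
      <goals>
        <!-- signs all the project's attached artifacts -->
        <goal>sign</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With the agent running, `mvn verify` will ask for the passphrase at most once and sign everything after that.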

Speaking at Javagruppen, the Danish JUG annual conference

The guys at Javagruppen, the Danish JUG, are doing their annual conference on February 11th and 12th.

The theme for this year is “Java, a cloudy affair”, and I’ll be speaking on building and testing in the cloud, using Apache Maven, Continuum, TestNG, Selenium,… and how to take full advantage of cloud features for software development, aligned with my previous talks.

This year the conference will be at a 5-star hotel and spa in the middle of Denmark, and I gotta say I look forward to it; it seems they know how to choose a location (last year they did it at a castle).

You can still sign up if you want to go.

Comwell Kellers Park

Github forking after cloning

When you want to change or contribute to other people’s projects that you don’t have write access to, you usually fork the project and then work with your read+write fork.

What if you first cloned their repo and made local commits that you now want to contribute? You don’t want to mess with patches, so here’s what I did to contribute a small fix to the example project from Apache Maven 2: Effective Implementation, a great book by Brett Porter and Maria Odea Ching.

  1. Fork the project repo at GitHub (at https://github.com/brettporter/centrepoint/)
  2. In your local clone, rename the remote origin to upstream
  3. Add a new remote called origin pointing to the read+write fork
  4. Change the master branch remote to origin instead of upstream
  5. Fetch the new remote and push your changes
git remote rename origin upstream
git remote add origin git@github.com:carlossg/centrepoint.git
git fetch origin
git branch --set-upstream master origin/master
git push origin
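The remote-renaming steps can be tried end to end in a throwaway repository. A sketch, where the URLs are placeholders (and git config --get is used to read remote URLs, since it works on older git versions too):

```shell
dir=$(mktemp -d)
cd "$dir"
git init -q

# pretend this was the repo we originally cloned from
git remote add origin https://example.com/brettporter/centrepoint.git

# rename the original remote to "upstream"...
git remote rename origin upstream
# ...and add our fork as the new "origin"
git remote add origin git@github.com:you/centrepoint.git

# both remotes are now in place
git config --get remote.upstream.url
git config --get remote.origin.url
```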

Maven 3.0 released!

Maven 3.0 is finally out after a long, long time in the works!

What’s new?

Behind the scenes a LOT has changed, but as a Maven user or plugin developer you shouldn’t see many differences. In particular, backwards compatibility was a must for this release.

New features include:

  • Better POM validation and warning/error messages. Pay attention to the beginning of the build, where you can see notices about your POM configuration.
  • Parallel builds. Use several threads to build multi-module projects, analyzing the dependencies between modules to determine the ordering.
  • Stability and predictability. Changes in classloading, dependency ordering, and multi-module building make builds behave more consistently.

Changes include:

  • No more site plugin as you know it. Configure the new Maven Site plugin, or better, install Sonar (highly recommended).
  • profiles.xml is no longer used
  • Maven 1 repository layouts are no longer supported

Read all the release notes and compatibility notes.

Upgrade!

  1. Download Maven 3.
  2. Check compatibility notes.
  3. Upgrade the plugins to compatible versions if needed.
  4. Configure the new Maven Site plugin, or move to Sonar.

See other notes on Maven 3 from Brett Porter; if you are going to ApacheCON, he will also be giving a training session covering Maven 3.0.

Maven, Amazon EC2 and SpringSource Cloud Foundry

You may have heard about the just announced SpringSource Cloud Foundry and how it is based on the CloudTools project, which includes a Maven plugin to deploy Java EE applications to Amazon EC2, starting the images as part of the build process.

Some time ago I started another Maven plugin, the Amazon EC2 Maven plugin, which allows you to start and stop EC2 AMIs as part of your build process. Unlike CloudTools, it’s a lower level plugin that can start any AMI, a very different goal.

My use case? Starting Selenium Grid Remote Control images for different environments and browsers before the integration tests start, waiting for the images to be online, running the integration tests, and shutting down the images. Check my previous Enterprise Build and Test in the Cloud entry for more details.

You could also have AMIs with your web server, database,… pre-installed: start them, deploy to any container of your choice using the Maven Cargo plugin, and shut down the image at the end of the tests.

The plugin allows all the configuration options that the EC2 API does, because it’s based on the Typica EC2 library: start any number of images, associate elastic IPs, choose availability zones,…

Hope you find it useful.

Functional testing with Maven, Cargo and Selenium

Setting up automated functional integration tests is not too hard if you have the right tools. It can take a bit of time to set up, but in the long run you’ll benefit from reduced QA times, reduced risk, a more confident development team, the ability to do safe refactorings, and many more advantages.

I’m going to explain how Maven, Selenium, Cargo, and JBoss 4.2 can be set up to run automatically in a continuous integration server such as Continuum, customizing the server configuration as needed and deploying any webapp automatically. Every time the webapp changes, the CI server will execute the tests against the latest version, ensuring you are always in a safe state.

The biggest difference from other tutorials I’ve found is that most of them cover just Jetty and are not updated to the latest versions of the libraries and tools, so here is my contribution.

Architecture

  • A new project is set up with a dependency on the WAR project to be tested. A dependency on the Selenium Java client is also required.
  • Cargo downloads and installs the application server (JBoss)
  • We copy any required configuration and libraries (e.g. the JDBC driver)
  • Cargo starts the application server
  • The Selenium server is started
  • Surefire executes the JUnit tests that interact with the Selenium server and test the running app
  • Cargo stops the app server

We use profiles to enable different combinations of browser and application server. By default Cargo uses Jetty.

Configurations and their profiles:

  • JBoss 4.2 and Firefox (default): -Pjboss42x,firefox
  • JBoss 4.2 and Internet Explorer: -Pjboss42x,iexplore
  • Jetty and Firefox: -Pfirefox
  • Jetty and Internet Explorer: -Piexplore

The POM

Dependencies

<dependencies>
    <dependency>
      <groupId>com.acme</groupId>
      <artifactId>mywebapp</artifactId>
      <version>${project.version}</version>
      <type>war</type>
    </dependency>
    <!-- the jdbc driver we need to copy to the appserver -->
    <dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
    </dependency>
    <dependency>
      <groupId>org.openqa.selenium.client-drivers</groupId>
      <artifactId>selenium-java-client-driver</artifactId>
      <version>1.0-SNAPSHOT</version> <!-- required for Firefox 3, otherwise use 1.0-beta-1 -->
      <scope>test</scope>
    </dependency>
  </dependencies>

Properties used in several places

Ports, where to uncompress the application server,…

<properties>
    <cargo.install.directory>${project.build.directory}/installs</cargo.install.directory>
    <selenium.port>14444</selenium.port>
    <servlet.port>18880</servlet.port>
    <selenium.background>true</selenium.background>
  </properties>

Plugin configuration

JDBC driver

Copy the MySQL JDBC driver to the app server lib folder

<plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-dependency-plugin</artifactId>
        <executions>
          <execution>
            <id>copy-jdbc-lib</id>
            <phase>generate-resources</phase>
            <goals>
              <goal>copy-dependencies</goal>
            </goals>
            <configuration>
              <includeGroupIds>mysql</includeGroupIds>
              <outputDirectory>${lib.target}</outputDirectory>
            </configuration>
          </execution>
        </executions>
      </plugin>

Cargo

Install the application server in an early phase so we can customize it with our configuration files (see the profiles below). Then start it before the integration tests and stop it afterwards. Parameters are used so that different profiles can use different application servers.

      <plugin>
        <groupId>org.codehaus.cargo</groupId>
        <artifactId>cargo-maven2-plugin</artifactId>
        <executions>
          <execution>
            <id>install</id>
            <phase>generate-resources</phase>
            <goals>
              <goal>install</goal>
            </goals>
          </execution>
          <execution>
            <id>start-container</id>
            <phase>pre-integration-test</phase>
            <goals>
              <goal>start</goal>
            </goals>
            <configuration>
              <wait>false</wait>
            </configuration>
          </execution>
          <execution>
            <id>stop-container</id>
            <phase>post-integration-test</phase>
            <goals>
              <goal>stop</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <container>
            <containerId>${container.name}</containerId>
            <zipUrlInstaller>
              <url>${container.url}</url>
              <installDir>${cargo.install.directory}/${container.name}</installDir>
            </zipUrlInstaller>
            <log>${project.build.directory}/logs/${container.name}.log</log>
            <output>${project.build.directory}/logs/${container.name}.out</output>
            <timeout>600000</timeout>
          </container>
          <configuration>
            <!--
            <home>${project.build.directory}/${container.name}conf</home>
            <type>existing</type>
            -->
            <properties>
              <cargo.servlet.port>${servlet.port}</cargo.servlet.port>
              <cargo.jboss.configuration>default</cargo.jboss.configuration>
              <cargo.rmi.port>1099</cargo.rmi.port>
            </properties>

            <deployables>
              <!-- application to deploy -->
              <deployable>
                <groupId>com.acme</groupId>
                <artifactId>mywebapp</artifactId>
                <type>war</type>
                <properties>
                  <context>acontext</context>
                </properties>
              </deployable>
            </deployables>
          </configuration>
        </configuration>
      </plugin>

Selenium

Make Surefire skip tests during the test phase and run them in the integration-test phase instead. Pass some properties as system properties so they are accessible from the JUnit test case.

      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <configuration>
          <!-- Skip the normal tests, we'll run them in the integration-test phase -->
          <skip>true</skip>
          <systemProperties>
            <property>
              <name>browser</name>
              <value>${browser}</value>
            </property>
            <property>
              <name>servlet.port</name>
              <value>${servlet.port}</value>
            </property>
            <property>
              <name>selenium.port</name>
              <value>${selenium.port}</value>
            </property>
          </systemProperties>
        </configuration>
        <executions>
          <execution>
            <phase>integration-test</phase>
            <goals>
              <goal>test</goal>
            </goals>
            <configuration>
              <skip>false</skip>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>selenium-maven-plugin</artifactId>
        <!-- to run headless in a Unix server with a virtual framebuffer X server Xvfb
             you need to call first the goal selenium:xvfb ie. "mvn clean selenium:xvfb install"
             see http://mojo.codehaus.org/selenium-maven-plugin/examples/headless-with-xvfb.html -->
        <executions>
          <execution>
            <id>start-selenium</id>
            <phase>pre-integration-test</phase>
            <goals>
              <goal>start-server</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <background>${selenium.background}</background>
          <port>${selenium.port}</port>
          <logOutput>true</logOutput>
        </configuration>
      </plugin>

Application server profiles

We configure a different profile for each application server, with some application-server-specific settings.

<profiles>
    <profile>
      <id>jboss42x</id>
      <activation>
        <activeByDefault>true</activeByDefault>
      </activation>
      <properties>
        <container.name>jboss42x</container.name>
        <container.url>http://internap.dl.sourceforge.net/sourceforge/jboss/jboss-4.2.1.GA.zip</container.url>
        <jboss.version>4.2.1.GA</jboss.version>
        <jboss.conf.directory>${cargo.install.directory}/${container.name}/jboss-${jboss.version}/jboss-${jboss.version}/server/default</jboss.conf.directory>
        <lib.target>${jboss.conf.directory}/deploy/lib</lib.target>
        <war.target>${jboss.conf.directory}/deploy</war.target>
      </properties>

      <dependencies>
        <dependency>
          <groupId>org.jboss</groupId>
          <artifactId>jboss</artifactId>
          <version>${jboss.version}</version>
          <type>zip</type>
          <scope>test</scope>
        </dependency>
      </dependencies>
      <build>
        <plugins>
          <!-- copy to the application server directory any customized configuration files that we need -->
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-antrun-plugin</artifactId>
            <executions>
              <execution>
                <phase>process-resources</phase>
                <configuration>
                  <tasks>
                    <copy todir="${jboss.conf.directory}" overwrite="true">
                      <fileset dir="${basedir}/src/test/${container.name}"/>
                    </copy>
                  </tasks>
                </configuration>
                <goals>
                  <goal>run</goal>
                </goals>
              </execution>
            </executions>
          </plugin>
        </plugins>
      </build>
    </profile>

Browser profiles

As with the application servers, we have a profile for each browser:

<profile>
      <id>firefox</id>
      <activation>
        <activeByDefault>true</activeByDefault>
      </activation>
      <properties>
        <browser>*firefox</browser>
      </properties>
    </profile>
    <profile>
      <id>iexplore</id>
      <properties>
        <browser>*iexplore</browser>
      </properties>
    </profile>
    <profile>
      <id>otherbrowser</id>
      <properties>
        <browser>*custom ${browserPath}</browser>
      </properties>
    </profile>

Enabling testing during development

Make Selenium not run in the background so we can execute tests from the IDE:

    <profile>
      <id>dev</id>
      <properties>
        <selenium.background>false</selenium.background>
      </properties>
    </profile>

Repositories

Required for Selenium dependencies

<repositories>
    <repository>
      <id>openqa.org</id>
      <name>OpenQA Repository</name>
      <url>http://archiva.openqa.org/repository/releases</url>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
      <releases>
        <enabled>true</enabled>
      </releases>
    </repository>
    <!-- for selenium 1.0-SNAPSHOT -->
    <repository>
      <id>snapshots.openqa.org</id>
      <name>OpenQA Snapshots Repository</name>
      <url>http://archiva.openqa.org/repository/snapshots</url>
      <snapshots>
        <enabled>true</enabled>
      </snapshots>
      <releases>
        <enabled>false</enabled>
      </releases>
    </repository>
  </repositories>

Running in the build server

On a Unix server without X running you can still run Selenium tests using Xvfb (a virtual framebuffer X server) by calling selenium:xvfb, provided it’s properly configured.

You can also pass the path to the browser binary if it is not in the PATH:

mvn clean selenium:xvfb install -Dbrowser="*firefox /usr/lib64/firefox-1.5.0.12/firefox-bin"

The JUnit test

public class SeleniumHelloWorldTest
    extends TestCase
{
    private DefaultSelenium selenium;

    private String baseUrl;

    @Override
    public void setUp()
        throws Exception
    {
        super.setUp();
        String port = System.getProperty( "servlet.port" );
        baseUrl = "http://localhost:" + port;
        selenium = createSeleniumClient( baseUrl );
        selenium.start();
    }

    @Override
    public void tearDown()
        throws Exception
    {
        selenium.stop();
        super.tearDown();
    }

    protected DefaultSelenium createSeleniumClient( String url )
        throws Exception
    {
        String browser = System.getProperty( "browser" );
        String port = System.getProperty( "selenium.port" );
        return new DefaultSelenium( "localhost", Integer.parseInt( port ), browser, url );
    }

    public void testHelloWorld()
        throws Exception
    {
        selenium.open( baseUrl + "/mycontext/" );
        assertTrue( selenium.isTextPresent( "acme" ) );
    }
}

Debugging and troubleshooting (update)

You can check the JBoss logs in target/logs/jboss42x.out and the Selenium server logs in target/selenium/server.log.

References

Other wiki entries and blogs