Integration Testing with HTTP, HTTPS and Maven

Earlier this week, I was tasked with getting automated integration tests working in my project at Overstock.com. By automated, I mean the ability to run "mvn install" and have the following process happen:

  • Start a container
  • Deploy the application
  • Run all integration tests
  • Stop the container

Since it makes sense for integration tests to run in Maven's integration-test phase, I first configured the maven-surefire-plugin to skip tests in the test phase and execute them in the integration-test phase. I used the <id>default-test</id> and <id>default-integration-test</id> execution ids to override the plugin's usual behavior.

<plugin>
  <artifactId>maven-surefire-plugin</artifactId>
  <executions>
    <execution>
      <id>default-test</id>
      <configuration>
        <excludes>
          <exclude>**/*Test*.java</exclude>
        </excludes>
      </configuration>
    </execution>
    <execution>
      <id>default-integration-test</id>
      <phase>integration-test</phase>
      <goals>
        <goal>test</goal>
      </goals>
      <configuration>
        <includes>
          <include>**/*Test.java</include>
        </includes>
        <excludes>
          <exclude>none</exclude>
          <exclude>**/TestCase.java</exclude>
        </excludes>
      </configuration>
    </execution>
  </executions>
</plugin>

After I had this working, I moved on to getting the container started and stopped properly. In the past, I've done this using Cargo and it's always worked well for me. Apart from the usual setup I use in AppFuse archetypes (example pom.xml), I added a couple of additional items (a rough sketch of the combined configuration follows the list):

  • Added <timeout>180000</timeout> so the container would wait up to 3 minutes for the WAR to deploy.
  • In configuration/properties, specified <context.path>ROOT</context.path> so the app would deploy at the / context path.
  • In configuration/properties, specified <cargo.protocol>https</cargo.protocol> since many existing unit tests made requests to secure resources.
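
Putting those together, a rough sketch of the Cargo plugin block might look like the following. The container id is just an example, and the placement of <timeout> and <context.path> simply follows the bullets above; the rest comes from the usual AppFuse-style setup.

<plugin>
  <groupId>org.codehaus.cargo</groupId>
  <artifactId>cargo-maven2-plugin</artifactId>
  <configuration>
    <container>
      <containerId>tomcat6x</containerId>
      <!-- wait up to 3 minutes for the WAR to deploy -->
      <timeout>180000</timeout>
    </container>
    <configuration>
      <properties>
        <!-- deploy the app at the / context path -->
        <context.path>ROOT</context.path>
        <!-- many existing tests request secure resources -->
        <cargo.protocol>https</cargo.protocol>
      </properties>
    </configuration>
  </configuration>
</plugin>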

I started by using Cargo with Tomcat and had to create a certificate keystore in order to get Tomcat to start with SSL enabled. After getting it to start, I found the tests failed with the following errors in the logs:

javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: 
PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: 
unable to find valid certification path to requested target
	at com.sun.net.ssl.internal.ssl.Alerts.getSSLException(Alerts.java:174)
	at com.sun.net.ssl.internal.ssl.SSLSocketImpl.fatal(SSLSocketImpl.java:1649)

Co-workers told me this was easily solved by adding my 'untrusted' cert to my JVM keystore. Once all this was working, I thought I was good to go, but found that some tests were still failing. The failures turned out to be because those tests were talking to http, while https was the only protocol enabled. After doing some research, I discovered that Cargo doesn't support starting on both http and https ports.

So back to the drawing board I went. I ended up turning to the maven-jetty-plugin and the tomcat-maven-plugin to get the functionality I was looking for. I also automated the certificate keystore generation using the keytool-maven-plugin. Below is the extremely-verbose 95-line profiles section of my pom.xml that allows either container to be used.

Sidenote: I wonder how this same setup would look using Gradle?

<profiles>
  <profile>
    <id>jetty</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <build>
      <plugins>
        <plugin>
          <groupId>org.mortbay.jetty</groupId>
          <artifactId>maven-jetty-plugin</artifactId>
          <version>6.1.26</version>
          <configuration>
            <contextPath>/</contextPath>
            <connectors>
              <connector implementation="org.mortbay.jetty.nio.SelectChannelConnector">
                <!-- forwarded == true interprets x-forwarded-* headers -->
                <!-- http://docs.codehaus.org/display/JETTY/Configuring+mod_proxy -->
                <forwarded>true</forwarded>
                <port>8080</port>
                <maxIdleTime>60000</maxIdleTime>
              </connector>
              <connector implementation="org.mortbay.jetty.security.SslSocketConnector">
                <forwarded>true</forwarded>
                <port>8443</port>
                <maxIdleTime>60000</maxIdleTime>
                <keystore>${project.build.directory}/ssl.keystore</keystore>
                <password>overstock</password>
                <keyPassword>overstock</keyPassword>
              </connector>
            </connectors>
            <stopKey>overstock</stopKey>
            <stopPort>9999</stopPort>
          </configuration>
          <executions>
            <execution>
              <id>start-jetty</id>
              <phase>pre-integration-test</phase>
              <goals>
                <goal>run-war</goal>
              </goals>
              <configuration>
                <daemon>true</daemon>
              </configuration>
            </execution>
            <execution>
              <id>stop-jetty</id>
              <phase>post-integration-test</phase>
              <goals>
                <goal>stop</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
  <profile>
    <id>tomcat</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.codehaus.mojo</groupId>
          <artifactId>tomcat-maven-plugin</artifactId>
          <version>1.1</version>
          <configuration>
            <addContextWarDependencies>true</addContextWarDependencies>
            <fork>true</fork>
            <path>/</path>
            <port>8080</port>
            <httpsPort>8443</httpsPort>
            <keystoreFile>${project.build.directory}/ssl.keystore</keystoreFile>
            <keystorePass>overstock</keystorePass>
          </configuration>
          <executions>
            <execution>
              <id>start-tomcat</id>
              <phase>pre-integration-test</phase>
              <goals>
                <goal>run-war</goal>
              </goals>
            </execution>
            <execution>
              <id>stop-tomcat</id>
              <phase>post-integration-test</phase>
              <goals>
                <goal>shutdown</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
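
The keytool-maven-plugin execution that generates ${project.build.directory}/ssl.keystore isn't shown in the profiles above. A minimal sketch, assuming the plugin's clean and genkey goals and a self-signed certificate (the phase, alias and dname values are assumptions; the passwords match the ones referenced in the profiles), might look like:

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>keytool-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>generate-keystore</id>
      <phase>generate-resources</phase>
      <goals>
        <goal>clean</goal>
        <goal>genkey</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <keystore>${project.build.directory}/ssl.keystore</keystore>
    <dname>cn=localhost</dname>
    <keypass>overstock</keypass>
    <storepass>overstock</storepass>
    <alias>tomcat</alias>
    <keyalg>RSA</keyalg>
  </configuration>
</plugin>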

With this setup in place, I was able to automate running our integration tests by simply typing "mvn install" (for Jetty) or "mvn install -Ptomcat" (for Tomcat). For running in Hudson, it's possible I'll have to further enhance things to randomize the port and pass that into tests as a system property. The build-helper-maven-plugin and its reserve-network-ports goal are a nice way to do this; a rough sketch follows. Note: if you want to run more than one instance of Tomcat at a time, you might have to randomize the ajp and rmi ports to avoid collisions.
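
A hedged sketch of that approach (the port names are hypothetical; the reserved values can then be handed to the container and the tests as system properties):

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>1.5</version>
  <executions>
    <execution>
      <id>reserve-ports</id>
      <phase>process-resources</phase>
      <goals>
        <goal>reserve-network-ports</goal>
      </goals>
      <configuration>
        <portNames>
          <portName>test.http.port</portName>
          <portName>test.https.port</portName>
        </portNames>
      </configuration>
    </execution>
  </executions>
</plugin>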

The final thing I encountered was that our app didn't shut down gracefully. Luckily, this was fixed in a newer version of our core framework and upgrading fixed the problem. Here's the explanation from an architect on the core framework team.

The hanging problem was caused by the way the framework internally aggregated statistics related to database connection usage and page response times. The aggregation runs on a separate thread but not as a daemon thread. Previously, the aggregation threads weren't being terminated on shutdown so the JVM would hang waiting for them to finish. In the new frameworks, the aggregation threads are terminated on shutdown.

Hopefully this post helps you test the secure and non-secure parts of your applications at the same time. I also hope it motivates the Cargo developers to add simultaneous http and https support. ;)

Update: In the comments, Ron Piterman recommended I use the Maven Failsafe Plugin because it's designed to run integration tests, while the Surefire Plugin is for unit tests. I changed my configuration to the following and everything still passes. Thanks Ron!

<plugin>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.7.2</version>
  <configuration>
    <skipTests>true</skipTests>
  </configuration>
</plugin>
<plugin>
  <artifactId>maven-failsafe-plugin</artifactId>
  <version>2.7.2</version>
  <configuration>
    <includes>
      <include>**/*Test.java</include>
    </includes>
    <excludes>
      <exclude>**/TestCase.java</exclude>
    </excludes>
  </configuration>
  <executions>
    <execution>
      <id>integration-test</id>
      <phase>integration-test</phase>
      <goals>
        <goal>integration-test</goal>
      </goals>
    </execution>
    <execution>
      <id>verify</id>
      <phase>verify</phase>
      <goals>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>

Update 2: In addition to application changes to solve hanging issues, I also had to change my Jetty Plugin configuration to use a different SSL connector implementation. This also required adding the jetty-sslengine dependency, which has been renamed to jetty-ssl for Jetty 7.

<connector implementation="org.mortbay.jetty.security.SslSelectChannelConnector">
...
<dependencies>
  <dependency>
    <groupId>org.mortbay.jetty</groupId>
    <artifactId>jetty-sslengine</artifactId>
    <version>6.1.26</version>
  </dependency>
</dependencies>

Posted in Java at Feb 11 2011, 03:54:16 PM MST 9 Comments

Making Code Generation Smarter with Maven

As you might've read in my last entry, I recently started a new gig with Overstock.com. On my first day, I was quickly immersed in the development process by joining the Conversion Team. The Conversion Team is responsible for developing the checkout UI and handling payments from customers. I quickly discovered Overstock was mostly a Linux + Eclipse shop and did my best to get my favorite Mac + IntelliJ + JRebel setup installed and configured. Thanks to my new Team Lead, I was able to get everything up and running the first day, as well as check in my first contribution: making mvn jetty:run work so I didn't have to use my IDE to deploy to Tomcat.

In setting up my environment, I couldn't help but notice that jetty:run took quite a while each time. Specifically, the build process took 45 seconds to start executing the Jetty plugin, then another 23 seconds to start up after that. The first suspicious thing I noticed was that the UI templates were being re-generated and compiled on each execution. The UI templating framework at Overstock is Jamon, which is described as follows:

Jamon is a text template engine for Java, useful for generating dynamic HTML, XML, or any text-based content. In a typical Model-View-Controller architecture, Jamon clearly is aimed at the View (or presentation) layer.

Because it is compiled to non-reflective Java code, and statically type-checked, Jamon is ideally suited to support refactoring of template-based UI applications. Using mock-objects-like functionality, Jamon also facilitates unit testing of the controller and view.

To generate .java files from .jamon templates, we use the Jamon Plugin for Maven. Remembering that the Maven Compiler Plugin has an incremental-compile feature, I turned to its source code to find out how to implement this in the Jamon plugin. I was pleasantly surprised to find the StaleSourceScanner. This class allows you to easily compare two files to see if the source needs to be re-examined for generation or compilation.

I noticed the Jamon Plugin had the following code to figure out which files it should generate into .java files:

private List<File> accumulateSources(File p_templateSourceDir)
{
  final List<File> result = new ArrayList<File>();
  if (p_templateSourceDir == null)
  {
    return result;
  }
  for (File f : p_templateSourceDir.listFiles())
  {
    if (f.isDirectory())
    {
      result.addAll(accumulateSources(f));
    }
    else if (f.getName().toLowerCase(Locale.US).endsWith(".jamon"))
    {
      String filePath = f.getPath();
       // FIXME !?
      String basePath = templateSourceDir().getAbsoluteFile().toString();
      result.add(new File(filePath.substring(basePath.length() + 1)));
    }
  }
  return result;
}

I changed it to be smarter and only generate changed templates with the following code:

private List<File> accumulateSources(File p_templateSourceDir) throws MojoExecutionException
{
  final List<File> result = new ArrayList<File>();
  if (p_templateSourceDir == null)
  {
    return result;
  }
  SourceInclusionScanner scanner = getSourceInclusionScanner( staleMillis );
  SourceMapping mapping = new SuffixMapping( ".jamon", ".java");

  scanner.addSourceMapping( mapping );

  final Set<File> staleFiles = new LinkedHashSet<File>();

  for (File f : p_templateSourceDir.listFiles())
  {
    if (!f.isDirectory())
    {
      continue;
    }

    try
    {
      staleFiles.addAll( scanner.getIncludedSources(f.getParentFile(), templateOutputDir()));
    }
    catch ( InclusionScanException e )
    {
      throw new MojoExecutionException(
        "Error scanning source root: \'" + p_templateSourceDir.getPath()
          + "\' " + "for stale files to recompile.", e );
    }
  }

  // Trim root path from file paths
  for (File file : staleFiles) {
    String filePath = file.getPath();
    String basePath = templateSourceDir().getAbsoluteFile().toString();
    result.add(new File(filePath.substring(basePath.length() + 1)));
  }
  return result;
}

This method references a getSourceInclusionScanner() method, which is implemented as follows:

protected SourceInclusionScanner getSourceInclusionScanner( int staleMillis )
{
  SourceInclusionScanner scanner;

  if ( includes.isEmpty() && excludes.isEmpty() )
  {
      scanner = new StaleSourceScanner( staleMillis );
  }
  else
  {
      if ( includes.isEmpty() )
      {
          includes.add( "**/*.jamon" );
      }
      scanner = new StaleSourceScanner( staleMillis, includes, excludes );
  }

  return scanner;
}

If you're using Jamon and its Maven Plugin, you can view my patch at SourceForge. If you're looking to include this functionality in your project, I invite you to look at the code I learned from in the Maven Compiler's AbstractCompilerMojo class.

After making this change, I was able to reduce the build execution time by over 50%. Now it takes 20 seconds to hit the Jetty plugin and 42 seconds to finish starting. Of course, in an ideal world, I'd like to get this down to 20 seconds or less. Strangely enough, the easiest way to do this seems to be simple: use Linux.

On the Linux desktop they provided me, it takes 12 seconds to hit the Jetty plugin and 23 seconds to finish starting. I'd like to think this is a hardware thing, but it only gets 20% faster on OS X when using an 8GB RAM + SSD machine (vs. a 4GB + 5400 RPM drive). Since Overstock has provided me with a 4GB MacBook Pro, I'm considering installing Ubuntu on it, just to see what the difference is.

[Photo: Sun over the Snowbird]

In related news, Overstock.com is looking to hire a whole bunch of Java Developers this year. The pictures of the new Provo office look pretty sweet. Of course, you can also work at HQ, which is a mere 25 minutes from some of the best skiing in the world. Personally, I think Colorado's powder is better, but I can't argue with the convenience of no traffic. In addition to full-time gigs, they've started hiring more remote contractors like myself, so they pretty much have something for everyone. So if you love Java, like to get some turns in before work, and aren't an asshole - you should apply and I'll try to hook you up.

Update: After writing this post, I received an email from Neil Hartner questioning these numbers. Basically, he was able to get his MacBook Pro to run just as fast as Linux. Turns out, the reason my Mac was so much slower was because JRebel was configured in my MAVEN_OPTS. JRebel's FAQ does state the following:

Does JRebel make the server start up slower?
JRebel needs to do more work on startup (search more places for classes and resources, instrument classes, etc), so some slowdown can be expected. If it's larger than 50% please contact [email protected].

Since it's right around 50% slower, I guess there's no reason to call them. My guess is the best thing to do is remove JRebel from MAVEN_OPTS, but have an alias that can enable it, or simply run it from your IDE.

Posted in Java at Jan 21 2011, 03:26:43 PM MST 7 Comments

AppFuse 2.1 Milestone 2 Released

I'm pleased to announce the 2nd milestone release of AppFuse 2.1. This release includes upgrades to all dependencies to bring them up-to-date with their latest releases. Most notable are Spring 3 and Struts 2.1. This release fixes many issues with archetypes and contains many improvements to support Maven 3. For more details on specific changes see the 2.1.0 M2 release notes.

What is AppFuse?
AppFuse is an open source project and application that uses open source frameworks to help you develop Web applications quickly and efficiently. It was originally developed to eliminate the ramp-up time when building new web applications. At its core, AppFuse is a project skeleton, similar to the one that's created by your IDE when you click through a wizard to create a new web project. If you use JRebel with AppFuse, you can achieve zero-turnaround in your project and develop features without restarting the server.

Release Details
Archetypes now include all the source for the web modules so using jetty:run and your IDE will work much smoother now. The backend is still embedded in JARs, enabling you to choose which persistence framework (Hibernate, iBATIS or JPA) you'd like to use. If you want to modify the source for that, add the core classes to your project or run "appfuse:full-source".

AppFuse comes in a number of different flavors. It offers "light", "basic" and "modular" archetypes. Light archetypes use an embedded H2 database and contain a simple CRUD example. In the final 2.1.0 release, the light archetypes will allow code generation like the basic and modular archetypes. Basic archetypes have web services using CXF, authentication from Spring Security and features including signup, login, file upload and CSS theming. Modular archetypes are similar to basic archetypes, except they have multiple modules which allow you to separate your services from your web project.

AppFuse provides archetypes for JSF, Spring MVC, Struts 2 and Tapestry 5. The light archetypes are available for these frameworks, as well as for Spring MVC + FreeMarker, Stripes and Wicket.

Please note that this release does not contain updates to the documentation. Code generation will work, but it's likely that some content in the tutorials won't match. For example, you can use annotations (vs. XML) for Spring MVC and Tapestry is a whole new framework. I'll be working on documentation over the next several weeks in preparation for the 2.1 final release.

For information on creating a new project, please see the QuickStart Guide.

If you have questions about AppFuse, please read the FAQ or join the user mailing list. If you find bugs, please create an issue in JIRA.

Thanks to everyone for their help contributing patches, writing documentation and participating on the mailing lists.

Posted in Java at Nov 15 2010, 03:28:57 PM MST 2 Comments

Versioning Static Assets with UrlRewriteFilter

A few weeks ago, a co-worker sent me an interesting email after talking with the Zoompf CEO at JSConf.

One interesting tip mentioned was how we querystring the version on our scripts and css. Apparently this doesn't always cache the way we expected it would (some proxies will never cache an asset if it has a querystring). The recommendation is to rev the filename itself.

This article explains how we implemented a "cache busting" system in our application with Maven and the UrlRewriteFilter. We originally used a querystring in our implementation, but switched to filenames after reading Souders' recommendation. That part was figured out by my esteemed colleague Noah Paci.

Our Requirements

  • Make the URL include a version number for each static asset URL (JS, CSS and SWF) that serves to expire a client's cache of the asset.
  • Insert the version number into the application so the version number can be included in the URL.
  • Use a random version number when in development mode (based on running without a packaged WAR) so that developers will not need to clear their browser cache when making changes to static resources. The random version number should match the production version number format, which is currently: x.y-SNAPSHOT-revisionNumber
  • When running in production, the version number/cachebust is computed once (when a Filter is initialized). In development, a new cachebust is computed on each request.

In our app, we're using Maven, Spring and JSP, but the latter two don't really matter for the purposes of this discussion.

Implementation Steps
1. First we added the buildnumber-maven-plugin to our project's pom.xml so the build number is calculated from SVN.

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>buildnumber-maven-plugin</artifactId>
    <version>1.0-beta-4</version>
    <executions>
        <execution>
            <phase>validate</phase>
            <goals>
                <goal>create</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <doCheck>false</doCheck>
        <doUpdate>false</doUpdate>
        <providerImplementations>
            <svn>javasvn</svn>
        </providerImplementations>
    </configuration>
</plugin>

2. Next we used the maven-war-plugin to add these values to our WAR's MANIFEST.MF file.

<plugin>
    <artifactId>maven-war-plugin</artifactId>
    <version>2.0.2</version>
    <configuration>
        <archive>
            <manifest>
                <addDefaultImplementationEntries>true</addDefaultImplementationEntries>
            </manifest>
            <manifestEntries>
                <Implementation-Version>${project.version}</Implementation-Version>
                <Implementation-Build>${buildNumber}</Implementation-Build>
                <Implementation-Timestamp>${timestamp}</Implementation-Timestamp>
            </manifestEntries>
        </archive>
    </configuration>
</plugin>

3. Then we configured a Filter to read the values from this file on startup. If this file doesn't exist, a default version number of "1.0-SNAPSHOT-{random}" is used. Otherwise, the version is calculated as ${project.version}-${buildNumber}.

private String buildNumber = null;

...
@Override
public void initFilterBean() throws ServletException {
    try {
        InputStream is = 
            servletContext.getResourceAsStream("/META-INF/MANIFEST.MF");
        if (is == null) {
            log.warn("META-INF/MANIFEST.MF not found.");
        } else {
            Manifest mf = new Manifest();
            mf.read(is);
            Attributes atts = mf.getMainAttributes();
            buildNumber = atts.getValue("Implementation-Version") + "-" + atts.getValue("Implementation-Build");
            log.info("Application version set to: " + buildNumber);
        }
     } catch (IOException e) {
        log.error("I/O Exception reading manifest: " + e.getMessage());
     }
}

...

    // If there was a build number defined in the war, then use it for
    // the cache buster. Otherwise, assume we are in development mode 
    // and use a random cache buster so developers don't have to clear 
    // their browser cache.
    requestVars.put("cachebust", buildNumber != null ? buildNumber : "1.0-SNAPSHOT-" + new Random().nextInt(100000));

4. We then used the "cachebust" variable and appended it to static asset URLs as indicated below.

<c:set var="version" scope="request" 
    value="${requestScope.requestConfig.cachebust}"/>
<c:set var="base" scope="request"
    value="${pageContext.request.contextPath}"/>

<link rel="stylesheet" type="text/css" 
    href="${base}/v/${version}/assets/css/style.css" media="all"/>

<script type="text/javascript" 
    src="${base}/v/${version}/compressed/jq.js"></script>

The injected /v/[CACHEBUSTINGSTRING]/(assets|compressed) prefix eventually has to map back to the actual asset, whose path does not include the first two elements of the URI. The application must strip these two elements to find the real resource. To do this, we use the UrlRewriteFilter. The UrlRewriteFilter is used (instead of Apache's mod_rewrite) so that when developers run locally (using mvn jetty:run) they don't have to configure Apache.
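
For reference, the corresponding <filter> declarations in web.xml look roughly like this (the filter names match the mappings shown in the next step; the classes are the standard UrlRewriteFilter and wro4j filter classes):

<filter>
    <filter-name>rewriteFilter</filter-name>
    <filter-class>org.tuckey.web.filters.urlrewrite.UrlRewriteFilter</filter-class>
</filter>

<filter>
    <filter-name>WebResourceOptimizer</filter-name>
    <filter-class>ro.isdc.wro.http.WroFilter</filter-class>
</filter>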

5. In our application, "/compressed/" is mapped to wro4j's WroFilter. In order to get UrlRewriteFilter and WroFilter to work with this setup, the WroFilter has to accept FORWARD and REQUEST dispatchers.

<filter-mapping>
    <filter-name>rewriteFilter</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>

<filter-mapping>
    <filter-name>WebResourceOptimizer</filter-name>
    <url-pattern>/compressed/*</url-pattern>
    <dispatcher>FORWARD</dispatcher>
    <dispatcher>REQUEST</dispatcher>
</filter-mapping>

Once this was configured, we added the following rules to our urlrewrite.xml to allow rewriting of any assets or compressed resource request back to its "correct" URL.

<rule match-type="regex">
    <from>^/v/[0-9A-Za-z_.\-]+/assets/(.*)$</from>
    <to>/assets/$1</to>
</rule>
<rule match-type="regex">
    <from>^/v/[0-9A-Za-z_.\-]+/compressed/(.*)$</from>
    <to>/compressed/$1</to>
</rule>
<rule>
    <from>/compressed/**</from>
    <to>/compressed/$1</to>
</rule>

Of course, you can also do this in Apache. This is what it might look like in your vhost.d file:

RewriteEngine    on
RewriteLogLevel  0
RewriteLog       /srv/log/apache22/app_rewrite_log
RewriteRule      ^/v/[.A-Za-z0-9_-]+/assets/(.*) /assets/$1 [PT]
RewriteRule      ^/v/[.A-Za-z0-9_-]+/compressed/(.*) /compressed/$1 [PT]

Whether it's a good idea to implement this in Apache or with the UrlRewriteFilter is up for debate. Since we can do it with the UrlRewriteFilter alone, the benefit of also doing it in Apache is questionable, especially since it duplicates the rewrite rules.

Posted in Java at Jun 04 2010, 09:27:42 AM MDT 4 Comments

What's New in Maven 3.0 with Matthew McCullough

Last night, I attended the Denver JUG meeting to hear some excellent talks by Matthew McCullough and Tim Berglund. I took notes during Matthew's talk, but my battery ran out before Tim's talk started. Below are my notes.

Matthew started out by describing the differences between Maven 2 and Maven 3. As he began, he emphasized it wasn't a beginner talk, but mostly for existing Maven users who understand how to read a pom.xml and such.

The Roadmap
Commits to Maven 3 have been happening for the last 3 years. Matthew is not an employee of Sonatype, but he mentioned their name quite a bit in his talk. Sonatype has hired several committers (7 that Matthew knows of by name) who now work on Maven 3 full-time. For compatibility with Maven 2, the project has 450 integration tests and they test it against hundreds of Maven 2 projects. Maven 3 has plugin classloader partitioning and a legacy simulation layer for old plugins.

The main improvement in Maven 3 is speed. It's been performance tuned to be 50% to 400% faster. Benchmarks (guaranteed by integration tests) include better disk I/O, network I/O, CPU and memory usage. Another new feature is extensibility, so Maven is now more of a library rather than just a command-line tool. There's a library and APIs that you can use to do the things that Maven does. Plexus has been replaced with Guice, and it's now much easier to embed Maven (Polyglot Maven and Maven Shell are examples of this).

Below are a number of other changes between Maven 2 and Maven 3.

  • Syntax: pom.xml still uses <modelVersion>4.0.0</modelVersion> so it can be a drop-in replacement for Maven 2 projects.
  • Validations: poms are heavily validated against common mistakes, warns when plugin versions are not specified (use mvn validate to see issues; pinning versions explicitly, as in the sketch after this list, avoids the warning), blocks duplicate dependencies (examined in same POM only, conflict resolution used otherwise).
  • Help URLs: wiki page URLs now shown for all error messages. One of the first Apache projects to do this.
  • Removals: profiles.xml external file support removed, Maven 1.0 repository support removed <layout>legacy</layout> (it's been 5 years since any commits to Maven 1).
  • Behavior: SNAPSHOTs always deployed with date-stamps, artifact resolution caching has been improved to do less checking (override with mvn <phase> -U).
  • Plugins: version auto-selection favors RELEASEs over SNAPSHOTs (opposite for Maven 2), versions cannot be specified as RELEASE or LATEST, plugins only resolved from <pluginRepository> locations.
  • See the Plugin Compatibility Matrix to see if your favorite plugins are compatible.
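
For example, pinning plugin versions explicitly in pluginManagement keeps Maven 3 quiet about unspecified versions (the plugin and version here are only an illustration):

<build>
    <pluginManagement>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>2.3.2</version>
            </plugin>
        </plugins>
    </pluginManagement>
</build>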

Maven 3 hopes to be a drop-in replacement for Maven 2, but non backwards-compatible changes will be happening in Maven 3.1. Its anticipated release is Q1 of 2011, and it will likely contain the following features.

  • "Mixins" for direct dependencies
  • Site plugin takes over <reporting>
  • Backwards compatibility by <modelVersion>
  • There's a good chance 3.1 breaks compatibility with legacy POMs

Another new thing in Maven 3 is Toolchain. Toolchain is a common way to configure your JDK for multiple plugins. There are only a handful of plugins that are toolchain-enabled. User toolchain definitions are defined in ~/.m2/toolchains.xml. To use different toolchains (JDKs), you specify a vendor and version as part of your plugin configuration.
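
A ~/.m2/toolchains.xml entry looks roughly like this (the vendor, version and JDK path are just example values):

<toolchains>
    <toolchain>
        <type>jdk</type>
        <provides>
            <version>1.6</version>
            <vendor>sun</vendor>
        </provides>
        <configuration>
            <jdkHome>/usr/lib/jvm/java-6-sun</jdkHome>
        </configuration>
    </toolchain>
</toolchains>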

Maven Shell is a high performance console that's a Maven 3 add-on. It's hosted at GitHub to make community contributions easier. It goes on your command line and it offers syntax highlighting and context-sensitive help (by typing ? at the command prompt).

Another major improvement in Maven 3 is Polyglot Maven. Tools like Gant and Buildr have made Maven look ancient, but they've also given it a good challenge. Maven 3 is likely to leapfrog these tools because of its ability to use different languages for your build configuration. Currently, 6 languages are supported. Polyglot Maven is a super-set distribution of Maven 3. It's not shipped with Maven 3 core because it contains all the other language implementations and is quite large. Polyglot Maven also contains a translate tool that allows you to convert any-to-any language. It has a DSL framework with Macros and Lifecycle Hooks. Macros allow for more concise syntax.

After talking about Polyglot Maven a bit, Matthew showed us a demo translating pom.xml to pom.yaml and then running the build. After that, he showed us examples of what a pom looks like when defined in Clojure, Scala and Groovy. Someone asked about file parsing performance and Matthew said different languages would cause a single-digit performance difference as part of your build process. Personally, I can't help but think any non-XML parser would be faster than the XML parser.

In regards to m2eclipse, a new drop (0.10) occurred a few weeks ago and it's one of the highest quality releases to date. It has major refactoring and many performance improvements.

For sample Maven projects see Matthew's Maven Samples.

I very much enjoyed Matthew's talk, both because of his presentation techniques and because he had a lot of good information. While I've tried Maven 3 and Shell in the past, I've been newly inspired to start using them again on a daily basis.

Tim's talk on Decision Making was also excellent. The biggest things I learned were that conflict is good (idea-wise, not personal) and things to look out for between teams (fault lines). Hopefully both Tim and Matthew post their slides so I can link to them here.

Posted in Java at May 13 2010, 03:54:21 PM MDT 1 Comment

Software Quality: The Quest for the Holy Grail?

This afternoon, I attended a session on software quality by Jesper Pedersen at TSSJS. Jesper is a Core Developer at JBoss by Red Hat. He's the project lead of JBoss JCA, Tattletale, Papaki and JBoss Profiler 2. He's also currently the chairman of the Boston JBoss User Group. In this session, Jesper hopes to define basic requirements for a development environment and offer ideas on how to clean up a messy project.

Is software quality a friend or a foe? By implementing software quality processes, are you introducing unnecessary overhead? Development platforms are different today. We write a lot more business-specific code. We depend on standard frameworks for the core functionality. We depend on vendors to implement the standards correctly. We also depend on vendors to provide the necessary integration layer.

Since the platform is now a large slice of the pie, we must make sure we know where the issue is located. We must have proper integration tests; we must manage dependencies. Today, we must treat dependencies as if they are part of the application.

Defining the platform for your project helps narrow down the dependencies for your project. The platform is composed of corporate guidelines, standards, vendors and backend systems that you have to integrate with. Documentation is key for a successful project. Key document types: User Guide, Developer Guide, API Guide, Architect Design, Implementation and Test.

It helps to define a project-wide configuration management system. Defining a code-formatting guide will add consistency to your source tree. Also make sure you have separate build, test and execution environments. Use a Maven repository for your dependencies, both to support your project's artifacts as well as vendor artifacts.

"Maven today is an industry standard." -- Jesper Pederson

Define your tool chain as you would for your application. Back your Maven repository with SCM tools like Subversion or Git. For testing, use JUnit (unit testing), Cobertura (test coverage) and Hudson (continuous integration). Furthermore, you can add Checkstyle and FindBugs to verify coding conventions and find basic issues with code.

For the build environment, you need to make sure your dependency metadata is correct. Also, make sure you use the best practices for your given build tool. For example, with Maven and Ivy, it's a good idea to extract the version numbers into a common area of your pom.xml/build.xml so you can easily view all the versions in use. After you've done this, you can externalize the version information from the environment. Watch out for transitive dependencies as they can be tricky. Make sure you know what gets pulled in. Use enforcers to approve/ban dependencies or turn it off (Ivy). You can also vote for MNG-2315. Finally, snapshot dependencies are evil: define your release process so that releases are easy.
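
As an illustration of extracting version numbers into one place in a pom.xml (the artifact and version here are only examples):

<properties>
    <spring.version>3.0.1.RELEASE</spring.version>
</properties>
...
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-context</artifactId>
    <version>${spring.version}</version>
</dependency>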

What can you do if your project is already a mess? Signs that your project is a mess: you look at your platform as a big black box, you use different dependencies than your deployment platform or you don't have integration tests for sub-systems or dependencies. To fix this, you can use a tool to get an overview of the entire project. Enter Tattletale.

Tattletale can give you an overview of your dependencies (Ant and Maven integration). It's a static analysis tool that doesn't depend on metadata, scanning your class files instead. Using Tattletale, you can produce a number of reports about your dependencies, what they're dependent on and what's dependent on you.

To maintain the lead in your project, make sure to define a checklist for each stage of your development cycle. Do reviews on documentation, architecture, component design and code. Enforce the rules of your project with your build system.
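
A minimal sketch of enforcing such rules with the Maven Enforcer Plugin (the banned artifact and version ranges are only examples):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <version>1.0-beta-1</version>
    <executions>
        <execution>
            <id>enforce-rules</id>
            <goals>
                <goal>enforce</goal>
            </goals>
            <configuration>
                <rules>
                    <bannedDependencies>
                        <excludes>
                            <exclude>commons-logging:commons-logging</exclude>
                        </excludes>
                    </bannedDependencies>
                    <requireMavenVersion>
                        <version>[2.2.1,)</version>
                    </requireMavenVersion>
                </rules>
            </configuration>
        </execution>
    </executions>
</plugin>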

Jesper's final thoughts:

  • Maintaining dependencies for a software project can be a tricky task.
  • Using an Open Source platform as the foundation will ease the investigation of issues and increase trust.
  • Defining a project-wide tool chain is key.
  • Enforce all the rules on the project (better up-front than "fixing it" afterwards)

As Dusty mentioned, this session had a lot of good (basic) information, but there wasn't much new stuff. My team is using many of the technologies and practices that Jesper mentioned. I guess that's validation that we're doing it right. I've heard of Tattletale, but never had a need for it since I haven't been on any "messy" projects recently.

Posted in Java at Mar 17 2010, 03:00:46 PM MDT 2 Comments

AppFuse 2.1 Milestone 1 Released

The AppFuse Team is pleased to announce the first milestone release of AppFuse 2.1. This release includes upgrades to all dependencies to bring them up-to-date with their latest releases. Most notable are Hibernate, Spring and Tapestry 5.

What is AppFuse?
AppFuse is an open source project and application that uses open source tools built on the Java platform to help you develop Web applications quickly and efficiently. It was originally developed to eliminate the ramp-up time found when building new web applications for customers. At its core, AppFuse is a project skeleton, similar to the one that's created by your IDE when you click through a wizard to create a new web project.

Release Details
Archetypes now include all the source for the web modules so using jetty:run and your IDE will work much smoother now. The backend is still embedded in JARs, enabling you to choose which persistence framework (Hibernate, iBATIS or JPA) you'd like to use. If you want to modify the source for that, add the core classes to your project or run appfuse:full-source.

In addition, AppFuse Light has been converted to Maven and has archetypes available. AppFuse provides archetypes for JSF, Spring MVC, Struts 2 and Tapestry 5. The light archetypes are available for these frameworks, as well as for Spring MVC + FreeMarker, Stripes and Wicket.

Other notable improvements:

Please note that this release does not contain updates to the documentation. Code generation will work, but it's likely that some content in the tutorials won't match. For example, you can use annotations (vs. XML) for dependency injection and Tapestry is a whole new framework. I'll be working on documentation over the next several weeks in preparation for Milestone 2.

AppFuse is available as several Maven archetypes. For information on creating a new project, please see the QuickStart Guide.

To learn more about AppFuse, please read Ryan Withers' Igniting your applications with AppFuse.

The 2.x series of AppFuse has a minimum requirement of the following specification versions:

  • Java Servlet 2.4 and JSP 2.0 (2.1 for JSF)
  • Java 5+

If you have questions about AppFuse, please read the FAQ or join the user mailing list. If you find bugs, please create an issue in JIRA.

Thanks to everyone for their help contributing code, writing documentation, posting to the mailing lists, and logging issues.

Posted in Java at Nov 19 2009, 07:16:36 AM MST 8 Comments

Testing GWT Libraries with Selenium and Maven

On Tuesday, I wrote about Running Hosted Mode in GWT Libraries. Today I added an additional module to our project to run Selenium tests against our GWT library. In the process, I discovered some things I needed to modify in my GWT library's pom.xml. I'm writing this post so others can use this setup to write GWT libraries and package them for testing with Selenium.

First of all, I noticed that when you're using the GWT Maven Plugin with a JAR project, it doesn't automatically run gwt:compile or gwt:test in the compile and test phases. I had to explicitly configure the compile goal to run in the compile phase. I also had to add <webappDirectory> to the configuration to compile the JavaScript files into the war directory.

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>gwt-maven-plugin</artifactId>
    <version>1.1</version>
    <configuration>
        <module>org.appfuse.gwt.core.CoreUI</module>
        <runTarget>index.html</runTarget>
        <webappDirectory>war</webappDirectory>
    </configuration>
    <executions>
        <execution>
            <phase>compile</phase>
            <goals>
                <goal>compile</goal>
            </goals>
        </execution>
    </executions>
</plugin>

To package the generated JavaScript and index.html in the JAR, I added the following <resources> section to the maven-resources-plugin configuration I mentioned in my previous post.

<resource>
    <directory>war</directory>
    <includes>
        <include>core.ui/**</include>
        <include>index.html</include>
    </includes>
</resource>

In addition, I discovered some javax.servlet.* classes in my JAR after running "mvn package". I believe this is caused by the GWT plugin sucking these in when it compiles my ProxyServlet. I excluded them by adding the maven-jar-plugin.

<plugin>
    <artifactId>maven-jar-plugin</artifactId>
    <configuration>
        <excludes>
            <exclude>javax/servlet/**</exclude>
        </excludes>
    </configuration>
</plugin>

After doing this, I was able to publish my JAR with all the contents I needed to run Selenium tests against it.

Testing the GWT Library with Selenium
The module that contains the Selenium tests is a WAR project that uses war overlays, Cargo and Selenium RC. You can read about the Maven setup I use for running Selenium tests in Packaging a SOFEA Application for Distribution.

The major difference when testing a JAR (vs. a WAR) is that I had to use the maven-dependency-plugin to unpack the JAR so its contents would get included in the WAR for testing. Below is the configuration I used to accomplish this:

<plugin>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>unpack</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>unpack</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <artifactItem>
                        <groupId>org.appfuse</groupId>
                        <artifactId>gwt-core</artifactId>
                        <version>1.0-SNAPSHOT</version>
                        <type>jar</type>
                        <overWrite>false</overWrite>
                        <excludes>META-INF/**,org/**,javax/**</excludes>
                    </artifactItem>
                </artifactItems>
                <outputDirectory>
                    ${project.build.directory}/${project.build.finalName}
                </outputDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>

Hopefully this will help you develop GWT libraries and run Selenium tests against them. If you have any suggestions for simplifying this configuration, please let me know.

NOTE: I did consider a couple of other options for running Selenium tests against our GWT library:

  1. Add something to the existing project that 1) creates a WAR and 2) fires up Cargo/Selenium in a profile to test it.
  2. Create the tests in a GWT (war) project that includes widgets from the library.

I decided on the solution documented above because it seemed like the best option.

Posted in Java at Nov 04 2009, 10:09:27 PM MST 2 Comments

Running Hosted Mode in GWT Libraries (when using Maven)

Earlier this year, I wrote about Modularizing GWT Applications with GWT-Maven. Fast forward 8 months and I'm still working with GWT and using this same technique. However, this time I'm working with the Maven GWT Plugin from Codehaus. In my last post, I wrote:

The results of modularizing your application are beneficial (shared code) and detrimental (you have to mvn install gwt-core whenever you make changes in shared classes). If you know of a way to configure the gwt-maven plugin to read sources from both gwt-core and gwt-webapp in hosted mode, I'd love to hear about it.

The good news is I found a solution for this, using the Build Helper Maven Plugin. The GWT Maven Plugin's Productivity tip for multi-project setup has more information on how to configure this (note: we use IntelliJ and Eclipse on my project and did not need to configure this in a profile).

All was fine and dandy with this configuration until I wanted to be able to run hosted mode to develop/test everything in my library before including it in my main project. Luckily, you can still run mvn gwt:run on a JAR project. However, when you configure your pom.xml so sources are included in your JAR, you run into an issue: your *.java files will be copied to war/WEB-INF/classes and hosted mode will use these files as source rather than the ones you're editing in src/main/java.

To solve this, I changed my pom.xml to do two things:

  • Only copy resources right before packaging (in the test phase).
  • When packaging is complete, delete the *.java files from war/WEB-INF/classes (using Ant).

Below is the XML I used to make this possible. Please let me know if you have a way to simplify this configuration.

<plugin>
    <artifactId>maven-resources-plugin</artifactId>
    <version>2.4.1</version>
    <executions>
        <execution>
            <phase>test</phase>
            <goals>
                <goal>copy-resources</goal>
            </goals>
            <configuration>
                <outputDirectory>${project.build.outputDirectory}</outputDirectory>
                <resources>
                    <resource>
                        <directory>src/main/java</directory>
                    </resource>
                    <resource>
                        <directory>src/main/resources</directory>
                    </resource>
                </resources>
            </configuration>
        </execution>
    </executions>
</plugin>
<plugin>
    <artifactId>maven-antrun-plugin</artifactId>
    <version>1.3</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>run</goal>
            </goals>
            <configuration>
                <tasks>
                    <delete>
                        <fileset dir="${project.build.outputDirectory}" includes="**/*.java"/>
                    </delete>
                </tasks>
            </configuration>
        </execution>
    </executions>
</plugin>

This solution seems to work pretty well. As far as developing your library in hosted mode, you'll need to configure two *.gwt.xml files, one that doesn't have an <entry-point> defined and one that does. Configure the one with the entry point as the <module> in your gwt-maven-plugin configuration.
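
A minimal sketch of the two module descriptors (the file names, the inherited GWT module and the entry-point class are illustrative assumptions):

<!-- MyLibrary.gwt.xml: the library module, with no entry point -->
<module>
    <inherits name="com.google.gwt.user.User"/>
    <source path="client"/>
</module>

<!-- MyLibraryShell.gwt.xml: the hosted-mode module; point the gwt-maven-plugin's <module> at this one -->
<module>
    <inherits name="com.yourcompany.MyLibrary"/>
    <entry-point class="com.yourcompany.client.Application"/>
</module>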

As a side note, I found a few issues with the 1.1 version of the Maven GWT Archetype. Below are the steps I used to fix these issues and upgrade to GWT 1.7.0 (I realize 1.7.1 is out, but gwt-dev-1.7.1-mac.jar doesn't exist in Maven central).

First, create a new project by running the following from the command line:

mvn archetype:generate \
  -DarchetypeGroupId=org.codehaus.mojo \
  -DarchetypeArtifactId=gwt-maven-plugin \
  -DarchetypeVersion=1.1 \
  -DgroupId=com.yourcompany \
  -DartifactId=gwt-project -Dversion=1.0-SNAPSHOT -B

After creating the project, you'll need to modify the pom.xml as follows:

  1. Change the gwt-maven-plugin's version to 1.1.
  2. Change the ${gwtVersion} property to 1.7.0.
  3. Add <runTarget>Application.html</runTarget> to the <configuration> element of the plugin.
  4. Move Application.html and web.xml so they're under the "war" directory.
  5. Update Application.html to prepend the GWT module name in the <script> tag.

I hope these instructions help you create modular GWT projects with Maven. This setup is working great on my current project.

Posted in Java at Nov 03 2009, 09:37:07 AM MST 3 Comments

Packaging a SOFEA Application for Distribution

The project I'm working on is a bit different from those I'm used to. I'm used to working on web applications that are hosted on servers and customers access with their browser. SaaS if you will. My current client is different. They're a product company that sells applications and distributes them to customers via download and CD. Their customers install these applications on internal servers (supported servers include WebSphere, WebLogic and Tomcat).

The product I'm currently working on is structured as a SOFEA application and therefore consists of two separate modules - a backend and a frontend. Since it's installed in a servlet container, both modules are WARs and can be installed separately.

Building the backend and frontend as separate projects makes a lot of sense for two reasons:

  • In development, different teams can work on the frontend and backend projects.
  • Having them as separate projects allows them to be versioned separately.

However, having them as two separate projects does make it a bit more difficult for distribution. I'm writing this post to show you how I recently added support for distributing our application as 2 WARs or 1 WAR using the power of Maven, war overlays and the UrlRewriteFilter.

Project Setup
First of all, we have several different Maven modules, but the most important ones are as follows:

  • product-services
  • product-client
  • product-integration-tests

Of course, our modules aren't really named "product", but you get the point. The services project is really just a WAR project with Spring Security configured. It depends on other JAR modules in which the services live. The client project is a GWT WAR that has a proxy servlet defined in its web.xml to make development easier. It also contains some UrlRewrite configuration that allows GWT Log's Remote Logging feature to work. The proxy servlet is something we don't want to ship with our product, so we have a separate web.xml for production vs. development. We do the substitution using the maven-war-plugin:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-war-plugin</artifactId>
    <version>2.0.2</version>
    <configuration>
        <!-- Production web.xml -->
        <webXml>src/main/resources/web.xml</webXml>
        <warSourceDirectory>war</warSourceDirectory>
        <!-- Exclude everything but urlrewrite JAR -->
        <warSourceExcludes>
            WEB-INF/lib/aop**,WEB-INF/lib/commons-**,WEB-INF/lib/gin-**,
            WEB-INF/lib/guice-**,WEB-INF/lib/gwt-**,WEB-INF/lib/gxt-**,
            WEB-INF/lib/junit-**
        </warSourceExcludes>
    </configuration>
</plugin>

I could exclude WEB-INF/lib/** and WEB-INF/classes/**, but in my particular project, we still want UrlRewrite in standalone mode, and we have some i18n properties files in WEB-INF/classes that are served up for Selenium tests.

With this configuration, we have a services WAR and a client WAR that can be installed and used by clients. To collapse them into one and make it possible to ship a single war, I turned to our product-integration-tests module. This module contains Selenium tests that test both types of distributions.

Merging 2 WARs into 1
The most important thing in the product-integration-tests module is that it creates a single WAR. First of all, it uses <packaging>war</packaging> to make this possible. The rest is done using the following 3 steps.

1. Its dependencies include the client and servlet WARs (and Selenium RC for testing).

<dependencies>
    <dependency>
        <groupId>com.company.app</groupId>
        <artifactId>product-services</artifactId>
        <version>1.0-SNAPSHOT</version>
        <type>war</type>
    </dependency>
    <dependency>
        <groupId>com.company.app</groupId>
        <artifactId>product-client</artifactId>
        <version>1.0-SNAPSHOT</version>
        <type>war</type>
    </dependency>
    <dependency>
        <groupId>org.seleniumhq.selenium.client-drivers</groupId>
        <artifactId>selenium-java-client-driver</artifactId>
        <version>1.0.1</version>
        <scope>test</scope>
    </dependency>
</dependencies>

2. The WAR created excludes the "integration-tests" part of the name:

<build>
    <finalName>product-${project.version}</finalName>
    ...
</build>

3. WAR overlays are configured so that everything in the client's WEB-INF directory is excluded from the merged WAR.

<plugin>
    <artifactId>maven-war-plugin</artifactId>
    <configuration>
        <!-- http://maven.apache.org/plugins/maven-war-plugin/overlays.html -->
        <overlays>
            <overlay>
                <groupId>com.company.app</groupId>
                <artifactId>product-services</artifactId>
                <excludes>
                    <!-- TODO: Rename to api.html (this is the Enunciate-generated documentation) -->
                    <exclude>index.html</exclude>
                </excludes>
            </overlay>
            <!-- No server needed in product-client -->
            <overlay>
                <groupId>com.company.app</groupId>
                <artifactId>product-client</artifactId>
                <excludes>
                    <exclude>WEB-INF/**</exclude>
                </excludes>
            </overlay>
            <!-- Only include META-INF/context.xml to set the ROOT path -->
            <overlay>
                <excludes>
                    <exclude>WEB-INF/**</exclude>
                </excludes>
            </overlay>
        </overlays>
    </configuration>
</plugin>

That's it! Using this configuration, it's possible to distribute a Maven-based SOFEA project as single or multiple WARs. However, there are some nuances.

One thing you might notice is the reference to META-INF/context.xml in the overlays configuration. This subtly highlights one issue I experienced when merging the WARs. In our GWT client, we're using URLs that point to our services at /product-services/*. This works in development (via a proxy servlet) and when the WARs are installed separately - as long as the services WAR is installed at /product-services. However, when they're merged, a little URL rewriting needs to happen. To do this, I added the UrlRewriteFilter to the product-services module and configured a simple rule.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE urlrewrite PUBLIC "-//tuckey.org//DTD UrlRewrite 3.0//EN"
        "http://tuckey.org/res/dtds/urlrewrite3.0.dtd">

<urlrewrite use-query-string="true">
    <!-- Used when services are merged into WAR with GWT client -->
    <rule>
        <from>^/product-services/(.*)$</from>
        <to type="forward">/$1</to>
    </rule>
</urlrewrite>

Because the services URLs point to the root (/product-services), the merged WAR has to be installed as the ROOT application. When you're using Cargo with Tomcat and want to deploy to ROOT, you have to have a META-INF/context.xml with a path="" reference (ref: CARGO-516).

<Context path=""/>

It is possible to change the URLs in the client to be relative, but this seems to get messy when you're using separate WARs. When using relative URLs, I found I had to use a cross-context forwarding solution to get the results I wanted. Using a redirect instead of a forward worked, but resulted in the client talking to the server twice (once to get redirected, a second time for the actual call). Cross-context forwarding is supported by the UrlRewriteFilter and Tomcat, but I'm not sure WebSphere or WebLogic support it. The best solution is probably to change the URLs dynamically at runtime, possibly using some sort of deferred binding technique.

Testing with Cargo and Selenium
Once I had everything merged, I wanted to configure Cargo and Selenium to allow testing both distribution types. If I installed all 3 wars at the same time, the "product-services" WAR would be used by both the product-client.war and the product.war, so I had to use profiles to allow installing the single merged WAR or both WARs. Below is the profile I used for starting Cargo, deploying the merged WAR, starting Selenium RC and running Selenium tests.

<properties>
    <cargo.container>tomcat6x</cargo.container>
    <cargo.container.url>
        http://archive.apache.org/dist/tomcat/tomcat-6/v6.0.20/bin/apache-tomcat-6.0.20.zip
    </cargo.container.url>
    <cargo.host>localhost</cargo.host>
    <cargo.port>23433</cargo.port>
    <cargo.wait>false</cargo.wait>
    <cargo.version>1.0</cargo.version>

    <!-- *safari and *iexplore are additional options -->
    <selenium.browser>*firefox</selenium.browser>
</properties>
...
<profile>
    <id>itest-bamboo</id>
    <activation>
        <activeByDefault>false</activeByDefault>
    </activation>
    <build>
        <plugins>
            <plugin>
                <groupId>org.codehaus.cargo</groupId>
                <artifactId>cargo-maven2-plugin</artifactId>
                <version>${cargo.version}</version>
                <configuration>
                    <wait>${cargo.wait}</wait>
                    <container>
                        <containerId>${cargo.container}</containerId>
                        <log>${project.build.directory}/${cargo.container}/cargo.log</log>
                        <zipUrlInstaller>
                            <url>${cargo.container.url}</url>
                            <installDir>${installDir}</installDir>
                        </zipUrlInstaller>
                    </container>
                    <configuration>
                        <home>${project.build.directory}/${cargo.container}/container</home>
                        <properties>
                            <cargo.hostname>${cargo.host}</cargo.hostname>
                            <cargo.servlet.port>${cargo.port}</cargo.servlet.port>
                        </properties>
                        <!-- Deploy as ROOT since XHR requests are made to /product-services -->
                        <deployables>
                            <deployable>
                                <properties>
                                    <context>ROOT</context>
                                </properties>
                            </deployable>
                        </deployables>
                    </configuration>
                </configuration>
                <executions>
                    <execution>
                        <id>start-container</id>
                        <phase>pre-integration-test</phase>
                        <goals>
                            <goal>start</goal>
                        </goals>
                    </execution>
                    <execution>
                        <id>stop-container</id>
                        <phase>post-integration-test</phase>
                        <goals>
                            <goal>stop</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>selenium-maven-plugin</artifactId>
                <version>1.0</version>
                <executions>
                    <execution>
                        <phase>pre-integration-test</phase>
                        <goals>
                            <goal>start-server</goal>
                        </goals>
                        <configuration>
                            <background>true</background>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <artifactId>maven-surefire-plugin</artifactId>
                <executions>
                    <execution>
                        <phase>integration-test</phase>
                        <goals>
                            <goal>test</goal>
                        </goals>
                        <configuration>
                            <excludes>
                                <exclude>none</exclude>
                            </excludes>
                            <includes>
                                <include>**/*SeleniumTest.java</include>
                            </includes>
                            <systemProperties>
                                <property>
                                    <name>selenium.browser</name>
                                    <value>${selenium.browser}</value>
                                </property>
                                <property>
                                    <name>cargo.port</name>
                                    <value>${cargo.port}</value>
                                </property>
                            </systemProperties>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>

This profile is run by our Bamboo nightly tests with mvn install -Pitest-bamboo. The 2nd profile I added doesn't install the project's WAR, but instead installs the two separate WARs. Running mvn install -Pitest-bamboo,multiple-wars executes the Selenium tests against the multi-WAR distribution.

<profile>
    <id>multiple-wars</id>
    <activation>
        <activeByDefault>false</activeByDefault>
    </activation>
    <build>
        <plugins>
            <plugin>
                <groupId>org.codehaus.cargo</groupId>
                <artifactId>cargo-maven2-plugin</artifactId>
                <version>${cargo.version}</version>
                <configuration>
                    <configuration>
                        <home>${project.build.directory}/${cargo.container}/container</home>
                        <properties>
                            <cargo.hostname>${cargo.host}</cargo.hostname>
                            <cargo.servlet.port>${cargo.port}</cargo.servlet.port>
                        </properties>
                        <deployables>
                            <deployable>
                                <groupId>com.company.app</groupId>
                                <artifactId>product-client</artifactId>
                                <pingURL>http://${cargo.host}:${cargo.port}/product-client/index.html</pingURL>
                                <type>war</type>
                                <properties>
                                    <context>/product-client</context>
                                </properties>
                            </deployable>
                            <deployable>
                                <groupId>com.company.app</groupId>
                                <artifactId>product-services</artifactId>
                                <pingURL>
http://${cargo.host}:${cargo.port}/product-services/index.jspx
                                </pingURL>
                                <type>war</type>
                                <properties>
                                    <context>/product-services</context>
                                </properties>
                            </deployable>
                        </deployables>
                    </configuration>
                </configuration>
            </plugin>
        </plugins>
    </build>
</profile>

I won't be including much information on authoring Selenium tests because there are already many good references out there; I encourage you to check them out if you're looking for Selenium testing techniques.
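That said, for completeness, here's a rough sketch (not from the project) of the shape a *SeleniumTest class picked up by the surefire configuration above might take. It assumes JUnit 4 and the Selenium RC Java client are on the test classpath, reads the selenium.browser and cargo.port system properties passed in by the itest-bamboo profile, and the "Welcome" assertion is invented for the example.

import com.thoughtworks.selenium.DefaultSelenium;
import com.thoughtworks.selenium.Selenium;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import static org.junit.Assert.assertTrue;

public class HomePageSeleniumTest {

    private Selenium selenium;

    @Before
    public void startBrowser() {
        // Browser and port come from the system properties set by the profile;
        // fall back to reasonable defaults when running locally.
        String browser = System.getProperty("selenium.browser", "*firefox");
        String port = System.getProperty("cargo.port", "23433");
        // The selenium-maven-plugin's start-server goal listens on 4444 by default.
        selenium = new DefaultSelenium("localhost", 4444, browser,
                "http://localhost:" + port + "/");
        selenium.start();
    }

    @Test
    public void homePageLoads() {
        selenium.open("/");
        selenium.waitForPageToLoad("30000");
        // "Welcome" is a placeholder; assert on whatever the real page renders.
        assertTrue(selenium.isTextPresent("Welcome"));
    }

    @After
    public void stopBrowser() {
        selenium.stop();
    }
}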

Summary
This article has shown you how I used Maven, WAR overlays and the UrlRewriteFilter to create different distributions of a SOFEA application. I'm still not sure which packaging mechanism (1 WAR vs. 2) is best, but it's nice to know there are options. If you package and distribute SOFEA applications, I'd love to hear about your experience in this area.

Posted in Java at Oct 06 2009, 01:17:38 AM MDT