

Testing GWT Libraries with Selenium and Maven

On Tuesday, I wrote about Running Hosted Mode in GWT Libraries. Today I added an additional module to our project to run Selenium tests against our GWT library. In the process, I discovered some things I needed to modify in my GWT library's pom.xml. I'm writing this post so others can use this setup to write GWT libraries and package them for testing with Selenium.

First of all, I noticed that when you're using the GWT Maven Plugin with a JAR project, it doesn't automatically run gwt:compile or gwt:test in the compile and test phases. I had to explicitly configure the compile goal to run in the compile phase. I also had to add <webappDirectory> to the configuration to compile the JavaScript files into the war directory.

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>gwt-maven-plugin</artifactId>
    <version>1.1</version>
    <configuration>
        <module>org.appfuse.gwt.core.CoreUI</module>
        <runTarget>index.html</runTarget>
        <webappDirectory>war</webappDirectory>
    </configuration>
    <executions>
        <execution>
            <phase>compile</phase>
            <goals>
                <goal>compile</goal>
            </goals>
        </execution>
    </executions>
</plugin>

To package the generated JavaScript and index.html in the JAR, I added the following <resources> section to the maven-resources-plugin configuration I mentioned in my previous post.

<resource>
    <directory>war</directory>
    <includes>
        <include>core.ui/**</include>
        <include>index.html</include>
    </includes>
</resource>

In addition, I discovered some javax.servlet.* classes in my JAR after running "mvn package". I believe this is caused by the GWT plugin sucking these in when it compiles my ProxyServlet. I excluded them by adding the maven-jar-plugin.

<plugin>
    <artifactId>maven-jar-plugin</artifactId>
    <configuration>
        <excludes>
            <exclude>javax/servlet/**</exclude>
        </excludes>
    </configuration>
</plugin>

After doing this, I was able to publish my JAR with all the contents I needed to run Selenium tests against it.

Testing the GWT Library with Selenium
The module that contains the Selenium tests is a WAR project that uses war overlays, Cargo and Selenium RC. You can read about the Maven setup I use for running Selenium tests in Packaging a SOFEA Application for Distribution.

The major difference when testing a JAR (vs. a WAR) is that I had to use the maven-dependency-plugin to unpack the JAR so its contents get included in the WAR for testing. Below is the configuration I used to accomplish this:

<plugin>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <id>unpack</id>
            <phase>generate-sources</phase>
            <goals>
                <goal>unpack</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <artifactItem>
                        <groupId>org.appfuse</groupId>
                        <artifactId>gwt-core</artifactId>
                        <version>1.0-SNAPSHOT</version>
                        <type>jar</type>
                        <overWrite>false</overWrite>
                        <excludes>META-INF/**,org/**,javax/**</excludes>
                    </artifactItem>
                </artifactItems>
                <outputDirectory>
                    ${project.build.directory}/${project.build.finalName}
                </outputDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>

Hopefully this will help you develop GWT libraries and run Selenium tests against them. If you have any suggestions for simplifying this configuration, please let me know.

NOTE: I did consider a couple of other options for running Selenium tests against our GWT library:

  1. Add something to the existing project that 1) creates a WAR and 2) fires up Cargo/Selenium in a profile to test it.
  2. Create the tests in a GWT (war) project that includes widgets from the library.

I decided on the solution documented above because it seemed like the best option.

Posted in Java at Nov 04 2009, 10:09:27 PM MST 2 Comments

A Letter to the AppFuse Community

The last AppFuse release was way back in May 2008. Ever since, many folks have asked when the next release would be. Often, I've said "sometime this quarter", but obviously, that's never happened. For that, I apologize.

There are many reasons I haven't worked on AppFuse for the past 18 months, but it mostly comes down to the fact that I didn't make time for it. The good news is I'm working on it again and will have a release out sometime this month. Unfortunately, it probably won't be a 2.1 final release, but so many things have changed that I feel a milestone release is a good idea. Here's a brief summary of changes so far:

  • Changed archetypes to include all source and tests for the "webapp" portion of the application. No more warpath plugin, merging wars and IDE issues. Using "mvn jetty:run" should work as expected.
  • Moved from Spring XML to Annotations.
  • AppFuse Light converted to Maven modules and now depends on AppFuse's backend.
  • Published an easier-to-use archetype selection form in the QuickStart Guide.
  • Published archetype selection form for AppFuse Light. I do plan on combining these forms as soon as I figure out the best UI and instructions for users to choose AppFuse or AppFuse Light.
  • Upgraded all libraries to latest released versions (Spring 3 hasn't had a final release yet).
  • Upgraded to Tapestry 5 thanks to Serge Eby. I still need to complete tests and code generation for tests.
  • Added Compass support thanks to a patch from Shay Banon.
  • Upgraded from XFire to CXF for Web Services.
  • Moved Maven repository to Sonatype's OSS Repository Hosting for snapshots and releasing to Maven Central. There are no longer any AppFuse-specific artifacts; all are available in Maven Central.

I realize there are many full-stack frameworks that do the same thing as AppFuse with less code. Examples include Ruby on Rails, Grails, Seam, Spring Roo and the Play framework. However, there seem to be quite a few folks who continue to use AppFuse, and it still serves the community as a nice example of how to integrate frameworks. Furthermore, it helps me keep up with the latest framework releases, their quirks and the issues that happen when you try to integrate them. In short, working on it helps me stay up to speed with Java open source frameworks.

For those folks that like the 1.x, Ant-based version of AppFuse, there will not be a 1.9.5 release. I know I promised it for years, but it's simply something I will not use, so I'd rather not invest my time in it. I'm sorry for lying to those that expected it.

So what's the future of AppFuse? Will it continue to integrate web frameworks with Spring and popular persistence frameworks? Possibly, but it seems more logical to align it with the types of Ajax + REST applications I'm creating these days. I'm currently thinking AppFuse 3.0 would be nice as a RESTful backend with GWT and Flex UIs. I might create the backend with CXF, but it's possible I'd use one of the frameworks mentioned above and simply leverage it to create the default features AppFuse users have come to expect.

More than anything, I'm writing this letter to let you know that the AppFuse project is not dead and you can expect a release in the near future.

Thanks for your support,

Matt

Posted in Java at Nov 04 2009, 12:17:17 AM MST 44 Comments

Running Hosted Mode in GWT Libraries (when using Maven)

Earlier this year, I wrote about Modularizing GWT Applications with GWT-Maven. Fast forward 8 months and I'm still working with GWT and using this same technique. However, this time I'm working with the Maven GWT Plugin from Codehaus. In my last post, I wrote:

The results of modularizing your application are beneficial (shared code) and detrimental (you have to mvn install gwt-core whenever you make changes in shared classes). If you know of a way to configure the gwt-maven plugin to read sources from both gwt-core and gwt-webapp in hosted mode, I'd love to hear about it.

The good news is I found a solution for this, using the Build Helper Maven Plugin. The GWT Maven Plugin's Productivity tip for multi-project setup has more information on how to configure this (note: we use IntelliJ and Eclipse on my project and did not need to configure this in a profile).

All was fine and dandy with this configuration until I wanted to be able to run hosted mode to develop/test everything in my library before including it in my main project. Luckily, you can still run mvn gwt:run on a JAR project. However, when you configure your pom.xml so sources are included in your JAR, you run into an issue: your *.java files will be copied to war/WEB-INF/classes and hosted mode will use these files as source rather than the ones you're editing in src/main/java.

To solve this, I changed my pom.xml to do two things:

  • Only copy resources right before packaging (in the test phase).
  • When packaging is complete, delete the *.java files from war/WEB-INF/classes (using Ant).

Below is the XML I used to make this possible. Please let me know if you have a way to simplify this configuration.

<plugin>
    <artifactId>maven-resources-plugin</artifactId>
    <version>2.4.1</version>
    <executions>
        <execution>
            <phase>test</phase>
            <goals>
                <goal>copy-resources</goal>
            </goals>
            <configuration>
                <outputDirectory>${project.build.outputDirectory}</outputDirectory>
                <resources>
                    <resource>
                        <directory>src/main/java</directory>
                    </resource>
                    <resource>
                        <directory>src/main/resources</directory>
                    </resource>
                </resources>
            </configuration>
        </execution>
    </executions>
</plugin>
<plugin>
    <artifactId>maven-antrun-plugin</artifactId>
    <version>1.3</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>run</goal>
            </goals>
            <configuration>
                <tasks>
                    <delete>
                        <fileset dir="${project.build.outputDirectory}" includes="**/*.java"/>
                    </delete>
                </tasks>
            </configuration>
        </execution>
    </executions>
</plugin>

This solution seems to work pretty well. As far as developing your library in hosted mode, you'll need to configure two *.gwt.xml files, one that doesn't have an <entry-point> defined and one that does. Configure the one with the entry point as the <module> in your gwt-maven-plugin configuration.

As a side note, I found a few issues with the 1.1 version of the Maven GWT Archetype. Below are the steps I used to fix these issues and upgrade to GWT 1.7.0 (I realize 1.7.1 is out, but gwt-dev-1.7.1-mac.jar doesn't exist in Maven central).

First, create a new project by running the following from the command line:

mvn archetype:generate \
  -DarchetypeGroupId=org.codehaus.mojo \
  -DarchetypeArtifactId=gwt-maven-plugin \
  -DarchetypeVersion=1.1 \
  -DgroupId=com.yourcompany \
  -DartifactId=gwt-project -Dversion=1.0-SNAPSHOT -B

After creating the project, you'll need to modify the pom.xml as follows:

  1. Change the gwt-maven-plugin's version to 1.1.
  2. Change the ${gwtVersion} property to 1.7.0.
  3. Add <runTarget>Application.html</runTarget> to the <configuration> element of the plugin.
  4. Move Application.html and web.xml so they're under the "war" directory.
  5. Update Application.html to prepend the GWT module name in the <script> tag.

I hope these instructions help you create modular GWT projects with Maven. This setup is working great on my current project.

Posted in Java at Nov 03 2009, 09:37:07 AM MST 3 Comments

Developing and Testing GWT Client Services

Earlier this week, Hiram Chirino released RestyGWT, a GWT generator for REST services and JSON encoded data transfer objects. You can read more about it in Hiram's post RestyGWT, a Better GWT RPC??. First of all, I'm impressed with RestyGWT because it provides something I've always wanted with GWT: the ability to call RESTful services and get a populated POJO in your callback, much like AsyncCallback provides for RPC services.

RestyGWT also allows you to easily create services using only interfaces and JAX-RS annotations. For example:

import javax.ws.rs.POST;
...
public interface PizzaService extends RestService {
    @POST
    public void order(PizzaOrder request, MethodCallback<OrderConfirmation> callback);
}

After taking a brief look at RestyGWT, I thought it'd be interesting to share how I develop and test GWT client services.

Developing GWT Client Services
Writing services in a GWT application can be helpful when you're using MVP, especially since you can EasyMock them in a test. On my GWT projects, I've often used overlay types because they allow me to write less code and they make parsing JSON super simple. However, I've had issues testing my presenters when using overlay types. The good news is I think I've figured out a reasonable solution, but it does require using GWTTestCase. If RestyGWT supported overlay types, there's a good chance I'd use it, especially since its integration tests seem to require GWTTestCase too.

Rather than using callbacks in my presenters, I try to only use them in my service implementations. That way, my presenters don't have to worry about overlay types and can be tested in a JUnit-only fashion. The callbacks in my services handle JSON parsing/object population and fire events with the populated objects.

GWT's RequestBuilder is one option for communicating with RESTful services. The Development Guide for HTTP Requests explains how to use this class. To simplify REST requests and allow multiple callbacks, I'm using a RestRequest class and a number of other utility classes that make up a small GWT REST framework (created by a former colleague). RestRequest wraps RequestBuilder and provides a fluent API for executing HTTP requests. Another class, Deferred, is a GWT implementation of Twisted's Deferred.

As part of my service implementation, I inject an EventBus (with GIN) into the constructor and then proceed to implement callbacks that fire Events to indicate loading, saving and deleting has succeeded. Here's an example service:

public class ConversationServiceImpl implements ConversationService {
    private EventBus eventBus;

    @Inject
    public ConversationServiceImpl(EventBus eventBus) {
        this.eventBus = eventBus;
    }

    public void getConversation(String name) {
        Deferred<Representation> d =
                RestRequest.get(URLs.CONVERSATION + "/" + URL.encode(name)).build();

        d.addCallback(new Callback<Representation>() {
            public void onSuccess(Representation result) {
                Conversation conversation = convertResultToConversation(result);
                eventBus.fireEvent(new ResourceLoadedEvent<Conversation>(conversation));
            }
        });

        d.run();
    }

    public void saveConversation(Conversation conversation) {
        Deferred<Representation> d = RestRequest.post(URLs.CONVERSATION)
                .setRequestData(conversation.toJson()).build();
        
        d.addCallback(new Callback<Representation>() {
            public void onSuccess(Representation result) {
                Conversation conversation = convertResultToConversation(result);
                eventBus.fireEvent(new ResourceSavedEvent<Conversation>(conversation));
            }
        });

        d.run();
    }

    public void deleteConversation(Long id) {
        Deferred<Representation> d =
                RestRequest.post(URLs.CONVERSATION + "/" + id).build();

        d.addCallback(new Callback<Representation>() {
            public void onSuccess(Representation result) {
                eventBus.fireEvent(new ResourceDeletedEvent());
            }
        });

        d.run();
    }

    /**
     * Convenience method to populate object in one location
     *
     * @param result the result of a resource request.
     * @return the populated object.
     */
    private Conversation convertResultToConversation(Representation result) {
        JSOModel model = JSOModel.fromJson(result.getData());
        return new Conversation(model);
    }
}

In the saveConversation() method, you'll notice the conversation.toJson() method call. This method uses a JSON class that loops through an object's properties and constructs a JSON String.

public JSON toJson() {
    return new JSON(getMap());
}
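
The JSON class itself isn't shown here, but the idea is simply to walk the object's property map and emit a JSON string. Here's a rough sketch of that approach (the Map-based constructor and escape() helper are assumptions for illustration, not the actual class):

import java.util.Map;

// Sketch of a JSON helper that builds a JSON string from an object's property map.
public class JSON {
    private final Map<String, String> properties;

    public JSON(Map<String, String> properties) {
        this.properties = properties;
    }

    @Override
    public String toString() {
        StringBuilder sb = new StringBuilder("{");
        boolean first = true;
        for (Map.Entry<String, String> entry : properties.entrySet()) {
            if (!first) {
                sb.append(",");
            }
            // every property is written as a quoted key/value pair
            sb.append("\"").append(entry.getKey()).append("\":")
              .append("\"").append(escape(entry.getValue())).append("\"");
            first = false;
        }
        return sb.append("}").toString();
    }

    // minimal escaping; a real implementation would handle all JSON escape sequences
    private String escape(String value) {
        return value.replace("\\", "\\\\").replace("\"", "\\\"");
    }
}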

Testing Services
In my experience, the hardest part about using overlay types is writing your objects so they get populated correctly. I've found that writing tests which read JSON from a file can be a great productivity boost. However, because of overlay types, you have to write a test that extends GWTTestCase. When using GWTTestCase, you can't simply read from the filesystem. The good news is there is a workaround where you can subclass GWTShellServlet and overwrite GWT's web.xml to have your own servlet that can read from the filesystem. A detailed explanation of how to do this was written by Alex Moffat in Implementing a -noserver flag for GWTTestCase.

Once this class is in place, I've found you can easily write services using TDD and the server doesn't even have to exist. When constructing services, I've found the following workflow to be the most productive:

  1. Create a file with the expected JSON in src/test/resources/resource.json where resource matches the last part of the URL for your service.
  2. Create a *ServiceGwtTest.java and write tests.
  3. Run tests to make sure they fail.
  4. Implement the service and run tests to ensure JSON is getting consumed/produced properly to/from model objects.

Below is the code for my JsonReaderServlet.java:

public class JsonReaderServlet extends GWTShellServlet {

    public void service(ServletRequest servletRequest, ServletResponse servletResponse)
            throws ServletException, IOException {

        HttpServletRequest req = (HttpServletRequest) servletRequest;
        HttpServletResponse resp = (HttpServletResponse) servletResponse;

        String uri = req.getRequestURI();
        if (req.getQueryString() != null) {
            uri += "?" + req.getQueryString();
        }

        if (uri.contains("/services")) {
            String method = req.getMethod();
            String output;

            if (method.equalsIgnoreCase("get")) {
                // use the part after the last slash as the filename
                String filename = uri.substring(uri.lastIndexOf("/") + 1, uri.length()) + ".json";
                System.out.println("loading: " + filename);
                String json = readFileAsString("/" + filename);
                System.out.println("loaded json: " + json);
                output = json;
            } else {
                // for posts, return the same body content
                output = getBody(req);
            }

            PrintWriter out = resp.getWriter();
            out.write(output);
            out.close();

            resp.setStatus(HttpServletResponse.SC_OK);
        } else {
            super.service(servletRequest, servletResponse);
        }
    }

    private String readFileAsString(String filePath) throws IOException {
        filePath = getClass().getResource(filePath).getFile();
        BufferedReader reader = new BufferedReader(new FileReader(filePath));
        return getStringFromReader(reader);
    }

    private String getBody(ServletRequest request) throws IOException {
        BufferedReader reader = new BufferedReader(new InputStreamReader(request.getInputStream()));
        return getStringFromReader(reader);
    }

    private String getStringFromReader(Reader reader) throws IOException {
        StringBuilder sb = new StringBuilder();
        char[] buf = new char[1024];
        int numRead;
        while ((numRead = reader.read(buf)) != -1) {
            sb.append(buf, 0, numRead);
        }
        reader.close();
        return sb.toString();
    }
}

This servlet is mapped to <url-pattern>/*</url-pattern> in a web.xml file in src/test/resources/com/google/gwt/dev/etc/tomcat/webapps/ROOT/WEB-INF.

My Service Test starts by getting an EventBus from GIN and registering itself to handle the fired events.

public class ConversationServiceGwtTest extends AbstractGwtTestCase
        implements ResourceLoadedEvent.Handler, ResourceSavedEvent.Handler, ResourceDeletedEvent.Handler {
    ConversationService service;
    ResourceLoadedEvent<Conversation> loadedEvent;
    ResourceSavedEvent<Conversation> savedEvent;
    ResourceDeletedEvent deletedEvent;

    @Override
    public void gwtSetUp() throws Exception {
        super.gwtSetUp();
        DesigntimeGinjector injector = GWT.create(MyGinjector.class);
        EventBus eventBus = injector.getEventBus();
        service = new ConversationServiceImpl(eventBus);
        eventBus.addHandler(ResourceLoadedEvent.TYPE, this);
        eventBus.addHandler(ResourceSavedEvent.TYPE, this);
        eventBus.addHandler(ResourceDeletedEvent.TYPE, this);
    }

    @SuppressWarnings("unchecked")
    public void onLoad(ResourceLoadedEvent event) {
        this.loadedEvent = event;
    }

    @SuppressWarnings("unchecked")
    public void onSave(ResourceSavedEvent event) {
        this.savedEvent = event;
    }

    public void onDelete(ResourceDeletedEvent event) {
        this.deletedEvent = event;
    }
}

After this groundwork has been done, a test can be written that loads up the JSON file and verifies the objects are populated correctly.

public void testGetConversation() {

    service.getConversation("test-conversation");

    Timer t = new Timer() {
        public void run() {
            assertNotNull("ResourceLoadedEvent not received", loadedEvent);
            Conversation conversation = loadedEvent.getResource();
            assertEquals("Conversation name is incorrect","Test Conversation", conversation.getName());

            assertNotNull("Conversation has no channel", conversation.getChannel());
            assertEquals("Conversation has incorrect task size", 3, conversation.getTasks().size());

            convertToAndFromJson(conversation);
            finishTest();
        }
    };

    delayTestFinish(3000);
    t.schedule(100);
}

private void convertToAndFromJson(Conversation fromJsonModel) {
    Representation json = fromJsonModel.toJson();
    assertNotNull("Cannot convert empty JSON", json.getData());

    // change back into model
    JSOModel data = JSOModel.fromJson(json.getData());
    Conversation toJsonModel = new Conversation(data);
    verifyModelBuiltCorrectly(toJsonModel);
}

private void verifyModelBuiltCorrectly(Conversation model) {
    assertEquals("Conversation name is incorrect", "Test Conversation", model.getString("name"));
    assertEquals("Conversation has incorrect task size", 3, model.getTasks().size());
    assertEquals("Conversation channel is incorrect", "Web", model.getChannel().getString("type"));
}

For more information on the usage of the Timer, finishTest() and delayTestFinish(), see GWTTestCase's javadoc.

The tests for saving and deleting a resource look as follows:

public void testSaveConversation() {
    Conversation conversation = new Conversation().setName("Test").setId("1");

    List<Task> tasks = new ArrayList<Task>();
    for (int i = 1; i < 4; i++) {
        tasks.add(new Task().setName("Task " + i));
    }
    conversation.setTasks(tasks);

    System.out.println("conversation.toJson(): " + conversation.toJson());

    assertTrue(conversation.toJson().toString().contains("Task 1"));

    service.saveConversation(conversation);

    Timer t = new Timer() {
        public void run() {
            assertNotNull("ResourceSavedEvent not received", savedEvent);
            finishTest();
        }
    };

    delayTestFinish(3000);
    t.schedule(100);
}
  
public void testDeleteConversation() {
    service.deleteConversation(1L);

    Timer t = new Timer() {
        public void run() {
            assertNotNull("ResourceDeletedEvent not received", deletedEvent);
            finishTest();
        }
    };

    delayTestFinish(3000);
    t.schedule(100);
}

Summary
This article has shown you how I develop and test GWT Client Services. If RestyGWT supported overlay types, there's a good chance I could change my service implementation to use it and I wouldn't have to change my test. Robert Cooper, author of GWT in Practice, claims he has a framework that does this. Here's to hoping this article stimulates the GWT ecosystem and we get a GWT REST framework that's as easy to use as GWT RPC.

Update: Today I enhanced this code to use Generics-based classes (inspired by Don't repeat the DAO!) for the boiler-plate CRUD code in a service. In a nutshell, a service interface can now be written as:

public interface FooService extends GenericService<Foo, String> {
 
}

The implementation class is responsible for the URL and converting the JSON result to an object:

public class FooServiceImpl extends GenericServiceImpl<Foo, String> implements FooService {

    @Inject
    public FooServiceImpl(EventBus eventBus) {
        super(eventBus, "/services/foo");
    }

    @Override
    protected Foo convertResultToModel(Representation result) {
        return new Foo(JSOModel.fromJson(result.getData()));
    }
}
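
For context, here's a rough sketch of what such generic parent classes might look like, pieced together from the ConversationServiceImpl and FooServiceImpl examples above (the real GenericService/GenericServiceImpl may differ in naming and details):

// Sketch only: reuses the RestRequest/Deferred/Representation API shown earlier.
public interface GenericService<T, ID> {
    void get(ID id);
    void save(T model);
    void delete(ID id);
}

public abstract class GenericServiceImpl<T, ID> implements GenericService<T, ID> {
    private final EventBus eventBus;
    private final String url;

    public GenericServiceImpl(EventBus eventBus, String url) {
        this.eventBus = eventBus;
        this.url = url;
    }

    public void get(ID id) {
        Deferred<Representation> d = RestRequest.get(url + "/" + id).build();
        d.addCallback(new Callback<Representation>() {
            public void onSuccess(Representation result) {
                eventBus.fireEvent(new ResourceLoadedEvent<T>(convertResultToModel(result)));
            }
        });
        d.run();
    }

    // save() and delete() would follow the same pattern, firing ResourceSavedEvent
    // and ResourceDeletedEvent respectively.

    /** Subclasses know how to turn a JSON Representation into their model type. */
    protected abstract T convertResultToModel(Representation result);
}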

I'm sure this can be further enhanced to get rid of the need to create classes altogether, possibly leveraging GIN or some sort of factory. The parent classes referenced in this code can be viewed at the following URLs:

There's also a GenericServiceGwtTest.java that proves it all works as expected.

Posted in Java at Oct 21 2009, 06:55:17 AM MDT 6 Comments

Packaging a SOFEA Application for Distribution

The project I'm working on is a bit different from those I'm used to. I'm used to working on web applications that are hosted on servers and customers access with their browser. SaaS if you will. My current client is different. They're a product company that sells applications and distributes them to customers via download and CD. Their customers install these applications on internal servers (supported servers include WebSphere, WebLogic and Tomcat).

The product I'm currently working on is structured as a SOFEA application and therefore consists of two separate modules - a backend and a frontend. Since it's installed in a servlet container, both modules are WARs and can be installed separately.

Building the backend and frontend as separate projects makes a lot of sense for two reasons:

  • In development, different teams can work on the frontend and backend projects.
  • Having them as separate projects allows them to be versioned separately.

However, having them as two separate projects does make it a bit more difficult for distribution. I'm writing this post to show you how I recently added support for distributing our application as 2 WARs or 1 WAR using the power of Maven, war overlays and the UrlRewriteFilter.

Project Setup
First of all, we have several different Maven modules, but the most important ones are as follows:

  • product-services
  • product-client
  • product-integration-tests

Of course, our modules aren't really named "product", but you get the point. The services project is really just a WAR project with Spring Security configured. It depends on the other JAR modules where the services actually live. The client project is a GWT WAR that has a proxy servlet defined in its web.xml to make development easier. It also contains some UrlRewrite configuration that allows GWT Log's Remote Logging feature to work. The proxy servlet is something we don't want to ship with our product, so we have a separate web.xml for production vs. development. We do the substitution using the maven-war-plugin:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-war-plugin</artifactId>
    <version>2.0.2</version>
    <configuration>
        <!-- Production web.xml -->
        <webXml>src/main/resources/web.xml</webXml>
        <warSourceDirectory>war</warSourceDirectory>
        <!-- Exclude everything but urlrewrite JAR -->
        <warSourceExcludes>
            WEB-INF/lib/aop**,WEB-INF/lib/commons-**,WEB-INF/lib/gin-**,
            WEB-INF/lib/guice-**,WEB-INF/lib/gwt-**,WEB-INF/lib/gxt-**,
            WEB-INF/lib/junit-**
        </warSourceExcludes>
    </configuration>
</plugin>

I could exclude WEB-INF/lib/** and WEB-INF/classes/**, but in my particular project, we still want UrlRewrite in standalone mode, and we have some i18n properties files in WEB-INF/classes that are served up for Selenium tests.

With this configuration, we have a services WAR and a client WAR that can be installed and used by clients. To collapse them into one and make it possible to ship a single war, I turned to our product-integration-tests module. This module contains Selenium tests that test both types of distributions.

Merging 2 WARs into 1
The most important thing in the product-integration-tests module is that it creates a single WAR. First of all, it uses <packaging>war</packaging> to make this possible. The rest is done using the following 3 steps.

1. Its dependencies include the client and services WARs (and Selenium RC for testing).

<dependencies>
    <dependency>
        <groupId>com.company.app</groupId>
        <artifactId>product-services</artifactId>
        <version>1.0-SNAPSHOT</version>
        <type>war</type>
    </dependency>
    <dependency>
        <groupId>com.company.app</groupId>
        <artifactId>product-client</artifactId>
        <version>1.0-SNAPSHOT</version>
        <type>war</type>
    </dependency>
    <dependency>
        <groupId>org.seleniumhq.selenium.client-drivers</groupId>
        <artifactId>selenium-java-client-driver</artifactId>
        <version>1.0.1</version>
        <scope>test</scope>
    </dependency>
</dependencies>

2. The WAR created excludes the "integration-tests" part of the name:

<build>
    <finalName>product-${project.version}</finalName>
    ...
</build>

3. WAR overlays are configured so that everything in the client's WEB-INF directory is excluded from the merged WAR.

<plugin>
    <artifactId>maven-war-plugin</artifactId>
    <configuration>
        <!-- http://maven.apache.org/plugins/maven-war-plugin/overlays.html -->
        <overlays>
            <overlay>
                <groupId>com.company.app</groupId>
                <artifactId>product-services</artifactId>
                <excludes>
                    <!-- TODO: Rename to api.html (this is the Enunciate-generated documentation) -->
                    <exclude>index.html</exclude>
                </excludes>
            </overlay>
            <!-- No server needed in product-client -->
            <overlay>
                <groupId>com.company.app</groupId>
                <artifactId>product-client</artifactId>
                <excludes>
                    <exclude>WEB-INF/**</exclude>
                </excludes>
            </overlay>
            <!-- Only include META-INF/context.xml to set the ROOT path -->
            <overlay>
                <excludes>
                    <exclude>WEB-INF/**</exclude>
                </excludes>
            </overlay>
        </overlays>
    </configuration>
</plugin>

That's it! Using this configuration, it's possible to distribute a Maven-based SOFEA project as single or multiple WARs. However, there are some nuances.

One thing you might notice is the reference to META-INF/context.xml in the overlays configuration. This subtly highlights one issue I experienced when merging the WARs. In our GWT client, we're using URLs that point to our services at /product-services/*. This works in development (via a proxy servlet) and when the WARs are installed separately - as long as the services WAR is installed at /product-services. However, when they're merged, a little URL rewriting needs to happen. To do this, I added the UrlRewriteFilter to the product-services module and configured a simple rule.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE urlrewrite PUBLIC "-//tuckey.org//DTD UrlRewrite 3.0//EN"
        "http://tuckey.org/res/dtds/urlrewrite3.0.dtd">

<urlrewrite use-query-string="true">
    <!-- Used when services are merged into WAR with GWT client -->
    <rule>
        <from>^/product-services/(.*)$</from>
        <to type="forward">/$1</to>
    </rule>
</urlrewrite>

Because the services URLs are root-relative (/product-services/*), the merged WAR has to be installed as the ROOT application. When you're using Cargo with Tomcat and want to deploy to ROOT, you have to have a META-INF/context.xml with a path="" reference (ref: CARGO-516).

<Context path=""/>

It is possible to change the URLs in the client to be relative, but this seems to get messy when you're using separate WARs. When using relative URLs, I found I had to use a solution based on cross-context forwarding to get the results I wanted. Using a redirect instead of a forward worked, but resulted in the client talking to the server twice (once to get redirected, a second time for the actual call). Cross-context forwarding is supported by the UrlRewriteFilter and Tomcat, but I'm not sure WebSphere or WebLogic support it. The best solution is probably to change the URLs dynamically at runtime, possibly using some sort of deferred binding technique.
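
For what it's worth, one simple runtime approach (short of true deferred binding) is to let the host page tell the client where the services live. Below is only a sketch under that assumption: it uses GWT's Dictionary class and assumes the host page defines a JavaScript object named appConfig with a servicesUrl property.

import java.util.MissingResourceException;

import com.google.gwt.i18n.client.Dictionary;

// Sketch: read the services base URL from the host page at runtime instead of
// hard-coding "/product-services" in the client.
public class ServiceUrls {

    public static String servicesBase() {
        try {
            // the host page would define: var appConfig = { servicesUrl: "/product-services" };
            return Dictionary.getDictionary("appConfig").get("servicesUrl");
        } catch (MissingResourceException e) {
            // fall back to the default used by the two-WAR install
            return "/product-services";
        }
    }
}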

Testing with Cargo and Selenium
Once I had everything merged, I wanted to configure Cargo and Selenium to allow testing both distribution types. If I installed all 3 wars at the same time, the "product-services" WAR would be used by both the product-client.war and the product.war, so I had to use profiles to allow installing the single merged WAR or both WARs. Below is the profile I used for starting Cargo, deploying the merged WAR, starting Selenium RC and running Selenium tests.

<properties>
    <cargo.container>tomcat6x</cargo.container>
    <cargo.container.url>
        http://archive.apache.org/dist/tomcat/tomcat-6/v6.0.20/bin/apache-tomcat-6.0.20.zip
    </cargo.container.url>
    <cargo.host>localhost</cargo.host>
    <cargo.port>23433</cargo.port>
    <cargo.wait>false</cargo.wait>
    <cargo.version>1.0</cargo.version>

    <!-- *safari and *iexplore are additional options -->
    <selenium.browser>*firefox</selenium.browser>
</properties>
...
<profile>
    <id>itest-bamboo</id>
    <activation>
        <activeByDefault>false</activeByDefault>
    </activation>
    <build>
        <plugins>
            <plugin>
                <groupId>org.codehaus.cargo</groupId>
                <artifactId>cargo-maven2-plugin</artifactId>
                <version>${cargo.version}</version>
                <configuration>
                    <wait>${cargo.wait}</wait>
                    <container>
                        <containerId>${cargo.container}</containerId>
                        <log>${project.build.directory}/${cargo.container}/cargo.log</log>
                        <zipUrlInstaller>
                            <url>${cargo.container.url}</url>
                            <installDir>${installDir}</installDir>
                        </zipUrlInstaller>
                    </container>
                    <configuration>
                        <home>${project.build.directory}/${cargo.container}/container</home>
                        <properties>
                            <cargo.hostname>${cargo.host}</cargo.hostname>
                            <cargo.servlet.port>${cargo.port}</cargo.servlet.port>
                        </properties>
                        <!-- Deploy as ROOT since XHR requests are made to /product-services -->
                        <deployables>
                            <deployable>
                                <properties>
                                    <context>ROOT</context>
                                </properties>
                            </deployable>
                        </deployables>
                    </configuration>
                </configuration>
                <executions>
                    <execution>
                        <id>start-container</id>
                        <phase>pre-integration-test</phase>
                        <goals>
                            <goal>start</goal>
                        </goals>
                    </execution>
                    <execution>
                        <id>stop-container</id>
                        <phase>post-integration-test</phase>
                        <goals>
                            <goal>stop</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>selenium-maven-plugin</artifactId>
                <version>1.0</version>
                <executions>
                    <execution>
                        <phase>pre-integration-test</phase>
                        <goals>
                            <goal>start-server</goal>
                        </goals>
                        <configuration>
                            <background>true</background>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <artifactId>maven-surefire-plugin</artifactId>
                <executions>
                    <execution>
                        <phase>integration-test</phase>
                        <goals>
                            <goal>test</goal>
                        </goals>
                        <configuration>
                            <excludes>
                                <exclude>none</exclude>
                            </excludes>
                            <includes>
                                <include>**/*SeleniumTest.java</include>
                            </includes>
                            <systemProperties>
                                <property>
                                    <name>selenium.browser</name>
                                    <value>${selenium.browser}</value>
                                </property>
                                <property>
                                    <name>cargo.port</name>
                                    <value>${cargo.port}</value>
                                </property>
                            </systemProperties>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</profile>

This profile is run by our Bamboo nightly tests with mvn install -Pitest-bamboo. The 2nd profile I added doesn't install the project's WAR, but instead installs the two separate WARs. Running mvn install -Pitest-bamboo,multiple-wars executes the Selenium tests against the multi-WAR distribution.

<profile>
    <id>multiple-wars</id>
    <activation>
        <activeByDefault>false</activeByDefault>
    </activation>
    <build>
        <plugins>
            <plugin>
                <groupId>org.codehaus.cargo</groupId>
                <artifactId>cargo-maven2-plugin</artifactId>
                <version>${cargo.version}</version>
                <configuration>
                    <configuration>
                        <home>${project.build.directory}/${cargo.container}/container</home>
                        <properties>
                            <cargo.hostname>${cargo.host}</cargo.hostname>
                            <cargo.servlet.port>${cargo.port}</cargo.servlet.port>
                        </properties>
                        <deployables>
                            <deployable>
                                <groupId>com.company.app</groupId>
                                <artifactId>product-client</artifactId>
                                <pingURL>http://${cargo.host}:${cargo.port}/product-client/index.html</pingURL>
                                <type>war</type>
                                <properties>
                                    <context>/product-client</context>
                                </properties>
                            </deployable>
                            <deployable>
                                <groupId>com.company.app</groupId>
                                <artifactId>product-services</artifactId>
                                <pingURL>
                                    http://${cargo.host}:${cargo.port}/product-services/index.jspx
                                </pingURL>
                                <type>war</type>
                                <properties>
                                    <context>/product-services</context>
                                </properties>
                            </deployable>
                        </deployables>
                    </configuration>
                </configuration>
            </plugin>
        </plugins>
    </build>
</profile>

I won't be including any information on authoring Selenium tests because there are already many good references. I encourage you to check out the following if you're looking for Selenium testing techniques.

Summary
This article has shown you how I used Maven, war overlays and the UrlRewriteFilter to create different distributions of a SOFEA application. I'm still not sure which packaging mechanism (1 WAR vs. 2) is best, but it's nice to know there are options. If you package and distribute SOFEA applications, I'd love to hear about your experience in this area.

Posted in Java at Oct 06 2009, 01:17:38 AM MDT 2 Comments

Building GWT Applications with MVP and Issues with Overlay Types

MVP has recently become a popular strategy for structuring GWT applications. This is largely due to its testability and Ray Ryan's Best Practices For Architecting Your GWT App from this year's Google I/O. GWT, by itself, is simply a widget toolkit and doesn't ship with any sort of MVC (or MVP) framework.

On my current project, we're using GXT, a GWT widget library based on ExtJS. It has its own MVC framework, but that framework has very little documentation and can be confusing when used with GWT's History management. At one point, I attempted to make it more understandable by writing a blog entry on GXT's MVC Framework.

One of my initial assignments was to decide if we should use MVP or MVC. Regardless of which one was chosen, I was also tasked with deciding if we should use an existing framework or write our own. After watching Ray Ryan's session on YouTube and recalling my frustration with GXT MVC on my last project, I quickly became convinced MVP was the answer.

To test my "MVP is best for our project" theory, I did a spike to implement it. I used the GWT MVP Example tutorial as a starting point and added the following libraries to my project.

Implementing the MVP pattern itself was relatively straightforward, but I did encounter a few issues. I'm writing this post to see if anyone has solved these issues.

MVP Implementation Issues
The first issue I ran across was that GXT's widgets don't implement standard GWT interfaces. To try and figure out a solution, I posted the following on Twitter:

"Wondering if it's possible to do MVP with GXT since it's buttons don't implement standard GWT interfaces."

The best response I received was from Simon Stewart (founder of the WebDriver project, works for Google):

"Put the GXT buttons in the View. Let that turn DOM events into semantic events."

He also pointed me to his tdd-gwt-gae project which shows many techniques for unit testing (with jMock) and integration testing (with GWTTestCase). Using Simon's examples, I was able to determine an initial strategy for implementing MVP with GXT.

My strategy is that instead of getting widgets from the view and adding handlers to them, you add handlers to the view and it takes care of attaching them to the widgets that should listen for them. This seems to work, but my View interface has a lot of void methods, which is a bit different from the standard MVP patterns I've seen.
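
To make that concrete, a login view in this style might look something like the sketch below (names are hypothetical; the point is that the presenter only sees semantic methods, never GXT widgets):

// Sketch of a view interface where handlers are added to the view, not to widgets.
public interface LoginView {

    // small callback type so the presenter has no dependency on GXT listener classes
    interface ClickCallback {
        void onClick();
    }

    void addLoginClickCallback(ClickCallback callback);

    void setErrorMessage(String message);

    String getUsername();

    String getPassword();
}

The GXT implementation of addLoginClickCallback() would attach whatever listener its Button uses and simply delegate to the callback, turning a GXT-specific event into a semantic one, as Simon suggested.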

The 2nd issue I encountered is with unit testing. Unit testing can be performed on MVP applications by mocking out any dependencies that use JSNI. Classes that use JSNI are not testable with plain ol' JUnit and typically require you to use GWTTestCase, which can be slow and cumbersome. This isn't to say that GWTTestCase isn't useful, just that it has its place.

When unit testing MVP applications, the recommended practice seems to be you should test presenters and not views. Services fall into a similar "don't unit test" category because they'll need to connect to the server-side, which won't be running in a unit testing environment.

This is where I ran into a major issue that I don't have a solution for.

I'm using Overlay Types (as described in JSON Parsing with JavaScript Overlay Types) to do JSON parsing in callbacks. Since Overlay Types use JSNI, it's not possible to do any JSON parsing in unit tests. The problem with not being able to do any JSON parsing is the callbacks will often call eventBus.fireEvent(GwtEvent) after the JSON parsing has happened. This means I can't fully test the flow of a presenter if event firing happens in a callback.

In an attempt to try different mocking techniques for callbacks, I created a test that uses two recommended EasyMock-based strategies. The first is a "CallbackSuccessMatcher" and is described in more detail in Testing GWT without GwtTestCase. The second technique uses a "CallbackMockSupport" class to allow EasyMock expectations such as expectLastCallAsync() and expectLastCallAsyncSuccess(T). You can read more about this technique in Test driven development for GWT UI code with asynchronous RPC.

Both of these examples use RPC, which typically has callbacks that have an onSuccess(T type) method. It's easy to use these callbacks in unit tests since T is a POJO and the onSuccess() method contains no JSNI code.

Currently, I see a few possible solutions to this problem:

  • Figure out a way to detect when unit tests are running and add if/else logic to callbacks.
  • Modify presenters and services so a callback can be set that is unit test-friendly.
  • Make JSOModel an interface that can be replaced/mocked in tests.

The last solution seems like the best one, but I'm also curious to know what others are doing. My hunch is that most GWT apps use RPC and haven't run into this issue.
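
As a rough illustration of that last option (a sketch with hypothetical names, not something I've implemented yet), the model could be split into an interface with a JSNI-backed implementation for production and a map-backed one for plain JUnit tests:

import java.util.HashMap;
import java.util.Map;

import com.google.gwt.core.client.JavaScriptObject;

// Sketch: JSOModel as an interface so presenters and services can be unit tested without JSNI.
public interface JSOModel {
    String get(String key);
}

// Production implementation: wraps a JavaScriptObject and reads properties via JSNI,
// so it only works in hosted mode or GWTTestCase.
class JsoBackedModel implements JSOModel {
    private final JavaScriptObject jso;

    JsoBackedModel(JavaScriptObject jso) {
        this.jso = jso;
    }

    public String get(String key) {
        return getProperty(jso, key);
    }

    private native String getProperty(JavaScriptObject obj, String key) /*-{
        return obj[key] == null ? null : String(obj[key]);
    }-*/;
}

// Test implementation: a plain HashMap, usable from ordinary JUnit tests and mocks.
class MapBackedModel implements JSOModel {
    private final Map<String, String> values = new HashMap<String, String>();

    public MapBackedModel set(String key, String value) {
        values.put(key, value);
        return this;
    }

    public String get(String key) {
        return values.get(key);
    }
}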

Posted in Java at Sep 22 2009, 01:41:36 PM MDT 17 Comments

Concurrency on the JVM Using Scala with Venkat Subramaniam

This evening, I attended the Denver JUG, where Venkat Subramaniam was speaking about Scala. Unfortunately, I arrived halfway through his Programming Scala talk and didn't get a chance to learn as much as I wanted to. What I did see made Scala look very powerful and (possibly) easier to learn than Java. Below are my notes from Venkat's talk.

Concurrency is important these days because we're in a world of multiple processors. When you have multiple threads running at one time, it can become painful. Before Java, you had to learn the API for multi-threading for each different platform. With Java's "Write once, debug everywhere", you only had to learn one API. Unfortunately, it's pretty low level: how to start a thread, manage it, stop it, etc. You also have to remember where to put synchronize in your code.

With Scala, immutability and its Actors make it easy to program concurrent systems. For example, here's some code that calls a web service to retrieve stock prices in sequential order:

def getYearEndClosing(symbol : String, year : Int) = {
  val url = "http://ichart.finance.yahoo.com/table.csv?s=" + 
    symbol + "&a=11&b=01&c" + year + "&d=11&e=31&f=" + year + "&g=m"
  val data = io.Source.fromURL(url).mkString
  val price = data.split("\n")(1).split(",")(4).toDouble
  Thread.sleep(1000); // slow down internet
  (symbol, price)
}

val symbols = List("APPL", "GOOG", "IBM", "JAVA", "MSFT")

val start = System.nanoTime

val top = (("", 0.0) /: symbols) { (topStock, symbol) => 
  val (sym, price) = getYearEndClosing(symbol, 2008)

  if (topStock._2 < price) (sym, price) else topStock
}

val end = System.nanoTime

println("Top stock is " + top._1 + " with price " + top._2)
println("Time taken " + (end - start)/10000000000.0)

To make this concurrent, we create Actors. Actors are nothing but Threads with a built-in message queue. Actors allow spawning separate threads to retrieve each stock price. Instead of doing:

symbols.foreach { symbol => 
  getYearEndClosing(symbol, 2008)
}

You can add actors:

val caller = self

symbols.foreach { symbol => 
  actor { caller ! getYearEndClosing(symbol, 2008) }
}

Then remove val (sym, price) = getYearEndClosing(symbol, 2008) and replace it with:

  receive {
    case(sym: String, price: Double) =>
      if (topStock._2 < price) (sym, price) else topStock
  }

After making this change, the time to execute the code dropped from ~7 seconds to ~2 seconds. Also, since nothing is mutable in this code, you don't have to worry about concurrency issues.

With Scala, you don't suffer from Java's single-inheritance limitations; instead, you can use Traits to do mixins. For example:

import scala.actors._
import Actor._

class MyActor extends Actor {
  def act() {
    for (i <- 1 to 3) {
      receive {
        case msg => println("Got " + msg)
      }
    }
  }
}

When extending Actor, you have to call start() on the actor to start it. Writing actors this way is not recommended (not sure why; my guess is because you have to manually start them).

Venkat is now showing an example that counts prime numbers and he's showing us how it pegs the CPU when counting how many exist between 1 and 1 million (78,499). After adding actor and receive logic, he shows how his Activity Monitor shows 185% CPU usage, indicating that both cores are being used.

What happens when one of the threads crashes and burns? The receive will wait forever. Because of this, using receive is a bad idea. It's much better to use receiveWithin(millis) to set a timeout. Then you can catch the timeout in the receiveWithin block using:

case TIMEOUT => println("Uh oh, timed out")

A more efficient way to use actors is using react instead of receive. With react, threads leave after putting the message on the queue and new threads are started to execute the block when the message is "reacted" to. One thing to remember with react is any code after the react block will never be executed. Just like receiveWithin(millis), you can use reactWithin(millis) to set a timeout.

The major thing I noticed between receive and react is Venkat often had to change the method logic to use react. To solve this, you can use loop (or better yet, loopWhile(condition)) to allow accessing the data outside the react block. In conclusion, reactWithin(millis) is best to use, unless you need to execute code after the react block.

Conclusion
This was a great talk by Venkat. He used TextMate the entire time to author and execute all his Scala examples. Better yet, he never used any sort of presentation. All he had was a "todo" list with topics (that he checked off as he progressed) and a sample.scala file.

Personally, I don't plan on using Scala in the near future, but that's mostly because I'm doing UI development and GWT and JavaScript are my favorite languages for that. On the server-side, I can see how it reduces the amount of Java you need to write (the compiler works for you instead of you working for the compiler). However, my impression is its sweet spot is when you need to easily author an efficient concurrent system.

If you're looking to learn Scala, I've heard Scala by Example (PDF) is a great getting-started resource. From there, I believe Programming in Scala and Venkat's Programming Scala are great books.

Posted in Java at Sep 09 2009, 09:12:48 PM MDT 8 Comments

My Experience with Java REST Frameworks (specifically Jersey and CXF)

Recently I was tasked with developing a server-side REST strategy for a client. When I started working with them, they were using GWT RPC for exposing services. They wanted to move to RESTful services to allow for a more open platform that multiple types of clients could talk to. They were interested in supporting SOAP and GWT RPC as well, but it wasn't required. They are using Spring, a custom namespace (for easily creating remote services) and an HTML documentation generator to expose their API in human-readable form.

When I first started developing, I chose to try Enunciate. From Enunciate's homepage:

Enunciate is an engine for creating, maintaining, and deploying your rich Web service API on the Java platform. If that sounds complicated, it's not. All you have to do is define your service interfaces in Java source code. ... Then invoke Enunciate.

Sounds pretty sweet, eh? At first glance, the things I liked about Enunciate were:

  1. The ability to generate multiple endpoints (and clients).
  2. Generates nice-looking documentation.
  3. Allows selecting different frameworks (for example, CXF instead of JAX-WS RI).

Initially, the hardest part of using Enunciate was integrating it into my project. This was somewhat related to having a multi-module project, but more so related to getting the settings right in my enunciate.xml file. After getting everything working, I encountered a few Spring wiring issues with the GWT Endpoints and with Jersey's Spring support.

The good news is I believe most of these issues were related to my project and how it proxies Spring beans that Jersey couldn't find. Jersey's Spring support only supports annotations at this time, so I was unable to tell it the proxied bean names via XML (or even its own annotations). I'm sure this problem is solvable, but after struggling with it for a day or two, I decided to give up on Enunciate. I had to get something working, and fast.

At this point, I was back to the drawing board. I knew there were plenty of good Java REST frameworks available, but Spring and CXF (for SOAP) were already available dependencies in my project, so I chose that route. I was able to get something up and running fairly quickly with this tutorial and CXF's JAX-RS documentation. I ran into an Invalid JSON Namespace issue, but was able to solve it by adding a custom namespace.

It's not all rosy, though; there are still a couple of CXF issues I haven't solved:

While CXF does take a bit of XML for each service, it's kinda slick in that it only requires you to annotate your service interfaces. It also generates documentation for your services (WSDL and WADL), but it's not as pretty as Enunciate's. To solve this, I added Enunciate for documentation only. To make Enunciate work, I did have to add some annotations to my service implementation classes and parse the generated HTML to fix some links to CXF's services, but ultimately it works pretty well.
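
For those who haven't seen CXF's JAX-RS support, an annotated service interface can be as simple as the sketch below (a generic example with a hypothetical User DTO, not my client's actual API); the implementation class just implements the interface and stays free of REST-specific code:

import javax.ws.rs.DELETE;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;

// User is a hypothetical DTO; only the interface carries the JAX-RS annotations.
@Path("/users")
@Produces("application/json")
public interface UserService {

    @GET
    @Path("/{id}")
    User get(@PathParam("id") Long id);

    @POST
    User save(User user);

    @DELETE
    @Path("/{id}")
    void delete(@PathParam("id") Long id);
}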

In the end, I recommended my client not use Enunciate for generating endpoints. This was primarily related to their unique Spring configuration, but also because I was able to easily use the same classes for REST, SOAP and GWT Endpoints. However, I will continue to keep my eye on Enunciate. It could be very useful for the upcoming appfuse-ws archetype. I'm also eager to see better GWT support and the ability to generate Overlay types in future releases.

Posted in Java at Aug 27 2009, 02:20:51 PM MDT 15 Comments

My attempt at browser-based username/password autocomplete with GWT

Last night, I did a quick spike to try and implement username/password autocomplete in my GXT application. By "autocomplete", I don't mean Ajax-style autocomplete, but rather browser-based autocomplete. The best information I found on doing this is in the following post:

http://osdir.com/ml/GoogleWebToolkit/2009-04/msg01838.html

I didn't use this technique (wrapping an existing form with a FormPanel) because I'm using GXT and didn't want to lose the look-and-feel of my login form.

I was successful in getting everything to work in Firefox (it populates both the username and password). In IE, it only populates the username, not the password. In Safari/Chrome, it doesn't work at all. Here's how I did it:

1. Created a hidden HTML form on my HTML page that embeds GWT.

<form method="post" action="javascript:void(0)" style="display: none">
    <input type="text" id="username" name="username" value=""/>
    <input type="password" id="password" name="password" value=""/>
    <input type="submit" value="Login" id="login"/>
</form>

2. When a user clicks on the "Login" button in my GWT application, populate the fields in this hidden form and "click" on the Login button (which will do nothing since the action="javascript:void(0)").

// Set the hidden fields to trigger the browser to remember
DOM.getElementById("username").setAttribute("value", username.getValue());
DOM.getElementById("password").setAttribute("value", password.getValue());
clickFormLogin();

...

public static native void clickFormLogin() /*-{
$doc.getElementById("login").click();
}-*/;

This works in Firefox 3.5 and prompts me to save the user/pass at the top of the screen. Why doesn't this work in Safari/Chrome? My guess is because the form's action doesn't go anywhere and the form is not submitted. If I change the action to be an actual URL and show the form, clicking on the form's Login button will save it in those browsers.

Grabbing the actual populated values when the screen loads did turn out to be a bit tricky. With IE, I was able to grab the username, but not the password. It seems that keyboard or mouse interaction (tabbing off or focusing/blurring) with the username field is necessary to get the password field populated. In Firefox, I was unable to grab the values if I did it straight away. The solution turned out to be wrapping everything in a Timer and waiting a 1/2 second.

// grab user/pass from browser and hidden login form (if user has stored them)
// runs 1/2 second after page loads so cursor goes at end of username and works in Firefox
Timer t = new Timer() {
    public void run() {
        username.setValue(getElementValue("username"));
        password.setValue(getElementValue("password"));
        if (username.getValue() != null) {
            password.focus();
        }
        validate();
    }
};

t.schedule(500);

...

public static native String getElementValue(String domId) /*-{
    return $doc.getElementById(domId).value;
}-*/;

While I'm glad I got it working in Firefox, I'm disappointed with IE's lack of password autocompletion. More than anything, I can't help but think there's a way to make this work in WebKit-based browsers. If you've successfully implemented cross-browser username/password autocomplete in GWT, please share your experience. If you share your experience about successfully doing it in Safari/Chrome, I might even buy you a beer or two. ;-)

Posted in Java at Aug 07 2009, 08:09:11 AM MDT 2 Comments

Integrating GWT with Spring Security

Yesterday, I wrote about How to do cross-domain GWT RPC with a ProxyServlet. Today I'll be discussing how to modify the ProxyServlet to authenticate with Spring Security. For the application I'm working on, the ProxyServlet is only used in development (when running GWT's hosted mode) and isn't necessary when deploying the client and server on the same server. Using the ProxyServlet allows cross-domain requests so you can run GWT in hosted mode and talk to your backend running on another server. This setup can be especially handy in that you can easily point your hosted client at different backends (for example, if you have testing and staging environments).

In this example, the backend application is a JSF/Spring application that has Spring Security wired in to protect services with both Basic and Form-based authentication. Basic authentication kicks in if an "Authorization" header is sent; otherwise, Form-based authentication is used. Here's the Spring Security context file that makes this happen:

<?xml version="1.0" encoding="UTF-8"?>

<beans:beans xmlns="http://www.springframework.org/schema/security"
             xmlns:beans="http://www.springframework.org/schema/beans"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xsi:schemaLocation="...">

    <http auto-config="true" realm="My Web Application">
        <intercept-url pattern="/faces/welcome.jspx" access="ROLE_USER"/>
        <intercept-url pattern="/*.rpc" access="ROLE_USER"/>
        <http-basic/>
        <form-login login-page="/faces/login.jspx" authentication-failure-url="/faces/accessDenied.jspx"
                    login-processing-url="/j_spring_security_check" default-target-url="/redirect.jsp"
                    always-use-default-target="true"/>
    </http>

    <authentication-provider>
        <user-service >
            <user name="admin" password="admin" authorities="ROLE_USER"/>
        </user-service>
    </authentication-provider>
</beans:beans>

The easiest way to configure your GWT application to talk to a Spring Security-protected resource is to protect the HTML page your GWT application is embedded in. This is the documented way to integrate GWT with Spring Security (ref: GWT's LoginSecurityFAQ, search for "Acegi"). This works well for production, but not for hosted-mode development.

Basic Authentication
To authenticate with Basic Authentication, you can use GWT's RequestBuilder and set an "Authorization" header that contains the user's (base64-encoded) credentials.

private class LoginRequest {
    public LoginRequest(RequestCallback callback) {
        String url = "/services/faces/welcome.jspx";

        RequestBuilder rb = new RequestBuilder(RequestBuilder.POST, url);
        rb.setHeader("Authorization", createBasicAuthToken());
        rb.setCallback(callback);
        try {
            rb.send();
        } catch (RequestException e) {
            Window.alert(e.getMessage());
        }
    }
}

protected String createBasicAuthToken() {
    byte[] bytes = stringToBytes(username.getValue() + ":" + password.getValue());
    String token = Base64.encode(bytes);
    return "Basic " + token;
}

protected byte[] stringToBytes(String msg) {
    int len = msg.length();
    byte[] bytes = new byte[len];
    for (int i = 0; i < len; i++)
        bytes[i] = (byte) (msg.charAt(i) & 0xff);
    return bytes;
}

To use this LoginRequest class, create it with a callback and look for a 401 response code to determine if authentication failed.

new LoginRequest(new RequestCallback() {
    public void onResponseReceived(Request request, Response response) {
        if (response.getStatusCode() != Response.SC_UNAUTHORIZED &&
                response.getStatusCode() != Response.SC_OK) {
            onError(request, new RequestException(response.getStatusText() + ":\n" + response.getText()));
            return;
        }

        if (response.getStatusCode() == Response.SC_UNAUTHORIZED) {
            Window.alert("You have entered an incorrect username or password. Please try again.");
        } else {
            // authentication worked, show a fancy dashboard screen
        }
    }

    public void onError(Request request, Throwable throwable) {
        Window.alert(throwable.getMessage());
    }
});

If your GWT application is included in the "services" war, everything should work at this point. However, if you try to log in with invalid credentials, your browser's login dialog will appear. To suppress this in the aforementioned ProxyServlet, you'll need to make a change in its executeProxyRequest() method so the "WWW-Authenticate" header is not copied.

// Pass the response code back to the client
httpServletResponse.setStatus(intProxyResponseCode);

// Pass response headers back to the client
Header[] headerArrayResponse = httpMethodProxyRequest.getResponseHeaders();
for (Header header : headerArrayResponse) {
    if (header.getName().equals("Transfer-Encoding") && header.getValue().equals("chunked") ||
            header.getName().equals("Content-Encoding") && header.getValue().equals("gzip") ||
            header.getName().equals("WWW-Authenticate")) { // don't copy WWW-Authenticate header
    } else {
        httpServletResponse.setHeader(header.getName(), header.getValue());
    }
}

I'm not sure how to suppress the browser prompt when not using the ProxyServlet. If you have a solution, please let me know.

Basic Authentication works well for GWT applications because you don't need additional logic to retain the authenticated state after the initial login. While Basic Authentication over SSL might offer a decent solution, the downside is you can't log out. Form-based Authentication allows you to log out.

Form-based Authentication

Before I show you how to implement form-based authentication, you should be aware that Google does not recommend this. Below is a warning from their LoginSecurityFAQ.

Do NOT attempt to use the Cookie header to transfer the sessionID from GWT to the server; it is fraught with security issues that will become clear in the rest of this article. You MUST transfer the sessionID in the payload of the request. For an example of why this can fail, see CrossSiteRequestForgery.

In my experiment, I didn't want to change the server-side Spring Security configuration, so I ignored this warning. If you know how to configure Spring Security so it looks for the sessionID in the payload of the request (rather than in a cookie), I'd love to hear about it. The upside of the example below is it should work with container-managed authentication as well.

The LoginRequest class for form-based authentication is similar to the previous one, except it has a different URL and sends the user's credentials in the request body.

private class LoginRequest {
    public LoginRequest(RequestCallback callback) {
        String url = "/services/j_spring_security_check";

        RequestBuilder rb = new RequestBuilder(RequestBuilder.POST, url);
        rb.setHeader("Content-Type", "application/x-www-form-urlencoded");
        rb.setRequestData("j_username=" + URL.encode(username.getValue()) +
                    "&j_password=" + URL.encode(password.getValue()));

        rb.setCallback(callback);
        try {
            rb.send();
        } catch (RequestException e) {
            Window.alert(e.getMessage());
        }
    }
}

If you deploy your GWT application in the same WAR your services are hosted in, this is all you'll need to do. If you're using the ProxyServlet, there are a couple of changes you'll need to make in order to set/send cookies when running in hosted mode.

First of all, you'll need to make sure you've configured the servlet to follow redirects (by subclassing or simply modifying its default). After that, add the following logic on line 358 (or just look for "if (followRedirects)") to expose the sessionID to the client. The most important part is setting the cookie's path to "/" so the client (running at localhost:8888) can see it.

if (followRedirects) {
    // happens on first login attempt
    if (stringLocation.contains("jsessionid")) { 
        Cookie cookie = new Cookie("JSESSIONID",
                stringLocation.substring(stringLocation.indexOf("jsessionid=") + 11));
        cookie.setPath("/");
        httpServletResponse.addCookie(cookie);
    // the following happens if you refresh your GWT app after already logging in once
    } else if (httpMethodProxyRequest.getResponseHeader("Set-Cookie") != null) {
        Header header = httpMethodProxyRequest.getResponseHeader("Set-Cookie");
        String[] cookieDetails = header.getValue().split(";");
        String[] nameValue = cookieDetails[0].split("=");

        Cookie cookie = new Cookie(nameValue[0], nameValue[1]);
        cookie.setPath("/");
        httpServletResponse.addCookie(cookie);
    }
    httpServletResponse.sendRedirect(stringLocation.replace(getProxyHostAndPort() +
            this.getProxyPath(), stringMyHostName));
    return;
}

Click here to see a screenshot of the diff of the ProxyServlet after this code has been added.

Figuring out that headers needed to be parsed after authenticating successfully and before redirecting was the hardest part for me. If you grab the JSESSIONID from the "Set-Cookie" header anywhere else, the JSESSIONID is one that hasn't been authenticated. While the login will work, subsequent calls to services will fail.

To make subsequent calls with the cookie in the header, you'll need to make an additional modification to ProxyServlet to send cookies as headers. First of all, add a setProxyRequestCookies() method:

/**
 * Retrieves all of the cookies from the servlet request and sets them on
 * the proxy request
 *
 * @param httpServletRequest The request object representing the client's
 *                            request to the servlet engine
 * @param httpMethodProxyRequest The request that we are about to send to
 *                                the proxy host
 */
@SuppressWarnings("unchecked")
private void setProxyRequestCookies(HttpServletRequest httpServletRequest, 
                                    HttpMethod httpMethodProxyRequest) {
    // Get an array of all the cookies sent by the client
    Cookie[] cookies = httpServletRequest.getCookies();
    if (cookies == null) {
        return;
    }
    
    for (Cookie cookie : cookies) {
        cookie.setDomain(stringProxyHost);
        cookie.setPath(httpServletRequest.getServletPath());
        httpMethodProxyRequest.setRequestHeader("Cookie", cookie.getName() +  
                "=" + cookie.getValue() + "; Path=" + cookie.getPath());
    }
}

Next, in the doGet() and doPost() methods, add the following line just after the call to setProxyRequestHeaders().

setProxyRequestCookies(httpServletRequest, getMethodProxyRequest);
 

After making these modifications to ProxyServlet, you can create LoginRequest and attempt to authenticate. To detect a failed attempt, I'm looking for text in Spring Security's "authentication-failure-url" page.

new LoginRequest(new RequestCallback() {

    public void onResponseReceived(Request request, Response response) {
        if (response.getStatusCode() != Response.SC_OK) {
            onError(request, new RequestException(response.getStatusText() + ":\n" + response.getText()));
            return;
        }
        
        if (response.getText().contains("Access Denied")) {
            Window.alert("You have entered an incorrect username or password. Please try again.");
        } else {
            // authentication worked, show a fancy dashboard screen
        }
    }

    public void onError(Request request, Throwable throwable) {
        Window.alert(throwable.getMessage());
    }
});

After making these changes, you should be able to authenticate with Spring Security's form-based configuration. While this example doesn't show how to log out, it should be easy enough to do by 1) deleting the JSESSIONID cookie or 2) calling the logout URL you have configured in your services WAR.
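
For what it's worth, here's a minimal sketch of what that might look like (it assumes Spring Security's default /j_spring_security_logout URL, which auto-config sets up, and the same /services context used above; I haven't wired this into the sample app):

import com.google.gwt.http.client.Request;
import com.google.gwt.http.client.RequestBuilder;
import com.google.gwt.http.client.RequestCallback;
import com.google.gwt.http.client.RequestException;
import com.google.gwt.http.client.Response;
import com.google.gwt.user.client.Cookies;
import com.google.gwt.user.client.Window;

public class LogoutHelper {

    public static void logout() {
        // Option 1: drop the session cookie on the client side.
        Cookies.removeCookie("JSESSIONID");

        // Option 2: hit Spring Security's logout filter so the server-side
        // session gets invalidated as well.
        RequestBuilder rb = new RequestBuilder(RequestBuilder.GET, "/services/j_spring_security_logout");
        rb.setCallback(new RequestCallback() {
            public void onResponseReceived(Request request, Response response) {
                // return the user to the login screen
            }

            public void onError(Request request, Throwable throwable) {
                Window.alert(throwable.getMessage());
            }
        });
        try {
            rb.send();
        } catch (RequestException e) {
            Window.alert(e.getMessage());
        }
    }
}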

Hopefully this how-to gives you enough information to configure your GWT application to talk to Spring Security without modifying your existing backend application. It's entirely possible that Spring Security offers a more GWT-friendly authentication mechanism. If you know of a better way to integrate GWT with Spring Security, I'd love to hear about it.

Update on October 7, 2009: I did some additional work on this and got Remember Me working when using form-based authentication. I found I didn't need as much fancy logic in my ProxyServlet and was able to reduce the "followRedirects" logic to the following:

if (followRedirects) {
    if (httpMethodProxyRequest.getResponseHeader("Set-Cookie") != null) {
        Header[] headers = httpMethodProxyRequest.getResponseHeaders("Set-Cookie");
        if (headers.length == 1) {
            extractCookieFromHeader(httpServletResponse, headers[0]);
        } else {
            // ignore the first header since there always seem to be two jsessionid headers
            // and the 2nd is the valid one
            for (int i = 1; i < headers.length; i++) {
                extractCookieFromHeader(httpServletResponse, headers[i]);
            }
        }
    }
    httpServletResponse.sendRedirect(
            stringLocation.replace(getProxyHostAndPort() + getProxyPath(), stringMyHostName));
    return;
}

I was also able to remove the setProxyRequestCookies() method completely as it no longer seems necessary.

Next, I'd like to figure out how to make Spring Security more Ajax-friendly where it can read an authentication token in the request body or header (instead of from a cookie). Also, it'd be sweet if I could convince it to return error codes instead of the login page (for example, when a certain header is present).

Posted in Java at Aug 06 2009, 08:50:15 AM MDT 10 Comments