A few weeks ago (January 11th to be exact), I attended a Denver JUG meeting where Scott Davis spoke about Real World Mapping (download presentation) and Holistic Testing (download). Below are my notes from the event.
The first-generation mapping web application was MapQuest. Second-generation mapping webapps include Google Maps (now renamed to Google Local) and MSN Virtual Earth (now renamed to Windows Live Local). The funny thing is MapQuest had satellite images way back when, but they dropped them because no one was interested.
I like how googling for "msn virtual earth" brings up earth.google.com as the sponsored link at the top.
MapQuest - simple example from the oldest player in the map space. The URL of the query results gives us our first glimpse into building a custom application. For example:
http://www.mapquest.com/maps/map.adp?address=1600+Pennsylvania+Ave+Nw&city=Washington&state=DC
Is using MapQuest and building URLs on your site hacking? It might be considered hacking, but they encourage you to do it by publishing a howto. Once you're familiar with the basics, you can begin to customize the URL (see the sketch after this list), for example:
Add a title: title=The+White+House
Change the zoom level: zoom=9 (0-10, 0 == world)
Even use good ol' latitude/longitude: latlongtype=decimal&latitude=38.8977&longitude=-77.0365 (the White House)
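Just for kicks, here's a minimal Java sketch of gluing those parameters together. The parameter names are the ones from the talk; the class and method names are just mine:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

/**
 * A minimal sketch of building a customized MapQuest URL from the
 * parameters above. Only the parameter names (address, city, state,
 * title, zoom) come from the talk; everything else is illustrative.
 */
public class MapQuestUrlBuilder {

    private static final String BASE = "http://www.mapquest.com/maps/map.adp";

    public static String buildUrl(String address, String city, String state,
                                  String title, int zoom) throws UnsupportedEncodingException {
        StringBuilder url = new StringBuilder(BASE);
        url.append("?address=").append(URLEncoder.encode(address, "UTF-8"));
        url.append("&city=").append(URLEncoder.encode(city, "UTF-8"));
        url.append("&state=").append(URLEncoder.encode(state, "UTF-8"));
        url.append("&title=").append(URLEncoder.encode(title, "UTF-8"));
        url.append("&zoom=").append(zoom); // 0-10, 0 == world
        return url.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(buildUrl("1600 Pennsylvania Ave NW", "Washington", "DC",
                "The White House", 9));
    }
}
```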
Why use Lat/Long?
While addresses are more user-friendly, lat/long coordinates are more precise. All map services must "geocode" addresses into lat/long anyway in order to show the location on the map.
Did you know that Google is a huge dictionary? Just use "define: geocode". So how do you find your Lat/Long? There are a couple of simple ways to geocode a street address for free:
- Option #1: URL harvesting from http://maps.msn.com - when you submit an address, the resulting URL has the lat/long in it (in the "C" parameter)
- Option #2: Use http://geocoder.us to find the latitude & longitude of any US address - for free. The US Gov't provides the data; they just display it. If you want to use the website in your application, you'll have to screen-scrape to get the data.
Geocoder.us also offers a web services interface - RESTful, based on URLs. You query for an address and it returns XML. There are also SOAP, XML-RPC and JSON interfaces.
RESTful Web Services: If your GET request returns XML instead of HTML, it is called a "RESTful Web Service". To use REST in Java, you can use the HttpClient package from Jakarta Commons.
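Here's a rough Java sketch of what that might look like against geocoder.us, using the Commons HttpClient library Scott mentioned. The exact endpoint path is my assumption, so check the geocoder.us docs before trusting it:

```java
import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.methods.GetMethod;

/**
 * A rough sketch of calling the geocoder.us REST interface with
 * Jakarta Commons HttpClient. The endpoint path and response handling
 * are assumptions on my part - consult the geocoder.us documentation.
 */
public class GeocoderClient {

    public static String geocode(String address) throws Exception {
        HttpClient client = new HttpClient();
        GetMethod get = new GetMethod(
                "http://rpc.geocoder.us/service/rest/geocode?address=" +
                java.net.URLEncoder.encode(address, "UTF-8"));
        try {
            client.executeMethod(get);
            // The service returns XML containing the lat/long; parse it with
            // your favorite XML API (here we just return the raw response).
            return get.getResponseBodyAsString();
        } finally {
            get.releaseConnection();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(geocode("1600 Pennsylvania Ave NW, Washington DC"));
    }
}
```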
Yahoo Maps
Yahoo! Maps offers the same level of hackability as MapQuest. The parameter names change, but remain conceptually identical. They use "mag" instead of "zoom", "addr" instead of "address", etc. One thing Yahoo! can do that MapQuest can't is allow you to display your own data on the map (more at http://developer.yahoo.net/maps/).
For example, the URL below displays cities on the NFJS tour:
http://api.maps.yahoo.com/Maps/V1/annotatedMaps?appid=nfjs_cities&xmlsrc=http://www.davisworld.org/nfjs.xml. The appid must be registered with Yahoo!, but it's free and the process is fast. The xmlsrc parameter points to an XML file describing your map (most of the name/value GET parameters are pushed into the XML...). Nice! The document linked to by xmlsrc is actually an RSS feed.
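A tiny Java sketch of assembling that annotated-map URL - the appid and xmlsrc values below are just the ones from Scott's example; you'd swap in your own registered appid and your own RSS/XML file of map annotations:

```java
/**
 * A tiny sketch of assembling the Yahoo! annotated-map URL described
 * above. The appid and xmlsrc values are placeholders taken from the
 * example in the talk.
 */
public class YahooAnnotatedMapUrl {

    public static String buildUrl(String appid, String xmlsrc) {
        return "http://api.maps.yahoo.com/Maps/V1/annotatedMaps" +
               "?appid=" + appid +
               "&xmlsrc=" + xmlsrc; // an RSS feed describing your map points
    }

    public static void main(String[] args) {
        System.out.println(buildUrl("nfjs_cities", "http://www.davisworld.org/nfjs.xml"));
    }
}
```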
Web 2.0
The Yahoo! Maps Web Service is getting very close to what the pundits are calling "Web 2.0". Tim O'Reilly calls it "Web 2.0," Jonathan Schwartz calls it "The Participation Age".
What is Web 2.0?
While there are no hard and fast definitions, the same basic ideas come up over and over:
- Favor interactive web applications over static/passive web pages
- Favor data/web services over fixed presentation/HTML
- Allow/encourage your content to be integrated into other applications
2/3rds of things bought and sold on eBay are done through web services - not through their web interface.
Google Maps
Google Maps does have Web 1.0 functionality, with query parameters. However, unlike Yahoo and MapQuest, their parameter names are only one character long. For example, z == "Zoom", t == "Type" (m=map, k=image, h=hybrid map/image). The reason it's "k" for images is that Google bought Keyhole. It's now called Google Earth, and apparently the OS X version came out today.
Google also supports gazetteering - "natural language" queries. For example, you can search for "colorado rockies denver co" and it'll bring up a picture of Coors Field.
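For the record, here's a quick Java sketch of building one of those terse Google Maps URLs. The t and z parameters are the ones from the talk; the q parameter, the base URL and the zoom value are my assumptions about how the query itself gets passed:

```java
/**
 * A quick sketch of Google Maps' terse "Web 1.0" parameters as described
 * in the talk: t = type (m = map, k = satellite image, h = hybrid),
 * z = zoom. The q parameter and base URL are assumptions on my part.
 */
public class GoogleMapsUrl {

    public static String buildUrl(String query, char type, int zoom) throws Exception {
        return "http://maps.google.com/maps?q=" +
               java.net.URLEncoder.encode(query, "UTF-8") +
               "&t=" + type +   // m, k or h
               "&z=" + zoom;    // zoom level is just a guess here
    }

    public static void main(String[] args) throws Exception {
        // "colorado rockies denver co" should land you on Coors Field
        System.out.println(buildUrl("colorado rockies denver co", 'h', 15));
    }
}
```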
But what moves Google squarely into Web 2.0 territory is their Maps API. Using Ajax and Google's API, you can develop your own location-based services using Google Maps. To use Google's API, you have to sign up for a key, but then you become a JavaScript programmer.
Now Scott is showing some JavaScript - two lines of JavaScript code and one line of HTML to get a map. The bare-bones code gives you a map w/o controls, and you can programmatically add them. There's an API for adding overlays, and you can even control whether the drop-shadow for the popup shows up. I wasn't aware the popup window casts a shadow on the underlying map - that's pretty cool. Finally, you can respond to user events by adding event listeners to the different user actions.
Google now has competition - MSN Virtual Earth, now called Windows Live Local. There is a website devoted to using Microsoft's API, but looking at Scott's example, the Google API is much cleaner. Interestingly enough, Microsoft's also works best in IE.
If you want to learn more about this stuff, I encourage you to check out Scott's Google Maps API book.
In the "Main Event", Scott talked about a whirlwind of open-source testing tools, including: JUnit, Cobertura, Groovy, EasyMock, HttpUnit, DbUnit and JunitPerf.
Before this talk, I'd never heard of Cobertura, but I have used jcoverage and EMMA. After seeing the talk, and seeing that Mike Clark endorses Cobertura, I should probably check it out. However, since there's already an AppFuse + EMMA HowTo, maybe I should just stick with that.
The main benefit of unit testing is that it facilitates change - unit tests are a safety net. Other benefits:
- Simplifies integration - if you "trust" the individual pieces, then using them in the larger picture is much easier.
- Makes your code "self-documenting" - rather than relying on Word documents that go stale immediately, unit tests are an always up-to-date reflection of the code base.
- Enforces a loosely-coupled architecture - because your code is being consumed by two different entities (the unit tests and the application), best practices naturally occur, for example coding to interfaces instead of implementations (see the sketch below).
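To make that "coding to interfaces" point concrete, here's a small, hypothetical JUnit sketch - the GeocodingService interface and the stub are made up, but they show how consuming code through an interface keeps the real implementation swappable:

```java
import junit.framework.TestCase;

/**
 * A small JUnit sketch of the "coding to interfaces" idea above.
 * GeocodingService and its stub are hypothetical - the point is that
 * because the test consumes the code through an interface, the real
 * implementation (screen-scraping, web service, whatever) stays
 * swappable and loosely coupled.
 */
public class GeocodingServiceTest extends TestCase {

    /** Hypothetical interface the application codes against. */
    interface GeocodingService {
        double[] geocode(String address);
    }

    /** A canned stub implementation used only by the test. */
    static class StubGeocodingService implements GeocodingService {
        public double[] geocode(String address) {
            return new double[] {38.8977, -77.0365}; // the White House
        }
    }

    public void testGeocodeReturnsLatAndLong() {
        GeocodingService service = new StubGeocodingService();
        double[] latLong = service.geocode("1600 Pennsylvania Ave NW, Washington DC");
        assertEquals(2, latLong.length);
        assertEquals(38.8977, latLong[0], 0.0001);
    }
}
```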
At this point, I decided to stop taking notes. Thanks for the great talks Scott!
Ted Husted gives a nice review of what's happening at Apache with Struts and Roller. Pretty cool to see him offering Struts Training as well. I wonder if most Struts 1.x users will upgrade to Struts Action 2.0 (a.k.a. WebWork)?