Over the past year, publicly available government information has been appearing all over the Web, on Google Maps.

When the search giant posted its own mapping Web service, it did something unusual: it published the application programming interfaces (the code that delivers maps, pointers and associated features) on the Web page. In effect, Google was offering its mapping program to third-party Web sites and programs. Creative Web designers could use the maps to locate relevant public information, with no expensive geographic information system needed.

One individual, for instance, mapped global earthquake readings from the U.S. Geological Survey. Adrian Holovaty, a Chicago-based Internet developer, created ChicagoCrime.org, which uses police data feeds to map where crime is happening in the city, parsing the information by ZIP code, ATMs, gas stations and any number of other ways. (Holovaty also works for Washingtonpost.Newsweek Interactive, a division of the Washington Post Co., which owns GCN.) The raw information gets downloaded from an online database query service set up by the Chicago Police Department's Records Division.

The Google Maps fad is the most visible artifact of a larger trend toward tying together different programs and data sources over a network to build entirely new applications. These "mash-ups," as they're informally called, often make better use of existing data and can even replace expensive software altogether.

The Silicon Valley digerati call this Web 2.0, referring to how the Web can serve up not just pages but actual programming components. It has also become a prime example of service-oriented architecture in action, an approach that defines applications as loosely coupled business services that can be reused for different functions.

Kim Nelson, former CIO of the Environmental Protection Agency and now executive director of e-government for Microsoft Federal, sees the merits. "I think anybody would find it useful in terms of looking at various components and making decisions about what they could avoid building in the future because certain components already exist," she said.

Tom Owad, a developer in York, Pa., created the "Finding Subversives with Amazon Wishlists" Web site by tying together a number of services on the Web, including Google Maps. Although he studied computer science in college, "this project was not a difficult one at all," he told GCN in an e-mail. "I know a number of high school students who would have had no trouble pulling it off."

Although he completed the project to show how much information an intelligence agency could gather through public sources, it was also a great example of how such information might be harvested and processed without expensive commercial software. Here's how it works:

The Amazon.com site offers pages of people's wish lists: CDs, books and so forth. Owad found it would be possible to download all 260,000 wish lists using a simple Web page fetching command. He then came up with a list of tongue-in-cheek terms and book titles that might typify an extremist's list ("Build Your Own Laser, Phaser, Ion Ray Gun and Other Working Space Age Projects," "Torah," "Greenpeace") and wrote a script to capture the wish lists containing them. Because every wish list page includes the city and state of its creator, Owad could place those individuals on a map. Google Maps works only with latitude-and-longitude coordinates, though, so Owad pulled in another Web tool, the Ontok Geocoder service, to convert city and state information into coordinates. Ontok finds them using the public-domain TIGER/Line data available from the Census Bureau.
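Conceptually, the pipeline is short: fetch public wish-list pages, keep the ones that mention a watch-list term, then geocode the owner's city and state so the results can be plotted. The sketch below is a minimal illustration of that idea, not Owad's actual code; the URL patterns, the page-parsing expressions and the geocoding endpoint are hypothetical stand-ins.

```python
"""Minimal sketch of the wish-list mash-up pipeline described above.
The URLs, parsing patterns and geocoder endpoint are invented placeholders."""
import re
import urllib.parse
import urllib.request

# Tongue-in-cheek watch-list terms, as in the article.
WATCH_TERMS = ["build your own laser", "torah", "greenpeace"]

def fetch(url: str) -> str:
    """Grab a page with a simple Web fetch, as the article describes."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def wish_list_matches(html: str) -> bool:
    """True if the page mentions any watch-list term."""
    text = html.lower()
    return any(term in text for term in WATCH_TERMS)

def extract_city_state(html: str) -> tuple[str, str] | None:
    """Pull the owner's city and state out of the page (pattern is hypothetical)."""
    m = re.search(r"City:\s*([^<]+)<.*?State:\s*([A-Z]{2})", html, re.S)
    return (m.group(1).strip(), m.group(2)) if m else None

def geocode(city: str, state: str) -> tuple[float, float] | None:
    """Convert city/state to lat/long via a geocoding service (endpoint is a stand-in
    for something like Ontok's TIGER/Line-backed lookup)."""
    query = urllib.parse.urlencode({"city": city, "state": state})
    reply = fetch(f"https://geocoder.example.gov/lookup?{query}")
    m = re.search(r"(-?\d+\.\d+),\s*(-?\d+\.\d+)", reply)
    return (float(m.group(1)), float(m.group(2))) if m else None

def scan(list_ids: list[str]) -> list[dict]:
    """Fetch each wish list, keep the matches, and attach coordinates for mapping."""
    hits = []
    for list_id in list_ids:
        html = fetch(f"https://retailer.example.com/wishlist/{list_id}")  # placeholder URL
        if not wish_list_matches(html):
            continue
        location = extract_city_state(html)
        coords = geocode(*location) if location else None
        hits.append({"id": list_id, "location": location, "coords": coords})
    return hits  # each hit can then be dropped onto a map as a marker
```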
In no time, Owad had created a tool that could begin to pinpoint people with certain interests, based entirely on publicly available, freely disclosed information.

While Owad showed how easy it is to build a data-mining tool from services on the Web, his work only hints at the sorts of applications that could be assembled over a network, given an agreed-upon set of standards.

The Organization for the Advancement of Structured Information Standards aims to set those Web services standards. The Web services stack that OASIS is shepherding lets different software components, or services, talk with one another. This approach could let agencies assemble an application (say, an invoice system) from existing components, such as a currency converter in one department and a tax calculation module in another.

Although the phrase "Web services" is not synonymous with service-oriented architecture, using a Web services framework sets the stage for easily sharing components. In fact, thinking of your applications as services that can be loosely coupled together is the first step in developing an SOA. To build the overall framework, you need to:

- Locate what services exist on the network that may be of use to additional users
- Write adapters to wrap these services, so they will be available in a consistent fashion
- Deploy a registry so users and programs can find the components
- Deploy a common communications platform, so the components can function with one another
- Put a process or business rules engine in place to run the composite applications when the user invokes them.

Building an enterprise SOA for composite applications is decidedly more involved than a Google Maps mash-up, but the benefits are slowly becoming more apparent.

One advantage is that once software components are wrapped up as Web services, they can easily be represented in a modeling program. "You can visually model the interaction between a number of services," said Dave Chappell, vice president and chief technology evangelist for Sonic Software Corp. of Bedford, Mass.

"With the right tools, you could do this without a lot of programming," agreed Keith Hurwitz, Microsoft Federal platform strategy adviser. "In the past, business logic and workflow lived in the code. People can design workflows that meet their requirements and take that out of the hands of application programming and make it more part of the work they do every day."

Alan Gibson, product marketing manager for Telelogic AB of Malmo, Sweden, demonstrated how to build a composite application using the company's modeling tool, System Architect. In System Architect, each service is represented by an icon, to which a developer adds low-level communication details, such as the formats and schemas for particular data sets.

After the developer does that prep work, a domain expert pulls the services from a palette and connects them with lines that represent workflow. Once the application is finished, the analyst clicks the "Generate BPEL" command, which produces a script, written in the Business Process Execution Language, that is called upon when someone runs the app.
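Stripped of the XML, a generated workflow amounts to invoking loosely coupled services in the order the analyst drew them. The sketch below illustrates that idea with the invoice example mentioned earlier, composing a currency-conversion service and a tax-calculation service; the endpoints and payload shapes are hypothetical, and a real deployment would hand this sequencing to a BPEL or rules engine rather than hand-written glue code.

```python
"""Minimal sketch of a composite 'invoice' application built from two loosely
coupled services. The service URLs and JSON payloads are hypothetical."""
import json
import urllib.request

def call_service(url: str, payload: dict) -> dict:
    """POST a JSON payload to a service endpoint and return its JSON reply."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))

def process_invoice(amount: float, currency: str) -> dict:
    """Orchestrate the two services in sequence, the way a generated workflow would."""
    # Step 1: one department's currency-conversion service normalizes the amount to dollars.
    converted = call_service(
        "https://finance.example.gov/convert",   # hypothetical endpoint
        {"amount": amount, "from": currency, "to": "USD"},
    )
    # Step 2: another department's tax module computes the tax on the converted amount.
    taxed = call_service(
        "https://revenue.example.gov/tax",       # hypothetical endpoint
        {"amount": converted["amount"]},
    )
    # The composite application only stitches results together; neither service
    # knows anything about the other, which is what keeps them reusable.
    return {"subtotal": converted["amount"], "tax": taxed["tax"],
            "total": converted["amount"] + taxed["tax"]}
```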
The benefits of composite applications could prove valuable in years to come. The approach streamlines what is now an inexact science of specialized application development, said John DeVadoss, who leads the architecture strategy team for Microsoft Corp. Today, agencies may specify what a system should do, but developers might not have a full understanding, which inevitably means they must go back and change the software. Sometimes the back-and-forth becomes so bad that the project gets shelved.

The beauty of an SOA modeling approach is that end users and domain experts can design the workflows themselves, so it's much clearer how a new application should run, said Jan Popkin, senior vice president at Telelogic.

In theory at least, there's no limit to the scale of a composite application; you could even build an enterprise resource planning system from scratch. But past a certain point of complexity, it becomes more efficient to buy a prepackaged application, according to Dennis Nadler, chief technology officer for federal integrator Merlin Technical Solutions of Greenwood Village, Colo.

Nonetheless, enterprise software vendors are taking notice. Oracle Corp., for instance, is basing its next-generation ERP applications on its Fusion middleware, which consists of an application server, business process server, rules engine and developer tools, all based on industry standards, said Ted Farrell, vice president and chief architect.

Other ERP vendors, such as SAP AG and Salesforce.com, offer fine-grained Web services interfaces to their products, letting customers hook into these services with other programs. "We do see applications decomposing," DeVadoss said. "The vendors are still working through what that means to their visions, but it's composition, driven by the end user, that will be the focal point."

Another advantage of the SOA/composite application approach is that it could help agencies avoid being locked in to one vendor's software package. Most orchestration packages save user-generated workflows in one of several plain-text or XML-based languages, such as BPEL, the Business Process Modeling Notation or the eXtensible Business Reporting Language.

All these standards are open, meaning, in theory anyway, that you could take a BPEL workflow created in Oracle and run it with Software AG's workflow engine. This is a major change from the days when an application's configuration was locked into the application itself; if you changed ERP packages, you had to reconfigure the new package from scratch.

Separating the logic from the application also helps organizations adapt more easily when their business processes change. "Who owns the intellectual property? Do you want the logic to reside in the application itself, or at a higher architectural layer?" said Popkin. "The intellectual property [should be] part of the organization."

In many ways, the CIO Council predicted the rise of composite applications in a July 2004 white paper titled "Service Component-Based Architectures," which advocated component-based architectures.

"A component should clearly separate the definition of the services that it provides from the implementation of the services," the paper concluded. In other words, a user of a service should not have to know how that program works, and the component should be defined using industry standards.

Service components include anything from a complete business line, such as the payment service offered by Pay.gov, to a simple run-time application that executes one function and returns the results. The paper also recommended defining services at a high level, so agencies could quantify returns on investment in their business cases and line up with the business services outlined in the Federal Enterprise Architecture's service reference model.
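The white paper's point about separating a service's definition from its implementation maps directly onto ordinary programming practice. Below is a small, hypothetical illustration: consumers depend only on a declared interface, so the implementation behind it (a local calculation today, a shared Web service tomorrow) can change without touching them. The names are invented for the example.

```python
"""Hypothetical illustration of separating a service's definition from its
implementation, in the spirit of the CIO Council paper. All names are invented."""
from typing import Protocol

class PayrollService(Protocol):
    """The service definition: all a consumer ever needs to know."""
    def net_pay(self, gross: float) -> float: ...

class LocalPayroll:
    """One implementation: a simple in-process calculation."""
    def net_pay(self, gross: float) -> float:
        return round(gross * 0.75, 2)  # flat 25% withholding, purely illustrative

class RemotePayroll:
    """Another implementation could call a shared Web service instead;
    consumers written against the definition would never notice the difference."""
    def __init__(self, endpoint: str) -> None:
        self.endpoint = endpoint
    def net_pay(self, gross: float) -> float:
        raise NotImplementedError("would POST to self.endpoint in a real system")

def print_paycheck(service: PayrollService, gross: float) -> None:
    """A consumer coded to the definition, not to any one implementation."""
    print(f"Gross {gross:.2f} -> net {service.net_pay(gross):.2f}")

print_paycheck(LocalPayroll(), 3000.00)  # swapping in RemotePayroll(...) needs no changes here
```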
Pay.gov, run by the Treasury Department, is an example of a unique, cleanly defined service. It acts as a go-between, bridging financial institutions and the agencies that collect fees. Rather than each agency negotiating fees with credit card companies on its own, agencies can use Pay.gov as their collection agent. An agency can accept payments through its own service, then pass the transaction details to Pay.gov, which collects from the credit card company or bank.

"You send us the file, we execute the transaction and the money gets deposited. It's that simple," said Russ Kuehn, program manager for the project.

The composite application approach is also starting to show up in agency pilot programs. In February, the CIO Council's Semantic Interoperability working group held a conference that demonstrated a number of pilots using this SOA approach.

Rohit Agarwal, chief technology officer of Digital Harbor of Reston, Va., showed off a program that tries to detect patterns of money laundering and other fraud by monitoring multiple sources and collating the data through three different operations. Such an application could be of interest to the CIA and other intelligence or law enforcement organizations.

"The problem with fraud is that it doesn't occur through one channel," Agarwal said. "This is basically a composite problem. We're focused more on the relations of the data rather than the data itself, because that is where the pattern is."
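Agarwal's point about relations rather than raw records is easy to picture in miniature: the signal only appears once feeds from separate channels are joined on a shared identifier. The toy sketch below, with entirely invented feeds, field names and threshold, shows the kind of cross-source collation a composite fraud application performs.

```python
"""Toy sketch of cross-channel collation for fraud detection. The feeds,
field names and threshold are invented for illustration."""
from collections import defaultdict

# Imagine each list arriving from a different service: wire transfers from one
# system, cash deposits from another, shell-company registrations from a third.
wires = [{"account": "A-17", "amount": 9500}, {"account": "B-02", "amount": 400}]
deposits = [{"account": "A-17", "amount": 9400}, {"account": "C-11", "amount": 120}]
registrations = [{"account": "A-17", "company": "Blue Harbor LLC"}]

def collate(feeds: dict) -> dict:
    """Group records by account, remembering which channel each came from."""
    by_account = defaultdict(set)
    for channel, feed in feeds.items():
        for record in feed:
            by_account[record["account"]].add(channel)
    return by_account

def suspicious(by_account: dict, min_channels: int = 3) -> list:
    """Flag accounts that appear across several channels; no single feed would flag them."""
    return [acct for acct, channels in by_account.items() if len(channels) >= min_channels]

feeds = {"wires": wires, "deposits": deposits, "registrations": registrations}
print(suspicious(collate(feeds)))  # -> ['A-17']
```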