The new technologies available in Web 2.0 make it easy to combine multiple applications into a single interface.
In the business world, though, none of these cool technologies make sense unless they contribute to the bottom line, and they can't do that without data from your own application. In this article, I'll separate the overall hype about mashup technology from the realities of creating integrated business applications. For example, in the greater Web world, a movement is afoot proclaiming the advent of the "semantic Web." While the idea has perhaps more relevance today than it did when Tim Berners-Lee (the inventor of the WWW) coined the term nearly a decade ago, the magic cloud it promises, where all data is accessible and easy to integrate, still doesn't exist.
Yes, APIs exist--especially within the realm of social software, the Flickrs and MySpaces and YouTubes of the world--that make it very easy to link pictures and videos from these Web sites into your Web pages. And this makes sense; since such services exist only at the whim of their users, they must be as open to the world as possible. But is there really a lot of information freely available on the Web that is going to make sense integrated with your applications? Do you really want your business to depend on something that was posted on YouTube?
At most, you might want to use those applications to get some free storage and bandwidth for pictures and videos, but in the long run, is that really a good idea? If you really want to offload your data, you have to think carefully about whether you want it on Flickr, where it is protected by no contractual obligations, or on a real corporate service such as Akamai, which has been doing exactly this sort of thing for a long time.
Mashups in the Corporate World
So let's skip the whole idea of making the content from your employees' MySpace pages available on your corporate Web site and instead focus on adding value to your business. Mashups can do that in one of three ways: linking multiple applications (combining data from two or more existing corporate sources), adding external formatting (using an external source to format your corporate data), or adding completely new external data.
Linking Multiple Applications
First, you can link together data from multiple applications. As you begin your foray into the world of Web presence, this is perhaps one of the biggest benefits of mashup technology, because it allows you to integrate different Web-enabling strategies. While normally it's all but impossible for a JavaServer Page (JSP) and a PHP-generated page to interact, by using the magic of AJAX, it's actually relatively easy to take data from the JSP, for example, and use it to invoke a PHP page. Why is this important? Because it allows you to pursue multiple technologies. That way, even if your strategic direction is JavaServer Faces (JSF), you can still take advantage of a canned PHP application or an RPG-CGI script.
The way you do that is to segment your application into individual components, each with its own interface. Then, you use data returned by one component of the application to generate the URLs used to invoke other components. You might have an RPG-CGI program that generates a list of hot orders but uses a URL to a quick and dirty PHP script to get the phone number for the salesman on that order.
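The glue between components can be as simple as a few lines of JavaScript. The sketch below is hypothetical (the script name, path, and field names are all invented for illustration): it takes a record from the hot-orders component and builds the URL used to invoke the phone-lookup component.

```javascript
// Hypothetical sketch: the RPG-CGI hot-orders component returns order
// data; we use a field from that data to build the URL that invokes the
// quick-and-dirty PHP phone-lookup script. The script name and field
// names (getPhone.php, salesmanId) are invented for illustration.
function buildPhoneLookupUrl(order) {
  // encodeURIComponent guards against spaces and special characters
  return "/scripts/getPhone.php?salesman=" +
         encodeURIComponent(order.salesmanId);
}

// Data as it might come back from the hot-orders component
var hotOrder = { orderNo: "A1001", salesmanId: "J DOE" };
console.log(buildPhoneLookupUrl(hotOrder));
// logs "/scripts/getPhone.php?salesman=J%20DOE"
```

The point is that neither component knows or cares what technology generated the other; the URL is the entire contract between them.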
Because all of the servers are under the control of your IT department, this often avoids one of the most vexing issues in the mashup world: cross-domain requests. A cross-domain request is one in which the primary page is loaded from one server, but it makes requests for services from a server in another domain. I'll discuss the cross-domain issues in more detail later, but they exist in all the rich-client implementations, including Microsoft's Silverlight and Adobe's Flex.
Adding External Formatting
This is an interesting opportunity to spice up your applications. I like to think of it as a transition option, much like screen scraping. With external formatting, your application sends data to an external server, which processes the data and returns it in another format. While you can write the formatting server yourself, what really makes this exciting is the existence of the Google Chart API, which is actually a simple URL-based Web service. You provide the data in the URL, and Google returns a formatted business graphic.
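To show how little work is involved, here is a small sketch that assembles a Google Chart API URL from your own data. The parameter names (cht for chart type, chs for size, chd for data, chl for labels) come from the service's documented URL scheme; the sample data is invented.

```javascript
// Sketch of the Google Chart API's URL scheme: chart type, size, data,
// and labels are all encoded in one URL, and the response is a finished
// image you can drop straight into an IMG tag.
function chartUrl(type, width, height, values, labels) {
  return "http://chart.apis.google.com/chart" +
         "?cht=" + type +                         // chart type (p3 = 3D pie)
         "&chs=" + width + "x" + height +         // image size in pixels
         "&chd=t:" + values.join(",") +           // simple text data encoding
         "&chl=" + labels.map(encodeURIComponent).join("|"); // slice labels
}

// A 3D pie chart of (made-up) orders by region
var url = chartUrl("p3", 250, 100, [60, 40], ["East", "West"]);
console.log(url);
// logs "http://chart.apis.google.com/chart?cht=p3&chs=250x100&chd=t:60,40&chl=East|West"
```

Your server-side program only has to emit that string as the SRC of an image tag; no graphics library, no architecture commitment.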
While the service is not perfect, it allows you to embed business graphics into your applications almost immediately, without locking you into a specific architecture long-term. I stress that because using a service has some negatives. Foremost, of course, is the fact that you're sending your data out to an outside service; this raises both security and performance concerns. The other issue is cost: while the service is free for now, Google can change that at any time.
However, this does give you a quick and dirty way to introduce the concept of business graphics to your Web audience to see its applicability and to gauge whether it makes sense to pursue the technology. Once you've done that, literally dozens of other options exist for embedding locally generated graphics into your application, thereby bypassing the issues raised above. However, each of those options has its own issues; using an external service like this buys you the time to do the due diligence required to select the one most appropriate for your business.
Adding External Data
This option gets the most air time, I think, because it has a high "Wow!" factor, especially among the bloggers and twitterers. And don't get me wrong: I don't deny the ubiquitous nature of the Web's social networking attributes. I just want this article to draw a very distinct line between social software and business software. This is not to say that the two never merge, but instead to say business decisions based on social networking phenomena tend not to be the best decisions. Don't believe me? Ask some of the major players how much money they've lost advertising in Second Life.
So, when I talk about adding external data, I want to focus specifically on the data that adds value to a business application, as opposed to gadgets that make an application more socially friendly, like a "Word of the Day" icon.
To me, it's hard to identify external data that makes sense to add. The single biggest exception is of course Google Maps, which is why that particular capability is so popular. Every business can make good use of an icon that pops up a map with directions to a specific location. Any brick-and-mortar store needs a good store finder; the term "store locator" is pretty much a common part of today's online lexicon. And of course certain businesses, such as real estate, can no longer compete without this sort of technology.
But really, it's hard for me to identify external data that might actually be important to my users but that my company doesn't provide. Maybe you know of some situations that I'm just not seeing; I'd love to have an ongoing discussion about this particular topic because I think it's fundamental to understanding the requirements of the next generation of business applications.
And while I don't see a lot of generic Web services that make sense for every business, that doesn't mean these external services have no use for any business. As you get to specific businesses or job descriptions, you can probably find some services that provide some benefit. Your road warriors, for example, need icons that can give them updated weather information and flight status. These services can come from free providers like EJSE or from for-fee services such as FlightStats (technically, FlightStats has both free and fee services). My biggest concern with using any of these services is that you must design your application to handle the situation when that service is not available.
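That last concern deserves a concrete shape. A minimal defensive pattern (sketched here with a simulated service, since the real feed and its API are beside the point) is to race the external request against a timer and substitute a canned fallback when the service is slow or dead, so the outage never hangs your own page.

```javascript
// Minimal sketch of defensive design around an external service: race
// the request against a timer and resolve with a canned fallback if the
// service never answers. The "service" below is simulated.
function withFallback(request, ms, fallback) {
  var timer = new Promise(function (resolve) {
    setTimeout(function () { resolve(fallback); }, ms);
  });
  return Promise.race([request, timer]);
}

// Simulated external flight-status feed that never responds
var deadService = new Promise(function () {});

withFallback(deadService, 100, "flight status unavailable")
  .then(function (result) { console.log(result); });
// logs "flight status unavailable" after 100ms
```

The same wrapper works whether the underlying request is an AJAX call, a JSONP fetch, or a call through your own proxy.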
A Technical Caveat
The final caveat to this whole discussion is a highly technical one. Both the problem and its potential solutions exist way down in the bowels of the technical details of HTTP transactions, but they need to be discussed. The generic term for the problem is "cross-domain requests," and it has to do with the idea of running an application on one domain (the domain being the part that immediately follows "http://" in a URL, such as http://www.mcpressonline.com/ or http://www.plutabrothers.com/) that in turn requests data from another domain.
In the simpler world of pure HTML, this wasn't much of an issue; even if you did ask for data from another server, you would typically just be downloading images. Images rarely caused problems, and in any case, HTML-based cross-domain requests are allowed by browsers. But one of the primary restrictions on AJAX is that the underlying infrastructure (namely the XMLHttpRequest object) is hard-limited in most browsers to disallow cross-domain requests. I don't know the exact reasoning behind this, but I think it's because the data from such a request is almost always used to change the contents of the page, and such changes could allow evildoers to cause your page to do very bad things, like stealing your contacts list, trashing your hard drive, or even turning your computer into a mindless zombie that follows whatever orders its nefarious master sends.
Sorry, got a little carried away there. But it's not really a stretch. The state of browser software is such that nearly any decent hacker can throw together a script that can attack and obliterate an improperly secured computer. The trick is how to secure the computer. One option is to disallow any access to the outside world. This is the realm of the firewall and is the reason that most cross-domain requests are disallowed by today's browsers. In fact, if you try to send an AJAX request to a server other than your own, it will not succeed.
This is not just an AJAX issue, though. All of the thick-client environments, such as Silverlight and Flash, have their own security issues and their own ways of dealing with them. The usual approach is a type of whitelist, in which developers identify trusted sites whose content can be accessed safely. This policy file, of course, is a single point of security risk and thus needs tight controls, the type of controls that make developers crazy when trying to develop new applications.
Of course, the entire idea of a proprietary thick-client environment on the browser is another one that needs to be addressed by your corporation, and I don't intend to go down that particular road in this article. Leaving aside the question of whether proprietary tokenization causes problems for open Web semantics, two other methods exist for getting around the cross-domain problem.
The Server Proxy
This idea has actually been around since the beginning of the Web, certainly since corporate workstations have needed (limited) access to the Internet. A server proxy is a device that intercepts requests from the clients on your network and processes them itself. It forwards the request to the target machine, receives the response, and then sends it back to the client.
While in theory the concept is very simple and very powerful, there are a number of real issues, especially when dealing with persistent connections. But if you can overcome the technical hurdles, server proxies provide a wide range of benefits above and beyond cross-domain requests, including caching (increased performance) and auditing (enhanced security). The downside is that each request goes through another layer, and that can reduce performance. So implementing a server proxy correctly is a little more complex than just sticking another machine on the network, which puts it squarely into the business-decision category, requiring the appropriate research.
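The routing logic at the heart of such a proxy can be sketched in a few lines. In this hypothetical setup, the browser makes an ordinary same-origin request to a path like /proxy/flights/..., and the proxy maps it onto the real external host; the host names and path convention are invented for illustration.

```javascript
// Hypothetical routing table listing the external hosts this proxy is
// willing to forward to. Because anything not listed is refused, the
// table doubles as the proxy's security whitelist.
var allowedHosts = {
  flights: "www.flightstats.example",
  weather: "weather.example"
};

// Map an incoming same-origin path such as "/proxy/flights/status?num=123"
// onto the real external URL, or return null if the service isn't listed.
function resolveProxyTarget(path) {
  var match = /^\/proxy\/([^\/]+)(\/.*)$/.exec(path);
  if (!match || !allowedHosts[match[1]]) return null;
  return "http://" + allowedHosts[match[1]] + match[2];
}
```

The browser never sees a foreign domain, so the cross-domain restriction simply never triggers; everything else (forwarding the request, caching, auditing) hangs off this one mapping.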
Dynamic SCRIPT Requests
I bring this up because it has become a hot new way of getting around the cross-domain script issue. I put "new" in quotes because, although the technique is getting a bit of press these days, the idea of script injection itself goes back several years.
Remember that I said that standard HTML elements like the IMG tag don't have the same cross-domain limitations as the XMLHttpRequest object, which underlies AJAX. That's because the browser just doesn't check those sorts of things. Well, that same "feature" is available for the SCRIPT tag, which thus allows you to execute a script from any machine in the world. As large a hole as that might seem, it's been relatively obscure for some time now, but with the growing use of AJAX (and the growing developer angst over the cross-domain problem), the dynamic SCRIPT request is seeing more usage.
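The mechanics are worth seeing. In the sketch below (the endpoint and parameter names are invented, and the external service must agree to wrap its response in the named callback function), the page builds a URL carrying a callback name, then injects a SCRIPT tag pointing at it; the returned script simply calls that function with the data.

```javascript
// Sketch of the dynamic-SCRIPT request pattern. The endpoint and
// parameters are hypothetical; the remote service must cooperate by
// returning "jsonpCallback1({...data...});" as its response body.
var jsonpCounter = 0;

function buildJsonpRequest(endpoint, params) {
  var cbName = "jsonpCallback" + (++jsonpCounter); // unique per request
  var query = [];
  for (var key in params) {
    query.push(key + "=" + encodeURIComponent(params[key]));
  }
  query.push("callback=" + cbName); // tell the service which function to call
  return { callbackName: cbName, url: endpoint + "?" + query.join("&") };
}

var req = buildJsonpRequest("http://feeds.example.com/weather", { zip: "60606" });
console.log(req.url);
// logs "http://feeds.example.com/weather?zip=60606&callback=jsonpCallback1"

// In a browser, you would then register the callback and inject the tag:
//   window[req.callbackName] = function (data) { /* use the data */ };
//   var s = document.createElement("script");
//   s.src = req.url;            // no cross-domain check happens here
//   document.body.appendChild(s);
```

Notice what that last step really means: you are handing the remote server a blank check to run whatever code it likes in your page.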
Just imagine the security problems that have been posed by SQL injection, and then realize that this is worse: it's actually executing arbitrary programs on your workstation. I can see auditors clutching their chests and dropping like flies....
That's it for mashups. I think they're a very positive force in today's world. I'm secretly thrilled that mashups don't require the heavy infrastructure of portal technology, because they allow you to very quickly prototype your next generation of applications. And even though there are some caveats, they're not insurmountable, so now it's up to you to figure out how to best apply this technology to your bottom line.