Application modernization. Everyone is talking about it. You can't page through IT magazines, surf application development Web sites, or even peruse your inbox without being bombarded with messages on why your applications are out of date and what perils await if you don't modernize as soon as possible. It's easy for us, as application developers, to place the blame squarely on the limitations of our legacy applications when problems arise, especially if we didn't author parts of the source code. It's hard for us, as application developers, to look in the mirror and question whether the limitations of our development environment are partially to blame as well. An outdated legacy development environment can stop a company from streamlining its business and achieving its potential just as easily as outdated legacy applications.
Using an outdated development environment creates shortcomings that are often overlooked, misconstrued, or simply swept under the rug. For example, companies have greater difficulty finding and recruiting new application developers to maintain their legacy systems, application source code becomes more restrictive and less maintainable for the current staff, and developer productivity decreases with each passing year. These problems are typically contained within IT and rarely become visible outside of it until one of two situations occurs:
- Situation One: IT cannot resolve a critical business requirement because of the limitations within the development environment. This situation has both a positive and a negative side. On the negative side, the situation reflects poorly on IT, not resolving the requirement has financial implications for the company, and you are now under extreme pressure to quickly resolve the issue. On the positive side, getting approval and funding for a new development environment becomes easier because you can justify the purchase and its return on investment. Unfortunately, the negative side definitely outweighs the positive side.
- Situation Two: IT is proactive and acknowledges the need for a new development environment before situation one happens. For companies to meet the demands of their business, programmers need to spend all of their bandwidth improving or fixing applications. Developers have a finite amount of bandwidth that can be spent on application development, so every minute spent struggling with limitations, creating workarounds, or learning new languages is time not spent addressing their company's needs. The positive side of this situation is that the forward thinking reflects positively on IT and you are not under extreme pressure to make a quick decision. But on the negative side, approval and funding gets harder unless you can bundle it with a critical business issue, like an upcoming project or technology requirement.
Regardless of which situation finally moves your company to upgrade or replace its development environment, be aware that many critical items require careful consideration during the selection process.
Caution! Think Before You Leap
Selecting a new development environment is no easy task; it's a complex decision that will have an immediate and lasting effect on your company. Knowing what to look for in a development environment is almost as important as the decision itself. The most important steps of any technology selection process are to understand the current state of your enterprise, gather your technology requirements, and establish your long-term development needs. Skip these steps, and you end up repeatedly purchasing small niche tools that each satisfy a single technology requirement. Before you know it, the accumulation of these niche tools spirals out of control: the number of required skill sets grows, and the complexity of application development rises to an unmanageable level. This article assumes you've already done the due diligence of requirement gathering and focuses on identifying the key features of a productive development environment and why they matter to you.
One of the most important features to put on your evaluation checklist is the availability of a high-level language (HLL), sometimes referred to as a 4GL. According to the Aberdeen Group, companies looking to modernize their applications need to investigate high-power software generators that provide excellent programmer productivity while generating the target platform's native Java or .NET low-level code.1 Software generators, not to be confused with code generators, provide a single development environment and an HLL to generate all the underlying native 3GL code for your applications. Since HLLs can generate anywhere from 20 to 40 lines of native 3GL code per single line of HLL code, programs that used to be 30,000 lines can now be written in 1,000 lines. Developers maintain the source code produced by code generators whereas developers rarely see the generated code produced by a software generator.
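The expansion ratio is easy to picture with a toy sketch. The following Python snippet is purely illustrative (the HLL syntax and the Java-style output are hypothetical, not taken from any real product): a one-line high-level field declaration expands into a block of verbose lower-level code that, with a software generator, developers would never see or maintain by hand.

```python
# Toy "software generator": expand one hypothetical HLL field declaration
# into many lines of 3GL-style code, mimicking the 20-to-40x expansion
# ratio described above.

def generate_3gl(hll_line: str) -> str:
    """Expand a one-line HLL field declaration into verbose 3GL-style code."""
    parts = hll_line.split()
    name, ftype, length = parts[1], parts[3], int(parts[5])
    required = "REQUIRED" in parts
    lines = [
        f"// --- generated accessors and validation for {name} ---",
        f"private String {name};",
        f"public String get{name}() {{ return {name}; }}",
        f"public void set{name}(String value) {{",
        "    if (value == null) {",
        (f"        throw new IllegalArgumentException(\"{name} missing\");"
         if required else "        value = \"\";"),
        "    }",
        f"    if (value.length() > {length}) {{",
        f"        throw new IllegalArgumentException(\"{name} too long\");",
        "    }",
    ]
    if ftype == "numeric":
        lines += [
            "    if (!value.matches(\"\\\\d+\")) {",
            f"        throw new IllegalArgumentException(\"{name} not numeric\");",
            "    }",
        ]
    lines += [f"    this.{name} = value;", "}"]
    return "\n".join(lines)

generated = generate_3gl("FIELD CustomerId TYPE numeric LEN 7 REQUIRED")
print(f"1 HLL line expanded to {len(generated.splitlines())} generated lines")
```

A real software generator does this for entire programs, which is how a 30,000-line application shrinks to roughly 1,000 lines of HLL source.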
Generally, 3GLs are not productive languages by nature because they are very granular, and developers usually need to know more than one 3GL to be productive in their positions. In fact, Java.net, Sun's Web site dedicated to Java developers, asked developers, "How many programming languages do you use in addition to Java?" Roughly 50 percent of the Java developers said they use one or two additional languages, 20 percent said they use three to five additional languages, and 5 percent said they use more than five additional languages besides Java. Added together, that means 75 percent of Java developers need at least one additional 3GL to get their work done. Remember, time spent learning additional languages and technologies is time not spent on application development.
Another benefit of HLLs is that they tend to generate native application source code for multiple interfaces and multiple server platforms, which is important because it's rare to find a homogeneous System i organization these days. Even by 2003, approximately 75 percent of OS/400 shops had Windows servers of some sort running at their sites. In the past, developing for heterogeneous computing environments meant maintaining separate development camps, each specializing in a particular technology. With the adoption of a multi-platform HLL, this expensive practice can fall by the wayside.
Using a high-level language is just one way to increase programmer productivity; many other features can aid application development as well. One example is repository-based development environments, because they maximize code reuse. In an article recently posted on IBM's developerWorks Web site, IBM's resource for developers, Gilles Dodinet states that you can "boost development effort through repository-driven engineering" and that "data-intensive applications lend themselves to a repository approach." But buyer beware; some solutions claim to have a repository when in actuality all they have is a data dictionary. A true repository plays a much more active role in application development than a data dictionary, hence the term "active repository."
So what differentiates an active repository from a data dictionary? An active repository stores the same metadata found in a data dictionary but also contains the data validation capabilities of a rules-based engine and provides many more features, such as field visualization, field-level help text, and multilingual support, to name a few. Once business rules and validation logic have been associated with the fields and file definitions stored in the repository, the business rules are automatically exposed to every program, regardless of interface or platform, without having to write any code to enforce the validations. Under the covers, there are essentially two deployment strategies for repository-based business rules: some repository solutions create external programs that encapsulate the validation logic in reusable compiled objects, while others insert the business logic directly into the application source code at assembly and compile time. Although both deployment models greatly reduce the time spent maintaining applications, the compiled-object approach is better because only one object needs to be recompiled when a business rule changes. The approach that inserts the validation logic into the source code requires every program that uses the rule to be recompiled.
Pay close attention to the developer workbench and how it interacts with the active repository. Powerful IDEs empower developers to accomplish more in less time. Begin by verifying that the IDE has change management capabilities that enable developers to check out source code, lock it while they make their edits, and then check it back in to the change management system for versioning and rollout. Joined at the hip with change management is the ability to support both team and offline development. These features permit developers to program without being connected to the change management system and automatically broadcast all changes to the developers who need to know about them. The shortest distance between two points is a straight line, so make sure the IDE provides a holistic view of your environment; all related objects should be no more than a mouse click away from each other. Last but not least, pay close attention to the source editor, since this is where developers spend most of their day. Good source editors are intuitive and easy to learn. Look for features like IntelliSense (auto-complete), prompting for command parameters, and inline command-level documentation that speed up development and the process of finding information.
The global economy runs on legacy systems, both software and hardware, that represent hundreds of billions of dollars in investments that enterprises have made over decades. Interoperability is inescapable in today's ecosystem of multiple platforms, multiple interfaces, and multiple databases running within a single organization. In addition, communicating with trading partners is an essential part of streamlining business processes, and integration with third-party technologies is available to deliver functionality you don't have time to build yourself. Unfortunately, application integration is a major IT headache and consumes about 40 percent of the typical IT budget, according to recent Aberdeen research.
How can companies address this IT headache? Use a development environment that easily enables the communication of common data formats via common transmission protocols. Highly productive development environments offer business process management (BPM) technology that allows companies to use drag-and-drop workflow wizards to address their integration requirements, shielding developers from having to work directly with the standard transmission protocols (HTTP, FTP, SMTP, etc.) and map data in and out of common file formats (XML, CSV, TSV, etc.). Ideally, integration with disparate platforms and applications should be done via Web services.
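To see the kind of plumbing such a tool hides, here is a hedged Python sketch (the feed layout and field names are hypothetical) of one common mapping task: converting a CSV order feed into XML for a trading partner. A BPM workflow wizard would generate the equivalent of this behind a drag-and-drop interface.

```python
# Hand-written version of a mapping a BPM wizard would generate:
# CSV order feed in, XML document out, using only the standard library.
import csv
import io
import xml.etree.ElementTree as ET

def csv_orders_to_xml(csv_text: str) -> str:
    """Convert a CSV order feed into an XML document for a trading partner."""
    root = ET.Element("Orders")
    for row in csv.DictReader(io.StringIO(csv_text)):
        order = ET.SubElement(root, "Order", id=row["order_id"])
        ET.SubElement(order, "Customer").text = row["customer"]
        ET.SubElement(order, "Total").text = row["total"]
    return ET.tostring(root, encoding="unicode")

feed = "order_id,customer,total\n1001,Acme Corp,250.00\n1002,Globex,99.95\n"
xml_doc = csv_orders_to_xml(feed)
print(xml_doc)
# The transmission step (HTTP, FTP, or SMTP) would follow, using whatever
# transport the integration layer provides.
```

Multiply this by every feed, format, and partner an enterprise handles and the 40 percent integration figure becomes easy to believe.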
SOA may not be part of your short-term strategy, but Web services must be. According to Mike Gilpin at Forrester Research, "Web services technology—standards-based Internet middleware—promises to deliver more flexible integration more easily across more internal applications and external partners. Many firms stand to benefit by implementing Web services." The ability to create and consume Web services is important, but being able to create them quickly and easily is key. Aberdeen recommends that companies "consider using an automated solution vendor to complete this process quickly, and with less risk and less cost than a manual conversion."
Recently, interoperability has become a hot topic because of the increasing popularity of composite applications. According to Aberdeen, "composite applications contain logic and data collected from multiple IT sources and harnessed with Web services standards such as XML, SOAP, and WS-*. These applications are rapidly becoming the development standard of choice in all IT organizations." A development environment is worth its weight in gold if it can shield your developers from having to learn the Web services languages and standards like SOAP, XML, UDDI, and WSDL.
A development environment's deployment architecture speaks volumes about its flexibility and adaptability. Today, well-architected applications are designed as n-tier because that allows each tier to be isolated for scaling, performance, and reuse. With n-tier architectures, additional features or changes in one tier can be deployed without modifying the other tiers and redeploying the whole application. For example, a company could change the application's interface (presentation tier) and databases (data tier) without updating or redeploying the business logic and application logic tiers.
Most modern-day 3GL development environments do allow companies to write n-tier applications. But developers are required to have an in-depth understanding of this architecture and manually build and link all those tiers together, which is not a trivial task. Bipin Joshi, a Microsoft MVP (ASP.NET) and a member of ASPInsiders, says, "The downside of n-tier architecture is that you need to create many isolated classes and pieces of software. However, benefits of n-tier applications will far outweigh its disadvantage." Ideally, you should select a development environment that, under the covers, creates n-tier applications for you. When the development environment takes care of creating and linking the tiers together, developers can focus on business issues instead of architecture and design.
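As a rough illustration of that separation, here is a minimal Python sketch (all class and method names are hypothetical) of the three classic tiers. Because each tier talks only to the one below it, the presentation tier can be swapped, say from a 5250 screen to a Web page, without touching the business or data tiers.

```python
# Minimal n-tier sketch: each tier depends only on the tier beneath it.

class DataTier:
    """Data access: could be DB2, SQL Server, etc.; here an in-memory dict."""
    def __init__(self):
        self._customers = {1: {"name": "Acme Corp", "balance": 1200.0}}

    def get_customer(self, cust_id):
        return self._customers[cust_id]

class BusinessTier:
    """Business rules: unchanged whether the UI is 5250, Web, or mobile."""
    def __init__(self, data: DataTier):
        self._data = data

    def credit_available(self, cust_id, limit=5000.0):
        return limit - self._data.get_customer(cust_id)["balance"]

class TextPresentationTier:
    """One of possibly many presentation tiers over the same business tier."""
    def __init__(self, business: BusinessTier):
        self._business = business

    def show_credit(self, cust_id):
        return f"Available credit: {self._business.credit_available(cust_id):.2f}"

app = TextPresentationTier(BusinessTier(DataTier()))
print(app.show_credit(1))  # Available credit: 3800.00
```

Even in this toy example, the wiring between tiers is boilerplate; an environment that generates and links the tiers automatically removes exactly that kind of work.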
Flexibility and Adaptability Are Key
Remember, the future performance of your company relies heavily on the flexibility and the adaptability of its core systems. Your development environment plays an extremely large role in the viability of these applications. Retooling with a development environment without a high-level language, an active repository, interoperability, and a flexible architecture may put you into the same situation all over again—and sooner than you think. To ensure this doesn't happen to your company, here are a few guidelines to jump start your development needs checklist:
- Make sure the development environment provides an HLL that generates native 3GL for all the platforms and interfaces your company requires, and don't forget RPG and 5250! Ideally, you want to use only one language for all application development.
- Verify that you never have to leave the development environment for editing or debugging purposes. In essence, make sure you're purchasing a true development environment, not just a code generator.
- Does the solution have an active repository for maximizing code reuse and reducing application maintenance? If so, make sure it encapsulates the business rules into external objects in lieu of inserting the rules into the application source code.
- Ask if application wizards or templates are provided; these help speed development and enforce company standards.
- Confirm that the source editor has all of today's modern features and is set up for team development.
- Does the architecture support n-tier? More importantly, make sure that most of the tiers are automatically created so developers do not have to know the intricate details of object-oriented development and n-tier architecture.
- Verify that the development environment you're about to adopt has a long history of staying abreast of technology and insulating developers from technology shifts. This is the only way your application can flex and adapt to changing business requirements without increasing the size of your development staff and the number of skill sets.