
Weaving WebSphere: The Object of Persistence--JDBC, JDO, and EJB

"By object is meant some element in the complex whole that is defined in abstraction from the whole of which it is a distinction."
--John Dewey

A little heavier of a quote than usual, eh? I've reached back a century to one of the great original thinkers of the period, John Dewey, professor emeritus at Columbia University and one of the founders of the school of pragmatism. Dewey was discussing something a bit more basic than databases; he was talking about the basic concept of how we as humans think. But even so, his rather pithy definition is important to the subject of this column--namely, the persistence of data objects.

In this column, I'm going to introduce Java Database Connectivity (JDBC), Enterprise JavaBeans (EJB), and the newest player on the Java data persistence stage, Java Data Objects (JDO). I'll compare and contrast them, and I'll comment on their relevance to today's applications.

The Problem--Database vs. Objects

What we're really talking about here is the ability to map the concept of an object (a "distinct element") to the relational database (the "complex whole" upon which all enterprise applications are built).

The database is a relatively simple concept: A file defines a set of fields, the file contains records, and each record holds its own values for those fields. Records are usually related to one another through keys. Note, however, that I said "relatively" simple--some of the concepts of database design, such as normalization, can be very tricky. And frankly, most legacy databases have a feature or two that don't lend themselves very nicely to the more rigorous requirements of an object model. For example, in some databases, data is soft-defined within the records. This plays havoc not only with objects, but with SQL in general.

But even with a nice, normalized database, there are issues when you try to map that data to an object model. For example, what exactly is an "Item"? Is it the fields found in the Item Master file? What if you have warehouse-level overrides? Prices? Deals and promotions? Sales history? And what about the inventory for the item? Also, there are times when you need to access data in objects in a particular sequence or subset--for example, all the orders for a given item. While easy enough to do with a relational database, it's not quite as simple in an object environment.

It is this dichotomy between objects and relational data that has been the most persistent hurdle to using object-oriented languages in the development of robust enterprise applications.

The Contenders--JDBC and EJB

Those of you who know me and have read my various publications and postings--including my last column, "Weaving WebSphere: JTOpen--The Right Tools for the Job"--know very well that I am a big fan of the Record Level Access (RLA) classes available from the JTOpen project. However, RLA is not really appropriate for this part of the discussion; I'll return to it a bit later. For now, I'd like to talk about the top two non-RLA contenders, JDBC and EJB.

JDBC

JDBC is the very thin Java API that sits directly on top of the SQL call-level interface (CLI). With JDBC, you directly access the data in your relational database using standard SQL syntax. You retrieve data via queries that return result sets--collections of rows that contain the same columns. Because JDBC is so tightly coupled to the very powerful capabilities of SQL, you can perform incredibly complex queries on your data and retrieve large amounts of information very quickly. For example:

SELECT ITEM, SUM(ONHAND*LISTPRICE) FROM ITEMMASTER, INVENTORY WHERE ITEM = INVITEM GROUP BY ITEM


Even without knowing a lot about the database, you might still be able to intuit that this statement will calculate the inventory valuation at list price of all items in inventory. Of course, this is a very simple example and doesn't reflect the complexities of a real enterprise database, but it shows that JDBC, via its SQL syntax, can perform very powerful data mining functions.
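
Just to make that concrete, here's a rough sketch of what running that statement through JDBC looks like. I'm assuming the JTOpen Toolbox driver here, and the system name, user ID, and password are placeholders; substitute whatever driver and connection information fits your environment.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class InventoryValuation {
    public static void main(String[] args) throws Exception {
        // Register the driver; this assumes the JTOpen Toolbox driver, and
        // "mysystem", "user", and "password" are placeholders.
        Class.forName("com.ibm.as400.access.AS400JDBCDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:as400://mysystem", "user", "password");
        Statement stmt = conn.createStatement();
        ResultSet rs = stmt.executeQuery(
                "SELECT ITEM, SUM(ONHAND*LISTPRICE) FROM ITEMMASTER, INVENTORY "
                + "WHERE ITEM = INVITEM GROUP BY ITEM");
        while (rs.next()) {
            // Column 1 is the item number, column 2 the valuation at list price
            System.out.println(rs.getString(1) + ": " + rs.getBigDecimal(2));
        }
        rs.close();
        stmt.close();
        conn.close();
    }
}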

This same flexibility also applies to set-level updates of databases--the kind typically performed at month-end to reset opening balances or to create summary files. However, JDBC is not quite as good when it comes to dealing with individual records the way we typically do in business applications. (RLA is really the winner here, but let's leave that for now.)

No matter what, JDBC is heavily tied to the database and doesn't deal with objects. It's really just a direct map of the database. This tight relationship is both good and bad news. The good news is that it's relatively easy for RPG or COBOL programmers, especially those with SQL experience, to learn to use JDBC. The bad news is that these programmers tend to think in terms of databases; they rarely think in terms of objects. JDBC is often interspersed throughout business applications, wherever programmers find that they need a bit of information--similar to adding a CHAIN to a file. I've even seen JDBC calls directly embedded into JavaServer Pages (JSPs), which is pretty much the direct opposite of trying to separate business logic and user interface. This tends to make the application highly dependent on the database and makes it very difficult to implement database changes.

EJB

On the other end of the spectrum is the EJB architecture. The EJB concept is object-centric, in that everything is built around the object, which in EJB terminology is called a bean. The original architectural purpose was to allow an application on one machine to use objects on another machine, thereby replacing the Common Object Request Broker Architecture (CORBA) and removing its complexity. Rather than write the code required to build an object, the programmer simply calls a factory method to create a bean and has no idea where it comes from (or where it's going if the application updates it). At some point, the designers recognized that this concept could also be applied to the database: The same level of indirection that hides the creation of objects from the application programmer could also be used to hide the database interface.

The EJB architecture supports a variety of bean types, but the one we're interested in for this discussion is the entity bean. An entity bean is, in Sun's own terms, tied directly to a file: "Typically, each entity bean has an underlying table in a relational database, and each instance of the bean corresponds to a row in that table."

Entity beans have a very sophisticated multi-tiered architecture. For each business object class, you must define the entity bean class itself, a remote interface for the business methods on an individual bean (such as updating or deleting it), and a home interface that deals with the beans as a set (such as creating a new bean or finding an existing one).
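
To put some code to that, here's a hedged sketch of the two interfaces for a hypothetical Item entity bean (the names are mine, not from any particular application, and each interface would normally go in its own source file with its own imports). The bean class itself, which implements javax.ejb.EntityBean, is the third piece; I'll come back to it in a moment.

import java.math.BigDecimal;
import java.rmi.RemoteException;
import javax.ejb.CreateException;
import javax.ejb.EJBHome;
import javax.ejb.EJBObject;
import javax.ejb.FinderException;

// Item.java -- the remote interface: business methods on one Item bean
public interface Item extends EJBObject {
    String getDescription() throws RemoteException;
    BigDecimal getListPrice() throws RemoteException;
    void setListPrice(BigDecimal price) throws RemoteException;
}

// ItemHome.java -- the home interface: methods that deal with Items as a set
public interface ItemHome extends EJBHome {
    Item create(String itemNumber) throws CreateException, RemoteException;
    Item findByPrimaryKey(String itemNumber) throws FinderException, RemoteException;
}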

The complexity doesn't stop there, however. There are two flavors of entity beans, based on the way that persistence is handled: bean-managed and container-managed. With bean-managed persistence (EJB BMP), you must write the code to store and retrieve data from the database yourself, while with container-managed persistence (EJB CMP), the container handles all of it. The interfaces for the two types are completely different, and selecting one or the other has huge ramifications for your application. In short, bean-managed persistence gives you control over the database mapping and thus is better suited to legacy databases, while container-managed persistence is more portable but gives complete control of the database to the container and thus is more appropriate for new applications.
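
Here's what that difference looks like in practice--a hedged sketch of the bean class for the hypothetical Item bean above, first with bean-managed persistence and then with container-managed persistence. The table, column, and JNDI names are placeholders, and most of the required EntityBean callbacks are omitted to keep the sketch short.

import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.ejb.EJBException;
import javax.ejb.EntityBean;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.sql.DataSource;

// Bean-managed persistence: you write every database access yourself.
// (Declared abstract only to keep the sketch short; ejbStore, ejbActivate,
// and the other required EntityBean callbacks are omitted.)
public abstract class ItemBean implements EntityBean {
    private String itemNumber;   // set from the primary key in ejbCreate/ejbActivate
    private BigDecimal listPrice;

    public void ejbLoad() {
        try {
            Connection conn = getDataSource().getConnection();
            PreparedStatement ps = conn.prepareStatement(
                    "SELECT LISTPRICE FROM ITEMMASTER WHERE ITEM = ?");
            ps.setString(1, itemNumber);
            ResultSet rs = ps.executeQuery();
            if (rs.next()) {
                listPrice = rs.getBigDecimal(1);
            }
            rs.close();
            ps.close();
            conn.close();
        } catch (SQLException e) {
            throw new EJBException(e);
        }
    }

    private DataSource getDataSource() {
        try {
            // The JNDI name is a placeholder for whatever your container defines
            return (DataSource) new InitialContext().lookup("java:comp/env/jdbc/ItemDB");
        } catch (NamingException e) {
            throw new EJBException(e);
        }
    }
}

// Container-managed persistence (EJB 2.0 style): no SQL at all. The container
// generates the data access, and the mapping to actual tables and columns
// lives in the deployment descriptor--completely out of your hands.
abstract class ItemCmpBean implements EntityBean {
    public abstract String getItemNumber();
    public abstract void setItemNumber(String itemNumber);
    public abstract BigDecimal getListPrice();
    public abstract void setListPrice(BigDecimal listPrice);
}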

I won't be shy here: I think that EJBs are over-engineered. Object-oriented languages have always treated relational data as an oversight, and in their attempt to use a single architecture to support both distributed processing and database interface, the EJB designers have created an architecture that supports neither particularly well. To use EJB BMP, you basically have to write all the database logic yourself, thereby adding all the baggage of EJB. With EJB CMP, you lose any control over the database, a situation that is totally unsuitable for legacy databases. I suppose you could write your own container management code, but then you'd be writing the EJB BMP logic as well as the additional code required for the CMP architecture.

The New Kid--JDO

So you find yourself facing a dilemma. You can use JDBC, which can lead to spaghetti code and a lot of time spent tuning your JDBC connections to get good performance, or you can use EJB. Of course, if you have legacy databases and you use EJB, you end up writing all that code in JDBC anyway, except now you carry all the overhead of EJB as well.

Neither of these approaches answers the fundamental question: How do you map a relational database to an object architecture? As previously stated, JDBC is too close to the database, and EJB is too far from it. Is there some middle ground?

JDO

Well, at first glance, Java Data Objects (JDO) technology looks like it might fill that role. A relatively new technology (the specification was just released in March 2002), JDO is essentially a mapping layer that attempts to define the relationship between an object class and the underlying relational data. The JDO mapping is specified by an XML document (what isn't these days?) that identifies the characteristics of the fields in the database that are used to populate the attributes of the class.

An example of a simple JDO definition is provided below:

<?xml version="1.0" encoding="UTF-8"?>
<jdo>
  <package name="com.xyz.hr">
    <class name="Employee" identity-type="application"
           objectid-class="EmployeeKey">
      <field name="name" null-value="exception"/>
      <field name="salary" default-fetch-group="true"/>
      <field name="dept"/>
      <field name="boss"/>
    </class>
    <class name="Department" identity-type="application"
           objectid-class="DepartmentKey">
      <field name="name" null-value="exception"/>
      <field name="emps">
        <collection element-type="Employee"/>
      </field>
    </class>
  </package>
</jdo>

This definition identifies two classes: the Employee class and the Department class. The Employee class has four fields: name, salary, dept, and boss. The Department class has two fields: name and emps. The emps field is defined as a collection of Employee objects. This is quite a bit of code to define what amounts to a trivial database that could be implemented using only a single file (there is no intrinsic data associated with the "Department" class; it's simply a collection of Employees with the same department name). And this XML document is, in effect, a completely new language that must be written and maintained in addition to the rest of your application code.
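
For reference, the Java classes behind that metadata are perfectly ordinary; something along these lines (the field types are my own guesses--the XML, not the source, is where the persistence knowledge lives):

package com.xyz.hr;

import java.math.BigDecimal;
import java.util.HashSet;
import java.util.Set;

// Nothing in these classes says "database"; the XML metadata and the class
// enhancer supply the persistence behavior after compilation. (Both classes
// are shown in one listing here for brevity.)
public class Employee {
    private String name;
    private BigDecimal salary;
    private Department dept;
    private Employee boss;
    // constructors, getters, and setters omitted
}

class Department {
    private String name;
    private Set emps = new HashSet();   // a collection of Employee objects
    // constructors, getters, and setters omitted
}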

This might not be an insurmountable problem; the idea of a database administrator isn't exactly new. However, there is one glaring flaw with the JDO language: There is no reference to any existing relational database. That's because the JDO definitions are actually used to build the database schema for you after you've defined the relationships. Once again, the idea of interacting with an existing legacy database is ignored by the Java designers; instead of using the data you already have, they'll build a new database for you. Oh joy.

In a further departure from normal application development, JDO doesn't even generate source code for you. Instead, you write the source for your class and compile it, and then you run a "class enhancer," which uses the XML document to change the compiled code. At this point, your source code doesn't even match your compiled class (which must be very interesting for source-level debugging tools).
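
Once the classes have been enhanced, the application works entirely through the javax.jdo interfaces. Here's a rough sketch of what that looks like; the properties file name and the department name are placeholders, and the vendor-specific connection settings would live in that file:

package com.xyz.hr;

import java.util.Collection;
import java.util.Iterator;
import java.util.Properties;
import javax.jdo.JDOHelper;
import javax.jdo.PersistenceManager;
import javax.jdo.PersistenceManagerFactory;
import javax.jdo.Query;
import javax.jdo.Transaction;

public class JdoSketch {
    public static void main(String[] args) throws Exception {
        // Vendor-specific settings (implementation class, connection URL, and
        // so on) come from a properties file; "jdo.properties" is a placeholder.
        Properties props = new Properties();
        props.load(JdoSketch.class.getResourceAsStream("/jdo.properties"));
        PersistenceManagerFactory pmf =
                JDOHelper.getPersistenceManagerFactory(props);
        PersistenceManager pm = pmf.getPersistenceManager();
        Transaction tx = pm.currentTransaction();
        try {
            tx.begin();

            Employee emp = new Employee();   // an ordinary (but enhanced) class
            pm.makePersistent(emp);          // from here on, JDO tracks it

            // JDOQL: find the employees in a given department
            Query query = pm.newQuery(Employee.class, "dept.name == deptName");
            query.declareParameters("String deptName");
            Collection emps = (Collection) query.execute("Shipping");
            for (Iterator it = emps.iterator(); it.hasNext();) {
                Employee e = (Employee) it.next();
                // ... roll through the employees
            }

            tx.commit();
        } finally {
            if (tx.isActive()) {
                tx.rollback();
            }
            pm.close();
        }
    }
}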

Summary

JDO is another example of the rather insular worldview of the Java developers that, in my opinion, keeps Java from becoming a major player in business application development. Instead of focusing on ways to interface the object architecture with existing legacy data and the millions of programmer-hours already spent capturing and refining business rules, Java instead seems intent on rebuilding the entire world from scratch. Every time a Java technology comes along that deals with persistence and relational data, it seems to completely ignore the existence of legacy databases. This implies that the Java developers have limited experience in business application development, which bolsters my assertion that, at least for now, Java is better suited for middleware and that RPG or COBOL is more appropriate for business logic.

So for existing legacy databases, JDO is currently a bust. The concept of JDO isn't all bad; some of the implementation details, such as extents (scrollable sets of objects that are automatically replenished from the database on demand), are actually quite useful. JDO could be a step in the right direction if the designers decide to add syntax that will support existing data. In my opinion, they'll also need to rethink the idea of directly modifying class files; it's simply too much to ask IDE designers to support classes that no longer match their source.

As for EJB, it doesn't support the basic functions we require to solve business problems. For example, there's no concept of getting a collection of otherwise unrelated beans by some common sequence. This is my yardstick in business application design, and it's a requirement for one of the most basic functions of an ERP system: the manufacturing resource planning (MRP) requirements generation. In an MRP generation, you need to be able to roll through all the various supply and demand data by date, regardless of the "object type." Allocations, forecasts, customer orders, and shop orders all need to be handled together in date sequence in order to generate requirements. This can be handled quite gracefully in DB2/400 by using a logical file over multiple physical files, but with EJB it would require that you implement a fairly sophisticated matching-record algorithm on top of the bean logic.
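
To illustrate, here's a hedged sketch (every name in it is hypothetical) of the kind of scaffolding you'd have to build on top of the beans just to get that date-sequenced view--the very thing a logical file gives you for free:

import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.Date;
import java.util.List;

// A hypothetical common view over otherwise unrelated "object types."
// Every supply or demand class -- allocation, forecast, customer order,
// shop order -- would have to implement it.
interface SupplyDemand {
    Date getDate();
    int getQuantity();   // positive for supply, negative for demand
}

class MrpMerge {
    // Pull the various bean collections together and put them in date sequence.
    static List inDateSequence(List[] sources) {
        List all = new ArrayList();
        for (int i = 0; i < sources.length; i++) {
            all.addAll(sources[i]);
        }
        Collections.sort(all, new Comparator() {
            public int compare(Object a, Object b) {
                return ((SupplyDemand) a).getDate()
                        .compareTo(((SupplyDemand) b).getDate());
            }
        });
        return all;
    }
}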

Conclusions

All that being said, I think that at this time we're still stuck with implementing our own business logic under the covers. That doesn't mean we can't use a consistent architecture that hides the database implementation from the application programmer. Indeed, I think that if Java is ever to be used for enterprise-level application programming, we're going to have to design just such a framework. The concept of the JDO technology isn't completely unfounded--having a single, precisely defined mapping language to relate relational data to business objects is probably the right idea. It's just a matter of making sure that the mapping language is flexible enough to support existing data, as well as the more complex data relationships required by business applications.

And by hiding the database implementation from the application, you would be able to use whichever data access technique was best suited for the environment. It is quite possible to design an interface that defines a set of "object data primitives" that would then be implemented transparently to the application programmer. Data access could be done via JDBC or RLA, whichever was appropriate (I told you I'd get back to this subject!). Or you could even call an RPG program to do the I/O; it would be entirely up to the developer implementing the persistence logic.
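
Purely as a sketch--every name here is hypothetical--such a set of primitives might start out as nothing more than an interface like this, with JDBC, RLA, or RPG-based implementations plugged in behind it:

import java.util.List;

// A hypothetical set of "object data primitives." Application code is written
// against this interface; the implementation underneath is free to use JDBC,
// Record Level Access, or even a call to an RPG program.
public interface ObjectStore {
    Object get(Class type, Object key);            // fetch one object by key
    List getAll(Class type, String sequence);      // fetch a set in a named sequence
    void put(Object businessObject);               // insert or update
    void remove(Class type, Object key);           // delete by key
}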

I'd be interested to see what a group of application developers would come up with if they were given the task of defining a set of these object data primitives. By distilling the knowledge gained from writing thousands of application programs over the years, I think such a group would be able to come up with a powerful set of requirements that could be used to design the next generation of object-relational data mapping.

I'd love to hear your thoughts on the idea. Please comment here in the MC Press Forums, or feel free to send me email. Perhaps we'll revisit this subject in another column.

Joe Pluta is the founder and chief architect of Pluta Brothers Design, Inc. He has been working in the field since the late 1970s and has made a career of extending the IBM midrange, starting back in the days of the IBM System/3. Joe has used WebSphere extensively, especially as the base for PSC/400, the only product that can move your legacy systems to the Web using simple green-screen commands. He is also the author of the popular MC Press book E-deployment: The Fastest Path to the Web. You can reach Joe by email.
