Going 5250 on the Information Superhighway


So you want to take your business for a ride on the information superhighway? Do you know where the right on-ramp is? Do you know what the best route is? I’m sorry to say this, my friends, but you will need more than MapQuest.com to help you chart your course for this journey. You will need to determine which technology, which development approach, and which deployment method are best suited to your business needs and to your company’s budget.

Some will “throw the baby out with the bath water” by totally scrapping their AS/400s for an off-the-shelf, (allegedly) Windows NT-based solution. They’ll purchase one without accurately factoring in the long-term costs required to support it, knowing it will not be as robust as their legacy midrange system. (Try calculating your company’s aggregated productivity losses attributable to rebooting your network servers versus IPLing your AS/400.)

Others will think, “What about our legacy green-screen applications? Don’t our users like them?” Speaking of users, what about human nature? You know your users will resist learning a totally new system, and can you even justify the expense of a new off-the-shelf system? Have you considered the effort involved in configuring and deploying a new system? Can you retrofit into your new system all the modifications you have made to your legacy system over the years?

Don’t Worry, Be Happy

As it turns out, you can “Webify” your legacy green-screen applications, while preserving your existing business logic, with a minimum of code intrusion. Here is a case study of how I accomplished just such a task on a client project where I was technical leader of the AS/400 group.

First, the project was divided into two teams. The RPG team made the code insertions on the AS/400 applications. The Visual Basic (VB) team developed the new desktop GUI front-end and interfaced it with the AS/400 through a system-to-system messaging product called Cold Fusion. (The techniques we used will work just as well with other messaging products.)

Second, the technical approach was decided. The client identified the applications to be Web-enabled, choosing a few menus that would give field personnel better access to its already familiar order entry system. The client decided to use Client Access for PC-to-AS/400 connectivity.

These new Web jobs would initiate and run in batch, using the programs originally written for interactive environments. Each affected program would identify the jobs to determine their origin, with native jobs resulting in normal screen I/O operations. Web jobs, on the other hand, would have to send and receive the same screen information through a set of data queues. All display files involved would be mapped to new data structures to facilitate the passing of data between systems via Cold Fusion.

It was agreed that the VB front-end would do none of its own business logic processing; it would only populate the PC screens. (One exception was table look-ups. Many of the AS/400 tables, such as customer number, were loaded onto the client machine.) All other processing would take place on the AS/400, using already developed, tried-and-true business logic.

Finally, we deployed the solution. As I mentioned, this required no change to existing business logic in the RPG programs. The client’s like-functioning programs were very standardized in that they came from the same templates with the same subroutine names and structures. Consequently, it was easy to insert our standardized subroutines into all of the programs, using a cookie-cutter approach but with surgical precision.

The Nuts and Bolts

Every Web job would first put a record containing job and user information into a single data queue. This queue would be under constant surveillance by a never-ending program on the AS/400 that, upon receipt of a valid queue entry, would create a pair of data queues for the Web job: one for inbound data transfers and another for outbound data transfers. Since the local data area (LDA) wouldn’t be used in these particular applications, we decided to use it to identify Web jobs by placing a code in it.
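The surveillance loop described above can be modeled in modern terms. Here is a minimal Python sketch (names such as WebJobRouter and register_job are invented for illustration, and in-process queues stand in for AS/400 data queues) of a router that watches one shared request queue and creates an inbound/outbound pair per Web job:

```python
import queue

class WebJobRouter:
    """Hypothetical model of the never-ending router job."""

    def __init__(self):
        self.request_queue = queue.Queue()   # the single, shared request queue
        self.job_queues = {}                 # job id -> (inbound, outbound) pair

    def register_job(self, job_id, user):
        """What a Web job does first: drop job/user info on the request queue."""
        self.request_queue.put({"job": job_id, "user": user})

    def run_once(self):
        """One iteration of the surveillance loop: validate the entry,
        then create the data queue pair for that job."""
        entry = self.request_queue.get()     # blocks until a request arrives
        if "job" in entry and "user" in entry:
            job_id = entry["job"]
            self.job_queues[job_id] = (queue.Queue(), queue.Queue())
            return job_id
        return None

router = WebJobRouter()
router.register_job("WEB0001", "QUSER")
created = router.run_once()
```

The per-job queue pair keeps each Web session's traffic isolated, so the interactive programs never have to multiplex entries from different users.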

In their initialization routines, these programs would first determine the job’s origin by looking at the LDA for this code. Because these Web jobs would be running in batch, because batch jobs can’t open display files, and because these display file fields would be used throughout the programs, we had to initialize all such numeric fields upon opening the program, to protect against decimal data errors at execution. As shown in Figure 1, we set up some of our constants concerning data queue parameters in the same initialization routine. We then let the program and all its existing logic take over as if nothing had changed.
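The field-initialization step can be sketched in Python for illustration (the field names here are invented; on the AS/400 this is a Z-ADD of zero into each numeric display-file field):

```python
# Hedged model: force every numeric screen field to a valid zero value
# up front, so a batch job never trips over uninitialized numerics --
# the Python analogue of clearing display-file fields to avoid decimal
# data errors.
NUMERIC_FIELDS = ["SFLRRN", "ORDQTY", "ORDAMT"]   # hypothetical field names

def init_screen_fields(fields):
    """Return a field map with all numeric screen fields zeroed."""
    initialized = dict(fields)
    for name in NUMERIC_FIELDS:
        initialized[name] = 0          # Z-ADD 0 equivalent
    return initialized

fields = init_screen_fields({"CUSTNM": "ACME"})
```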

At the point where the program would normally display a file format, the added identifying logic would either perform normal screen I/O operations (if the job were running natively) or send/receive a screen’s worth of data through the pair of data queues (if the job were being executed from the Web). As shown in Figure 2, in preparation for the latter case, each screen field being used throughout the program would have to be moved into the data queue. For each screen format, two additional subroutines were appended to the program: one to move fields from the program and display file to the data queue and another to move data from the data queue to the program fields. Each of these subroutines would then call the Send to Data Queue (QSNDDTAQ) and Receive from Data Queue (QRCVDTAQ) APIs.
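The field-mapping half of those subroutines amounts to packing named fields into one flat, fixed-width record (the data-queue entry) and unpacking it on the other side. A minimal Python sketch, with an invented layout, of that round trip:

```python
# Hypothetical per-format layout: (field name, width). On the AS/400 this
# role is played by the data structure the display file is mapped to.
LAYOUT = [("SFLRRN", 4), ("ORDNUM", 7), ("CUSTNM", 30)]

def to_dq_entry(fields):
    """Move screen/program fields into one fixed-width data-queue record."""
    return "".join(str(fields.get(name, "")).ljust(width)[:width]
                   for name, width in LAYOUT)

def from_dq_entry(entry):
    """Move a data-queue record back into named program fields."""
    fields, pos = {}, 0
    for name, width in LAYOUT:
        fields[name] = entry[pos:pos + width].rstrip()
        pos += width
    return fields

record = to_dq_entry({"SFLRRN": 1, "ORDNUM": "A123456", "CUSTNM": "ACME"})
round_trip = from_dq_entry(record)
```

Because both sides agree on the layout up front, the VB front-end could unpack the record positionally without any per-message metadata.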

We also built program termination controls into the program so that, if communications were disrupted, the job wouldn’t just “hang” there, looking at a data queue for a response that would never appear. We set a timeout threshold of 60 minutes, after which time the program would exit gracefully.
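That timeout control maps naturally onto a bounded blocking receive. A minimal sketch, assuming an in-process queue stands in for the inbound data queue (the 60-minute threshold is shortened here for illustration):

```python
import queue

def receive_with_timeout(dq, timeout_seconds):
    """Wait for the next entry for a bounded time. Returns None on
    timeout -- the analogue of the blank entry QRCVDTAQ returns when
    its wait time expires -- so the caller can terminate gracefully
    instead of hanging forever."""
    try:
        return dq.get(timeout=timeout_seconds)
    except queue.Empty:
        return None

dq = queue.Queue()
result = receive_with_timeout(dq, 0.01)   # nothing arrives, so we get None
```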

For testing, we had to build some of our own tools. The RPG team did not always work in concert with the VB team, so we created a screen that acted as a client in that it would read the outbound data queue and write to the inbound data queue. This allowed the RPG team to see exactly what we were sending to the VB side without the VB programs even being written for any particular function.
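The stand-in client can be modeled in a few lines of Python (names and the reply function are invented; the point is only the read-outbound/write-inbound pattern):

```python
import queue

def stand_in_client(outbound, inbound, make_reply):
    """Hypothetical test client: consume one entry the program sent to
    the outbound queue, post one reply on the inbound queue -- exactly
    what the real VB front-end would do, with none of it written yet."""
    entry = outbound.get(timeout=1)       # what the RPG program sent
    inbound.put(make_reply(entry))        # what the VB side would answer
    return entry

outbound, inbound = queue.Queue(), queue.Queue()
outbound.put("ORDER-SCREEN-DATA")
seen = stand_in_client(outbound, inbound, lambda e: "ACK:" + e)
```

Returning the consumed entry lets the test screen display exactly what was sent, which is the visibility the RPG team needed.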


Always Room for Improvement

In retrospect, there are some lessons to be learned from this project. First and probably foremost, the RPG and VB teams should have been closer to each other. (We were in different regional offices.) That would have greatly enhanced our development and testing efforts. Second, Cold Fusion may not have been as effective as MQSeries would have been. (See “Get the Message with MQSeries” elsewhere in this issue.) As it turns out, the response time could have been greatly reduced.

.

. (pre-existing business logic)
.

* Retrieve *LDA (in init routine)
C *DTAARA Define *LDA LDA
C In LDA

*

C If SourceSys <> 'W'
C Open DSPFILE
C Else
C MoveL '*LIBL' DQLib
C Z-Add 3600 DQTimeOut
.

. (any other data queue and/or web specific initialization)
.

C EndIf
.

. (more pre-existing business logic)
.

* Move screen data to data queue and send the data queue
C SndSFLCtl BegSR
C Move *Blanks DQData

* Move screen/pgm fields to data queue
C Move SFLRRN SFRRN2
C Move OrdNum OrdNu2
C MoveL Fmt2 DQData
C MoveL DtaQout DQName
C Call 'QSNDDTAQ'
C Parm DQName
C Parm DQLib
C Parm DQLength
C Parm DQData
.

. (more pre-existing business logic)
.

C RcvSFLCtl BegSR
C Move *Blanks DQData
C MoveL DtaQin DQName
C Call 'QRCVDTAQ'
C Parm DQName
C Parm DQLib
C Parm DQLength
C Parm DQData
C Parm DQTimeOut

* Move data queue fields to screen/pgm
C MoveL SFRRN2 SFLRRN
C MoveL OrdNu2 OrdNum

* *DTAQ timed-out
C If (DQData = *Blank)
. (Handle your data queue timeout action)
C EndIf
.

. (more pre-existing business logic)
.

Figure 1: Business logic is not affected by changes, allowing the program to run either natively or from a GUI client.

* Sample of screen I/O conditioning logic
*
* SFLctl -- format name of the SFL control record for
*   native displays
* SndSFLCtl and RcvSFLCtl -- the subroutines that move
*   screen/program data into the send data queue output
*   parameter and move the receive data queue input
*   into the screen/program fields, respectively; each
*   calls the appropriate QSNDDTAQ/QRCVDTAQ API.
.

. (pre-existing business logic)
.

C If SourceSys = 'W'
C Exsr SndSFLCtl
C Exsr RcvSFLCtl
C Else
C Write SFLctl
C Read SFLctl
C EndIf
.

. (pre-existing business logic)
.

Loading the subfile was very simple: SndSFL moves the screen/program data into the send data queue output parameter and calls the QSNDDTAQ API.

.

. (pre-existing business logic)
.

C If SourceSys = 'W'
C Exsr SndSFL
C Else
C Write SFLrcd
C EndIf
.

. (pre-existing business logic)
.

Figure 2: I/O routines are modified to use either subfiles or data queues.

