
Turbocharging ODBC for Client/Server Performance


ODBC is the Java of the database world and is the preferred method of implementing client/server on the AS/400. If you are careful, you can write an ODBC client application that is completely back-end independent. With ActiveX Data Objects and tools like Visual Basic, ODBC can easily be used as a high-speed PC data conduit to and from the AS/400.

Open Database Connectivity (ODBC) to the AS/400 has been much maligned as of late. I suspect this disgruntlement is due mostly to poor programming techniques or the widespread use of tools that don’t take advantage of ODBC performance features. In my shop, we use ODBC in our production systems. Every day, we process over 2.5 million SQL requests and insert over 40,000 new records into a database of more than 25 million records. That database is maintained and serviced exclusively via ODBC. It works.

So, why does ODBC have a bad reputation? In my experience, the following things degrade ODBC performance and should be avoided:

• Excessive communications between the client and server
• Using Data Access Objects (DAO)
• Using Microsoft Access attached tables

If you want to use ODBC, the preferred method is ActiveX Data Objects (ADO) from a client tool like Visual Basic (VB), Visual C++, or another tool/language that supports OLE. For general performance guidelines, you’ll also want to do the following:

• Minimize communications
• Minimize transaction span (keep locks to the minimum)
• Avoid sorts
• Construct queries that utilize indexes
• Place business logic on server
• Reuse query access plans

Some of these issues are controlled through ODBC while others are programming and query design issues. In this article, I’ll discuss some performance techniques you can apply, but I’ll leave the query optimization up to you.

First, Get Your Toolbox in Order

The examples I’ll show you use VB code running ODBC inside an ADO framework. In order for these techniques to function correctly, you need to activate the following VB project references in these programs: the Microsoft ActiveX Data Objects 2.0 library and the Microsoft ActiveX Data Objects Recordset 2.0 library.

Project references are activated in Visual Basic 5.0 by clicking Project from the VB drop-down menu and then References. Find these entries and place a check mark next to them. If they are missing from your VB references, download and install the Microsoft Data Access Software Development Kit (SDK) and the Microsoft Data Access Components Version 2.0 from the Microsoft Universal Data Access Web site at http://www.microsoft.com/data/download.htm. This download fixes problems with the earlier releases, adds connection pooling, and provides better parameter object support. I don’t recommend running this code without the new components.

Now, You’re in a Bind

SQL statements that embed literal values are a performance problem because, each time one is executed, the query optimizer has to build a new execution plan. If you can create statements that utilize bind variables, the AS/400 can store these statements in a reusable “package” that can improve performance. To do this, you tell OS/400 to analyze the statement the first time it is prepared, create an execution plan, and then store both the statement and execution plan in a physical file (called a package file). Each time your application runs that statement, the prestored execution plan is used, which really cuts down on processor utilization.

Let’s say you have a client screen that accepts an employee ID and you want to display the employee’s name when the user exits the ID field. You just concatenate the ID into an SQL statement like this:


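As a hedged sketch (the EMPLIB.EMPMAST file, its EMPID and EMPNAME fields, and the txtEmpID text box are hypothetical names, not from the original), the concatenated statement might be built in VB like this:

```vb
'Concatenate the user-entered ID directly into the SQL text
'(EMPLIB.EMPMAST, EMPID, EMPNAME, and txtEmpID are hypothetical names)
SQLStr = "SELECT EMPNAME FROM EMPLIB.EMPMAST" & _
         " WHERE EMPID = '" & txtEmpID.Text & "'"
```

Because the ID is embedded as a literal, every distinct ID produces a textually different statement.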
The AS/400 will examine the statement and find the appropriate access path to retrieve the data; then, it will execute the statement. The next time someone types in a different ID, the AS/400 will see this as a different statement (unless you are finding the same employee) and will repeat the optimization process. Now, consider this statement:

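As a hedged sketch using a hypothetical EMPLIB.EMPMAST file with EMPID and EMPNAME fields, the same query written with a parameter marker might look like this:

```vb
'Same query, but with a parameter marker; the statement text is now
'identical on every execution, so its stored package entry can be reused
SQLStr = "SELECT EMPNAME FROM EMPLIB.EMPMAST WHERE EMPID = ?"
```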

The question mark (?) is called a parameter marker, signifying that, at execution, the ? will be bound to a value. The query optimizer will again determine the best access plan for this statement and store the statement and execution plan in an AS/400 package file. However, the next time your application uses the statement, the AS/400 locates the preconstructed package (using a simple string comparison) and uses the stored execution plan. Because calculation of access paths takes time, you have just saved that time for all subsequent executions of this statement.

ADO makes using bind parameters easy. To execute a statement with bind parameters, you perform the following three steps:

1. Prepare the statement via a Command object.
2. Associate the parameter(s) with the value(s) you want to bind. In ADO, the parameters are contained in a parameters collection in the Command object. They are bound to the statement when that object is executed.

3. Execute the statement via the Command object’s Execute method.

Figure 1 shows a Visual Basic code snippet that executes the entire process previously outlined. To ensure that you are using packages on your AS/400, check the Package(s) page of your Client Access ODBC configuration and ensure that the “Enable extended dynamic (package) support” option is checked, as shown in Figure 2. It is a good idea to turn package support off while you are developing an application because the SQL statements may change many times during development. Turn package support on for fielded applications to improve their performance.

Extended Fetch

Extended fetch is an ODBC feature that most shops need. It allows you to retrieve multiple rows as a memory array, with one network call to the AS/400.

When you call SQLFetch, a packet is sent to the AS/400 requesting a record, and the AS/400 sends an acknowledgement (ACK) packet to the PC, letting it know that the request was received. The AS/400 then constructs a packet containing the next available record and transmits it to the client PC. The client sends another ACK packet to the AS/400 to acknowledge that the data was received. Pretty standard communications stuff, right? But it gets more complicated.

Now, say you need to export 50 rows of data from an AS/400 table. If you use SQLFetch, those 50 requests will generate 100 ACK packets (200 packets in all), and your program will be waiting on the latency of your network. Now, consider fetching 50 rows with SQLExtendedFetch.

SQLExtendedFetch can request any number of rows from the AS/400. For this example, assume that you prepare your buffers to receive 50 records at a time. Once SQLExtendedFetch is called, the AS/400 places the next 50 rows of your result set into a packet stream and sends the stream to your PC. The PC then sends a single ACK packet to tell the AS/400 that 50 rows were received. The SQLExtendedFetch paradigm creates far less network traffic and is much more efficient. In fact, I can retrieve over 2,500 records per second on most of my export queries. This contrasts with 40 records per second using SQLFetch.

Extended fetch can be used through ADO by setting the CacheSize property of the recordset object. The VB code snippet in Figure 3 shows an ADO session that will retrieve all of the tables on your AS/400 by querying the SYSTABLES view in the QSYS2 library. When I wrote this code, I tried setting the CacheSize property to different values until I achieved the best query performance. I found that, on my system, a CacheSize of 1 returned 350 records per second, while a CacheSize of 100 returned over 4,500 records per second for this query.

Note that by using ADO, you do not have to worry about calling SQLExtendedFetch to fill your local record buffer. ADO automatically populates the local buffer with the first X rows (where X is the current setting of the CacheSize parameter) when you execute the MoveFirst method against the recordset. Once you have used MoveNext to move past the last record in the buffer, ADO fetches the next X records into the buffer. By using ADO, you divorce yourself from the intricacies of managing local record buffers.
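As a sketch of the "process your Record Set" step (assuming a recordset RS opened as in Figure 3), a simple forward-only walk is all you need; ADO refills the buffer behind the scenes:

```vb
'Walk the result set; ADO transparently fetches the next CacheSize
'rows from the AS/400 whenever the local buffer is exhausted
Do While Not RS.EOF
    Debug.Print RS("SYSTEM_TABLE_SCHEMA") & "/" & RS("SYSTEM_TABLE_NAME")
    RS.MoveNext
Loop
```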

One thing to note about extended fetch performance is the isolation level of the client application. Isolation levels control how the AS/400 locks the records that you have selected. I recommend that for browse-type queries (which is where this method is mostly used) you append the For Read Only clause to the end of your SQL statement and use a Forward Only-type recordset (as shown in Figure 3). For Read Only causes no locks to be placed on the data you retrieve, and a Forward Only-type recordset does not support the MovePrevious method, which can adversely affect performance.

For extended fetch to work properly, you should also ensure that the “Enable pre-fetch during execute” checkbox is not checked on the ODBC Performance tab of your Client Access ODBC Data Source. If this option is specified, you will not receive the first set of rows from your request but instead will receive the second set of rows, as if you had called SQLExtendedFetch twice before looking at the buffer! This is a for-real-oh-my-gosh bug in V3R7 and above.

Stored Procedures

A stored procedure can be any program or command that resides on the AS/400. Stored procedures are important because they allow you to encapsulate program logic onto the AS/400 instead of writing that logic on the client. This capability can greatly enhance program performance. Here is an example of how I converted a convoluted client process into a simple stored-procedure call and how this conversion impacted my performance.

In my company’s data-entry application, the operator fills out a screen and then initiates a posting subroutine. The posting routine consists of seven SQL select statements to verify various fields and data relationships on the screen. It does some numeric checks on different field values, and then it performs an SQL INSERT statement to place the data on the AS/400. This routine was about 60 lines of VB code and took a full second of client execution time. Note that because of the seven SQL statements, we are making seven round trips to the AS/400 during execution of the subroutine.

Using the ILE C compiler and the SQL precompiler, we created a stored procedure called POSTJOB. The procedure accepts all of the fields from the data-entry screen as input parameters. POSTJOB is about 70 lines of code and performs all of the checks previously done on the client. However, because the SQL statements are now executing on the AS/400, we greatly reduce the network communications between the client and server. The client calls the procedure, and the procedure informs the client as to whether the record was accepted or not. This conversation executes in one tenth of a second.

By creating the procedure, we gained the following advantages:
• We can insert and verify 10 records per second versus one record per second. Although my operators cannot enter that many records per second, it reduces my AS/400 processor load and provides additional capacity.

• Our PC client code is reduced from 60 lines of code to six lines.
• The procedure is now available to any other program that needs to perform this logic.
• Logic changes occur in one program and are instantly propagated to all clients using this procedure with no change to the client code (as long as the procedure arguments stay constant).

Procedures are good for three things: encapsulating business logic on the server, reducing network communications utilization, and increasing performance capacity.
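To make the discussion concrete, here is a hedged sketch of the client side of such a call. The library name MYLIB and the two-parameter list are hypothetical (the real POSTJOB accepts all of the data-entry screen's fields), and it assumes the procedure was declared to the database so ADO can describe its parameters:

```vb
'Call the stored procedure through the existing connection
Dim Cmd1 As New Command
Cmd1.ActiveConnection = Con1
'CALL syntax with parameter markers for the procedure's arguments
'(MYLIB and the two-parameter list are hypothetical)
Cmd1.CommandText = "call MYLIB.POSTJOB(?, ?)"
Cmd1.Parameters(0).Value = txtJobNumber.Text 'hypothetical screen field
Cmd1.Parameters(1).Value = txtCustomer.Text  'hypothetical screen field
Cmd1.Execute
Set Cmd1 = Nothing
```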

Calling All Commands

Another side effect of AS/400 stored procedures is that any AS/400 command or program can be called as a procedure, even if it is not declared as a procedure! Figure 4 contains a snippet that calls the Start Debug (STRDBG) command for your job. STRDBG causes the AS/400 to write all query optimization decisions to your job log. If you call STRDBG and then execute the SQL statements that your application uses, you can view the job log to see what indexes or optimization methods DB/2 chose to implement for each SQL statement. This can be an invaluable technique for diagnosing server-side performance problems or just a neat way to expand your knowledge of the query optimizer.

The STRDBG example is also valuable because you can use it as a template for VB routines that execute other AS/400 programs or commands. Simply change the command text statement and you’re off to the races.

The mechanism for calling an AS/400 command is the Execute Command (QCMDEXC) program, which is in the QSYS library. QCMDEXC takes two arguments:
(1) the command string and (2) the command string length as a decimal (15,5) number. Note that the example code in Figure 4 calls the MkRunCmd function. This function, shown in Figure 5, takes an AS/400 command string and returns a properly formatted call to the QSYS.QCMDEXC function. I find this code invaluable because I got tired of counting the number of characters in my string and creating a decimal (15,5) number for each command I wanted to call. Pass in the following string:

STRDBG UPDPROD(*YES)

When you do, the MkRunCmd function passes back the following:

call qsys.qcmdexc('STRDBG UPDPROD(*YES)',0000000020.00000)

This is the properly formatted call to notify OS/400 to start debugging. As I mentioned above, it can also be modified to start any AS/400 program.


That’s a Wrap!

ODBC is a powerful, production-oriented tool that you can use in AS/400 programming. You should be aware that, even if you do not own SQL or Query on your AS/400, ODBC connectivity is still available since all AS/400 systems ship with support for DB/2 installed. So, with the addition of an ODBC driver like Client Access, or one from another vendor, you can be off to the SQL races without owning the SQL development kit or Query! ODBC is powerful, and that power provides additional capabilities for your organization. Enjoy!

Getting the Box Score from Your Stored Procedure Results

I always code AS/400 stored procedures with two extra arguments appended to the end of the argument list. The first is always an integer, and the second is a char(80). I use these arguments to pass back execution information from the called procedure. If the integer argument is 0, my program knows that the procedure executed correctly. If it is a non-zero number, the error text will be in the 80 character variable. So, my client-side logic is really simple: If argument X is not 0, then display argument Y to the operator and perform a rollback; otherwise, issue a commit. This way, the calling program does not need to know anything about how the procedure can fail. It just needs to display the error text information returned by the procedure.
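A hedged sketch of that client-side check, assuming the procedure's last two parameters are the status pair, that the client opened a transaction with Con1.BeginTrans, and that the procedure name MYLIB.SOMEPROC is hypothetical:

```vb
'Call a procedure whose trailing arguments are an integer return
'code and a char(80) error message (names here are hypothetical)
Dim Cmd1 As New Command
Cmd1.ActiveConnection = Con1
Cmd1.CommandText = "call MYLIB.SOMEPROC(?, ?)"
Cmd1.Execute
If Cmd1.Parameters(0).Value <> 0 Then
    'Non-zero return code: show the error text and undo the transaction
    MsgBox Trim$(CStr(Cmd1.Parameters(1).Value))
    Con1.RollbackTrans
Else
    Con1.CommitTrans 'all checks passed; make the work permanent
End If
Set Cmd1 = Nothing
```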

However, this technique leads to an important stored-procedure question: Where do you put COMMIT and ROLLBACK statements? Since performing commit and rollback is how we maintain transactional control via SQL, you should never place commitment control statements inside a stored procedure. If the procedure controls commitment, your client programs have no control over transactional integrity.

Consider the following as an example:

Your client program inserts a record and calls a procedure that causes a commit. It then looks at another record and decides to roll back. The inserted record may or may not be rolled back, depending upon whether the AS/400 procedure program was created to run in the same activation group as the calling program. If the procedure is in a different activation group, the rollback will not undo the previous insert. If the procedure is in the same activation group, the procedure's commit will have already committed the previous insert, and your rollback will do nothing. Besides that, even if the rollback takes out the insert, it still does not undo the work the procedure itself committed.

This scenario exhibits some spooky, unpredictable behavior (and really bad application design, IMHO). You need to keep the commitment control on the client-application side. If you do, you won’t have to guess what is happening in your application.

Public Con1 As New ADODB.Connection
Private Sub Command1_Click()
Dim Cmd1 As New Command
Dim RS As New Recordset
Dim SQLStr As String
'Associate the connection with the command object
Cmd1.ActiveConnection = Con1
'Set the CommandText property to the statement you want to execute
SQLStr = "select system_table_name from qsys2.systables"
SQLStr = SQLStr & " where system_table_schema = ? and"
SQLStr = SQLStr & " system_table_name like ?"
Cmd1.CommandText = SQLStr
'Prepare the statement so its access plan can be stored and reused
Cmd1.Prepared = True
'Now set the first parameter to the name of a library you want to
'display table names from

Cmd1.Parameters(0).Value = "QGPL"
'Now set the second parameter to only show table names that start
'with a particular value (such as the letter Q)

Cmd1.Parameters(1).Value = "Q%"
'Now we execute the command and return a result set

Set RS = Cmd1.Execute
'Insert code here to process your Record Set
'Clean up after yourself
Set RS = Nothing
Set Cmd1 = Nothing
End Sub
Private Sub Form_Load()
'Activate the connection with the AS/400
'You will need to replace datasource, userid, and password
'with a valid ODBC data source, user ID, and password
Con1.Open "datasource", "userid", "password"

End Sub
Private Sub Form_Unload(Cancel As Integer)
'Release the connection to the AS/400
Set Con1 = Nothing
End Sub

Figure 1: VB code that uses parameter markers to improve query performance


Figure 2: To use preconstructed packages for your VB SQL statements, checkmark the “Enable extended dynamic support” option in your ODBC datasource


Private Sub Command2_Click()
Dim RS As New Recordset
Dim SQLStr As String
'Make an sql statement
SQLStr = "select system_table_schema, system_table_name from qsys2.systables For Read Only"
'Set the cache size before opening so the first fetch retrieves 100 rows
RS.CacheSize = 100
'Open the resultset
RS.Open SQLStr, Con1, adOpenForwardOnly, adLockReadOnly
'Insert code here to process your Record Set
'Clean up after yourself
Set RS = Nothing
End Sub

Figure 3: Setting the CacheSize parameter on your SQL statement automatically enables SQLExtendedFetch for your query

Private Sub Command5_Click()
Dim Cmd1 As New Command
'Associate the connection with the command object
Cmd1.ActiveConnection = Con1

'Place the call command into the commandtext property
Cmd1.CommandText = MkRunCmd("STRDBG UPDPROD(*YES)")
'Call the execute method
Cmd1.Execute
'Your job is now in debug mode; all optimizer messages will be written to the job log
Set Cmd1 = Nothing
End Sub

Figure 4: This simple program can be used as a template, executing other AS/400 programs from within VB

Function MkRunCmd(CMDStr) As Variant
'Find the length of the string
CMDStrLen = Len(CMDStr)
'Find the string length of the number CMDStrLen
CMDNumberLen = Len(CStr(CMDStrLen))
'Parse together the call command
OutStr = "call qsys.qcmdexc('" & CMDStr & "',"
OutStr = OutStr & Mid("0000000000", 1, 10 - CMDNumberLen)
OutStr = OutStr & CStr(CMDStrLen) & ".00000" & ")"
'Return the call command to the calling program
MkRunCmd = OutStr
End Function

Figure 5: This function returns a properly formatted call to QSYS.QCMDEXC


