
TechTip: DB2 for i Optimization Strategies, Part 5


Learn how to take an ax to common suboptimal SQL programming practices.

This series is almost done, yet there are so many illustrations of potential code improvements to offer to the DB2 for i development community. This tip discusses possible performance improvements when replacing legacy cursors with set operations and when using temporary tables. See the references at the end of this tip for links to prior tips in this series.

 

Avoid Unnecessary Cursors

A cursor is a control structure that allows database code to process a result set in a row-by-row fashion. However, database engines including DB2 are optimized to process data in sets rather than row by row. In most cases, processing data row by row with SQL should be avoided.

 

In the early years of DB2 for i SQL, many operations required row-by-row processing. However, as the language gained new features, the need to use a cursor to accomplish a task diminished greatly.

 

To illustrate, consider a temporary table that stores a schedule of important events:

 

DECLARE GLOBAL TEMPORARY TABLE EVENT_SCHEDULE (
    EVENT_ID    INT,
    EVENT_STAMP TIMESTAMP NOT NULL,
    EVENT_DUR   INT,
    EVENT_DESC  VARCHAR(50),
    EVENT_INFO  VARCHAR(2048))
WITH REPLACE;

 

For the sake of example, say a stored procedure is required to populate this temporary table with event data from a base table. Thereafter, the EVENT_ID column should be assigned as a sequential number based on the sort order of the EVENT_STAMP column. Further, the EVENT_ID must be reset to one when a new year is encountered. Here’s the body of a procedure that does this work:

 

DECLARE @ROW_ID INT NOT NULL DEFAULT 0;
DECLARE @END_OF_DATA CHAR(1) DEFAULT 'N';
DECLARE @PRV_YEAR INT NOT NULL DEFAULT 0;
DECLARE @EVENT_ID INT;
DECLARE @EVENT_STAMP TIMESTAMP;

DECLARE TEMP CURSOR FOR
SELECT EVENT_ID,EVENT_STAMP
FROM SESSION.EVENT_SCHEDULE
ORDER BY EVENT_STAMP
FOR UPDATE OF EVENT_ID;

DECLARE CONTINUE HANDLER FOR NOT FOUND
   SET @END_OF_DATA='Y';

-- Populate data in temporary table
INSERT INTO SESSION.EVENT_SCHEDULE
SELECT 0 AS EVENT_ID,
       SHOW_TIME,
       SHOW_DUR,
       SHOW_DESC,
       SHOW_INFO
FROM SHOW_SCHEDULE
WHERE CATEGORY='FAMILY'
ORDER BY SHOW_TIME;

-- Assign a sequential event id to each row,
-- resetting the sequence at the start of each new year
OPEN TEMP;
FETCH TEMP INTO @EVENT_ID,@EVENT_STAMP;
WHILE @END_OF_DATA='N' DO
   IF YEAR(@EVENT_STAMP)<>@PRV_YEAR THEN
       SET @PRV_YEAR=YEAR(@EVENT_STAMP);
       SET @ROW_ID=0;
   END IF;
   SET @ROW_ID=@ROW_ID+1;

   UPDATE SESSION.EVENT_SCHEDULE
      SET EVENT_ID=@ROW_ID
    WHERE CURRENT OF TEMP;

   FETCH TEMP INTO @EVENT_ID,@EVENT_STAMP;
END WHILE;
CLOSE TEMP;

 

The cursor in the example does nothing except read the temporary table and assign a sequential number to the EVENT_ID column, with the EVENT_ID being reset with each new calendar year encountered. However, since V5R4, there has been an easy way to accomplish this same task without a cursor by using the ROW_NUMBER function:

 

INSERT INTO SESSION.EVENT_SCHEDULE
SELECT ROW_NUMBER()
       OVER(PARTITION BY YEAR(SHOW_TIME)
            ORDER BY SHOW_TIME) AS EVENT_ID,
       SHOW_TIME,
       SHOW_DUR,
       SHOW_DESC,
       SHOW_INFO
FROM SHOW_SCHEDULE
WHERE CATEGORY='FAMILY'
ORDER BY SHOW_TIME;

 

Here, the ROW_NUMBER function does the same work that the cursor loop accomplished: It provides a sequential row number for the EVENT_ID column. The PARTITION BY causes the numbering sequence to be reset at the start of each new year. All of the work is done with the INSERT statement, and the looping logic and single row updates associated with the cursor are eliminated.

 

There are still some cases where a cursor is useful, such as calling a stored procedure for each row in a result set or bringing data into a COBOL program for legacy report generation. However, it’s still a good practice to review legacy cursor code to see if there are cursors that are eligible to be refactored as a set operation.
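
For instance, the "call a procedure for each row" pattern still needs a cursor. Here's a minimal sketch; the CUSTOMER table, its STATUS column, and the SEND_RENEWAL_NOTICE procedure are hypothetical names used only for illustration:

DECLARE @DONE CHAR(1) DEFAULT 'N';
DECLARE @CUST_ID INT;

DECLARE CUST_CUR CURSOR FOR
SELECT CustomerId
FROM CUSTOMER
WHERE STATUS='A';

DECLARE CONTINUE HANDLER FOR NOT FOUND
   SET @DONE='Y';

OPEN CUST_CUR;
FETCH CUST_CUR INTO @CUST_ID;
WHILE @DONE='N' DO
   -- Per-row work that can't easily be expressed as a set operation
   CALL SEND_RENEWAL_NOTICE(@CUST_ID);
   FETCH CUST_CUR INTO @CUST_ID;
END WHILE;
CLOSE CUST_CUR;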

 

 

Incidentally, in IBM i 7.3, there are several new OLAP functions and features that allow developers to do marvelous things, such as create running total columns, calculate averages or percentages on a subset of rows within the result set, and look back or peek ahead to other row values within the result set. All of these features greatly reduce the number of circumstances where cursor logic is required.
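
For example, here's a minimal sketch, reusing the hypothetical SHOW_SCHEDULE table from above, of a running total and a "look back" at the prior row, the kind of work that previously tended to require a cursor loop:

SELECT SHOW_TIME,
       SHOW_DUR,
       -- Running total of show duration within each year
       SUM(SHOW_DUR) OVER(PARTITION BY YEAR(SHOW_TIME)
                          ORDER BY SHOW_TIME
                          ROWS UNBOUNDED PRECEDING) AS RUNNING_DUR,
       -- Peek back at the prior show's start time within the year
       LAG(SHOW_TIME) OVER(PARTITION BY YEAR(SHOW_TIME)
                           ORDER BY SHOW_TIME) AS PRIOR_SHOW_TIME
FROM SHOW_SCHEDULE
WHERE CATEGORY='FAMILY'
ORDER BY SHOW_TIME;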

 

Keep Temporary Tables Small and Filter, Filter, Filter

One important principle of using temporary tables is to keep their content to a minimum, both in terms of row count (filtering) and row width. Row width refers to the average number of bytes required to store a single row, which depends on the number of columns and the data types chosen in the table definition.

 

What developers often forget is that the temporary table is supposed to give DB2 a break by giving it a small scratchpad to work with. However, if a developer builds a huge temporary table, this work area for DB2 is going to hog plenty of resources and offset any performance benefit that might be had by using it rather than directly accessing the application’s tables.

 

Let me illustrate one way that I often see this principle ignored. In many reporting scenarios, a stored procedure is written to do complex data gathering and calculations for a report. The generic code flow is as follows:

  1. Query application tables and insert required data into temporary table.

  2. Perform complex operations on temporary table and update it with results.

  3. Return results to caller (based primarily on the temporary table’s content) for report rendering.

 

Say you’re tasked with writing an ugly report that tries to calculate the profit on items sold. For purchased items, there’s logic that attempts to correlate a sales order line amount with an averaged purchase order cost (or perhaps a more complicated calculation such as a FIFO cost). For manufactured items, there’s logic that attempts to correlate the sales order line amount with the cost of the work order(s) that built the item(s). In the end, the cost is deducted from the sales amount to determine the profit.

 

The first step in this process is to gather the data into a temporary table similar to the statement shown below:

 

DECLARE GLOBAL TEMPORARY TABLE SalesData
AS (
SELECT sod.SalesOrderId,sod.SalesOrderDetailId,sod.CarrierTrackingNumber,
       sod.OrderQty,sod.ProductId,sod.UnitPrice,
       soh.PurchaseOrderNumber,soh.AccountNumber,soh.CustomerId,
       soh.SalesPersonId,soh.TerritoryId,
       st.Name AS StoreName,
       t.Name AS TerritoryName,t."Group" AS TerritoryGroup,
       p.Name AS ProductName,p.ProductNumber,p.ListPrice,p.Color,
       CAST(NULL AS INT) AS NumberWorkOrders,
       CAST(NULL AS INT) AS NumberScrapItems,
       CAST(NULL AS DEC(19,4)) AS ScrapCost,
       CAST(NULL AS INT) AS LaborHours,
       CAST(NULL AS DEC(19,4)) AS AvgLaborCost,
       CAST(NULL AS DEC(19,4)) AS AvgMaterialCost,
       CAST(NULL AS INT) AS FirstWorkOrderId,
       CAST(NULL AS INT) AS LastWorkOrderId,
       CAST(NULL AS TIMESTAMP) AS FirstWorkOrderStartDate,
       CAST(NULL AS TIMESTAMP) AS LastWorkOrderStartDate,
       CAST(NULL AS INT) AS NoDaysToBuild,
       CAST(NULL AS DEC(38,4)) AS AvgPOCost,
       CAST(NULL AS INT) AS FirstPurchaseOrderId,
       CAST(NULL AS INT) AS LastPurchaseOrderId,
       CAST(NULL AS DATE) AS FirstPurchaseOrderDate,
       CAST(NULL AS DATE) AS LastPurchaseOrderDate
FROM SalesOrderDetail sod
JOIN SalesOrderHeader soh ON soh.SalesOrderId=sod.SalesOrderId
JOIN SalesTerritory t ON t.TerritoryId=soh.TerritoryId
JOIN Customer c ON c.CustomerId=soh.CustomerId
JOIN Store st ON st.BusinessEntityId=c.StoreId
JOIN Product p ON p.ProductId=sod.ProductId
WHERE soh.OrderDate BETWEEN '2005-01-01' AND '2008-12-31'
)
WITH DATA
WITH REPLACE;

 

The columns initialized with NULLs (NumberWorkOrders through LastPurchaseOrderDate) represent storage for the calculations that need to be made and populated in the table. Next, consider the descriptive and identifying columns such as SalesOrderDetailId, CarrierTrackingNumber, StoreName, TerritoryName, ProductName, ListPrice, and Color. This is the mistake that I often see: Temporary tables are asked to store too much data at the first step in the process. These columns are either not used in the report at all (SalesOrderDetailId) or are prematurely collected at this step.

 

Why do I say “prematurely”? Because the hypothetical stored procedure will have to process this table many times while doing the purchase order cost or manufacturing cost calculations. These extra columns add to the row width and therefore require disk and memory to store and process. If there are tens or hundreds of thousands of rows in the temporary table, that equates to more disk I/Os, a larger memory requirement to process the data, and more CPU time to process all of the rows in the table.

 

To improve efficiency, do not store columns like names and descriptions in your temporary tables unless the tables will be relatively small or the data is required for immediate use. In the above scenario, I wouldn’t retrieve these descriptive columns until the final step of the procedure, when the result set is returned to the caller.
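
As a minimal sketch of that final step (assuming key columns such as ProductId, CustomerId, and TerritoryId were kept in the temporary table), the descriptive columns can be picked up with joins only when the result set is returned:

SELECT s.SalesOrderId, s.OrderQty, s.UnitPrice,
       s.AvgPOCost, s.AvgLaborCost, s.AvgMaterialCost,
       p.Name AS ProductName, p.ProductNumber,
       st.Name AS StoreName,
       t.Name AS TerritoryName
FROM SESSION.SalesData s
JOIN Product p ON p.ProductId=s.ProductId
JOIN SalesTerritory t ON t.TerritoryId=s.TerritoryId
JOIN Customer c ON c.CustomerId=s.CustomerId
JOIN Store st ON st.BusinessEntityId=c.StoreId;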

 

One other item to consider about row width is that appropriately sized data types can help as well. Consider these column definitions:

 

CAST(NULL AS TIMESTAMP) AS FirstWorkOrderStartDate,
CAST(NULL AS TIMESTAMP) AS LastWorkOrderStartDate,
CAST(NULL AS INT) AS NoDaysToBuild,
CAST(NULL AS DEC(38,4)) AS AvgPOCost,

 

The developer may have picked these data types because they matched what was in the base table. However, many potential improvements can be made, such as changing a TIMESTAMP to a DATE (assuming the time portion isn’t required for the final report) and changing the “average PO cost” and “number of days to build” to reasonable sizes:

 

CAST(NULL AS DATE) AS FirstWorkOrderStartDate,
CAST(NULL AS DATE) AS LastWorkOrderStartDate,
CAST(NULL AS SMALLINT) AS NoDaysToBuild,
CAST(NULL AS DEC(13,4)) AS AvgPOCost,

 

By my calculation, the illustrated change in data types will save somewhere in the neighborhood of 30 bytes/row. I’ll discuss the importance of choosing appropriate data types in more detail in a future tip. For now, just note that reducing the number of bytes per row can allow the database engine to cram more rows on a single disk page, which also translates to lower memory and I/O usage when processing the table. For a table with a large number of columns and rows, this data type optimization can yield large benefits.
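
To put a rough, hypothetical number on it: at 30 bytes saved per row, a 100,000-row temporary table carries about 3 MB less data (30 bytes × 100,000 rows ≈ 2.9 MB) that DB2 would otherwise have to write, read, and buffer every time the table is processed.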

 

At election time, the joke “vote early, vote often” is often repeated. I have a similar phrase for SQL coding: “filter early, filter often,” which means filter as much data as possible as early as possible within the code.

 

Another suboptimal programming technique I see is removing data from a temporary table just after DB2 has taken the time to collect and insert it! After an INSERT is done into a temporary table, I’ll sometimes see statements like this immediately following:

 

-- Remove misc items from temporary table
DELETE FROM SESSION.SalesData s
WHERE EXISTS (
   SELECT *
   FROM Product p
   WHERE p.ProductId=s.ProductId
     AND p.ProductType='M' -- Misc Item
);

 

This is a somewhat trivial example, but it does occur in production programs. The INSERT statement itself should filter out these rows! Besides making DB2 insert and then delete data, it can make processing less efficient if enough rows are deleted.
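
Here's a minimal sketch of folding that filter into the statement that populates the temporary table (the ProductType column is borrowed from the DELETE above, and the column list is abbreviated for brevity):

INSERT INTO SESSION.SalesData
   (SalesOrderId, ProductId, OrderQty, UnitPrice)
SELECT sod.SalesOrderId, sod.ProductId, sod.OrderQty, sod.UnitPrice
FROM SalesOrderDetail sod
JOIN SalesOrderHeader soh ON soh.SalesOrderId=sod.SalesOrderId
JOIN Product p ON p.ProductId=sod.ProductId
WHERE soh.OrderDate BETWEEN '2005-01-01' AND '2008-12-31'
  AND p.ProductType<>'M'  -- Exclude misc items here instead of deleting later
;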

 

Recall that the database engine stores row data in “pages” on disk. If these disk pages are full, DB2 is going to be more efficient because each page read will give DB2 a maximal amount of row data to process. If, for the sake of argument, 20 percent of the rows are removed from the temporary table with a DELETE statement, when DB2 reads the table, for each page of data read from disk, DB2 is going to get only 80 percent of the rows that it could have retrieved had the disk page been completely full with row data. Again, this scenario isn’t too much of a concern if the data in the table is relatively small.

 

I would only consider the use of a separate DELETE statement if:

  • the filtering criteria were so convoluted that embedding them would make the INSERT statement very difficult to understand, or

  • embedding the criteria would hinder DB2 from creating an efficient access plan for the INSERT statement.

 

Allow DB2 to Do Easy Lifting

If your boss asks you to move a load of bricks from his car to the office basement (while still doing your normal work), would you hope for a big pile or a small pile? Obviously, if it’s a large pile, it’s going to take a while and potentially delay completion of your other essential tasks. Likewise with DB2: the fewer pages (aka bricks) of data it has to process, the faster it can get the job done and serve other requests.

 

Avoiding cursor usage and optimizing temporary table usage are two good ways of giving DB2 for i “less to lift” in order to get its job done.

 

References

DB2 for i Optimization Strategies, Part 1

DB2 for i Optimization Strategies, Part 2

DB2 for i Optimization Strategies, Part 3

DB2 for i Optimization Strategies, Part 4

 

Michael Sansoterra is a DBA for Broadway Systems in Grand Rapids, Michigan.

