
ODBC Performance Basics


Do you still use any of the very first programs you ever wrote on a midrange system? Probably not. Speaking for myself, I didn't have a clue what I was doing back then! When I first started programming on a System/38, I thought the only way to display multiple rows of data was to use an array and tediously copy the contents of the array to hardcoded lines on the display file.

Then one day (last week...just kidding), I read about this thing called a subfile. My eyes opened, violins played, and the world looked like a beautiful place. (God, I need to get a life.) I realized I didn't need to do all this tedious array work to do what was simple on other systems. Before I knew about subfiles, it would have been easy for me to give up on the System/38 and say it wasn't a capable system because it couldn't do what I had easily done on my trusty HP3000.

Over time, I also found out that I didn't need to do a sort every time I wanted data in a different order: I could create a logical file. I came to understand the PUTOVR and OVRDTA DDS keywords and how they affect communications performance. I slowly built a repertoire of midrange programming skills that allowed me to produce efficient programs.

Well, I have some bad news for you. You may have to repeat much of that process with client/server programming. There's a whole new skill set that you need to start building up. It's good news for me, though; I enjoy a challenge, and I deplore monotony.

Now I keep hearing people complain about the performance of Open Database Connectivity (ODBC). "Holy cow, it took 30 seconds to get 10 records back. That's not practical!" When I hear things like that, I flash back to the time when I thought the only way to display multiple rows of data was to use an array. Just as with RPG programming, you're going to need a new set of knowledge to create client/server programs that use ODBC.

Some techniques, methods, and tricks can make ODBC performance quite reasonable. Response time with IBM's Client Access ODBC driver is also significantly better with the new V3R1 version. What I'm going to do in this article is lay some groundwork. Think of this as ODBC 101.

I'm not going to go into anything terribly advanced. I'm going to talk about the three basic techniques for accessing DB2/400 data using Visual Basic and ODBC and how each method performs. The three techniques are using a data control, using the methods provided in the Professional edition, and using the ODBC APIs. Each technique has varying performance, and, within each technique, the steps you take affect the performance of your applications.

Much of what I will discuss applies to any tool that allows you to use ODBC, but my examples will be in Visual Basic, specifically. For a primer on the terms I'm going to use in this article, see "ODBC Overview" (MC, February 1995).

A Quick Review

ODBC is a standard currently at Version 2.0, with Version 3.0 due in the first quarter of 1996. The standard describes a set of application program interfaces (APIs) that are supported through an ODBC Driver Manager (supplied by Microsoft) and ODBC drivers supplied by different vendors. IBM includes an ODBC driver with V3R1 Client Access and has one available for V2R3, which you can obtain by ordering a set of PTFs.

The ODBC standard describes a set of APIs or function calls that a driver needs to support, and it also describes how you as the programmer should use those functions. The calls have to happen in a specific order for you to get data back from the database server. This is one way you can use ODBC: by writing programs that call the ODBC functions supported by an ODBC driver. This method yields the best performance, but it is also the most difficult of the three techniques I'll discuss here. I'll describe the ODBC API method last and start with the simplest of the techniques.

Using a Data Control

The first method of accessing DB2/400 using ODBC that I'm going to describe is using a data control. In "Custom Controls: The Muscle Behind Visual Basic" (MC, December 1994), I gave an overview of controls. Controls add functionality that isn't available natively in the language. Custom controls also provide an easier means to use certain features that are available in the language. A data control is an example of a custom control that's included with Visual Basic, and it provides a way for your programs to access a database with little or no code.

Most people use the data control at first because it is the simplest method. The data control provides access to a number of different relational database management systems (RDBMSs), including any RDBMS with an ODBC driver.

The data control itself is a visual control (see Figure 1). Two sets of arrows point in opposite horizontal directions. The user can use these arrows to navigate through the set of records you display.

When you use a data control, you have to set some properties that determine what database the control connects to and which file (or SQL statement) the control will work against. You can set the properties at design time or at run time. The properties window for a data control is displayed in Figure 2.

In the case of connecting to DB2/400, you need to concern yourself with two properties: Connect and RecordSource. The Connect property is where you put the various parameters you need to use to connect to an ODBC driver. In Figure 2, you can see that I've put the minimum necessary to connect to an ODBC driver.

By not specifying an ODBC data source name (DSN) in this case, the ODBC driver manager will prompt the user to select a data source (see Figure 3). The user can select any of the ODBC data sources and the driver manager will prompt the user for any necessary information (for example, a user profile and password).

In the Connect property, I can also specify a data source name, user profile, and password by using a different connect statement. For example, I might use the following Connect statement:

 ODBC;DSN=MCPGMR;uid=HOOPES;pwd=mypwd; 

I've selected a specific data source (MCPGMR) and passed the driver a user ID and password that I want it to use. That way, the user will not be prompted to pick a data source, and I'm sure that the user will have the necessary authority. I could have created a prompt screen for the user to enter a user profile and password as well.
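
As a sketch, the same properties can also be set at run time instead of in the properties window. The data control name here (datCustomers) is hypothetical:

 datCustomers.Connect = "ODBC;DSN=MCPGMR;uid=HOOPES;pwd=mypwd;"
 datCustomers.RecordSource = "Select * from QCUSTCDT"
 datCustomers.Refresh  ' opens the connection and builds the recordset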

Once the Connect property is set, we're ready to point to a specific recordset. There are two ways to select records from an ODBC driver. The first is to use the drop-down list for the RecordSource property and select a specific file. The second is to code an SQL statement as the value for the RecordSource property.

When you use the drop-down list, you see a list of files in the libraries that you configured in your ODBC driver. Figure 4 shows the ODBC configuration window for the V3R1 driver. At the bottom of the window is an area labeled Default Libraries. That's where you put the libraries you want access to by default. Figure 5 shows the list of files based on how I have the driver configured in the ODBC configuration window.

Figure 2 shows an example of coding an SQL statement into the RecordSource property. Once you've either selected a file or used an SQL statement, you are ready to use the fields from that recordset in your Visual Basic program.

Once you've set the properties for the data control, you are ready to bind other controls to the data control. Binding means pointing a control (e.g., a label) to a specific data control and to a field that data control is making available to your program. There are two properties for controls such as labels and text boxes called DataSource and DataField. You change the DataSource property to the name of the data control and the DataField property to the name of the field that has the data that you want to display.
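
In effect, binding comes down to two property settings, normally made in the properties window at design time. Using a hypothetical text box named txtLastName and a data control named datCustomers, the settings would be:

 DataSource = datCustomers   ' the data control supplying the records
 DataField  = LSTNAM         ' the field whose data the text box displays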

Data Control Performance Issues

When you are using a data control, you are by default using the Jet Engine. The Jet Engine is a shared technology between Microsoft Access and Visual Basic which provides common access methods to all supported databases. Instead of having to know the details of how a given RDBMS implements things such as files and libraries, the Jet Engine hides them from you.

You can create two types of recordsets using a data control. One is called a dynaset, and the other is called a snapshot. A dynaset is an updateable set of records, and a snapshot is a read-only version of that set of records. The performance implications are fairly clear: read-only mode is the faster method.

When using a data control, you are by default creating a dynaset. To create a snapshot, you have to use methods included with the Professional edition which I will discuss shortly. You can also create a read-only dynaset by changing the ReadOnly property to True. In Figure 2, the ReadOnly property is just above the RecordSource property.
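
As a sketch, a read-only recordset can be requested either way (the data control name here is hypothetical):

 datCustomers.ReadOnly = True   ' read-only dynaset through a data control
 ' Or, with the Professional methods, a read-only snapshot:
 Set ssCustomers = dbMCPGMR.CreateSnapshot("Select * from QCUSTCDT")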

Using the Jet Engine adds some overhead to your programs. The layers are now the RDBMS, ODBC, the Jet Engine, and your Visual Basic program. A way to bypass the Jet Engine will be discussed in the section called "Performance Tips."

The Jet Engine is specific to Microsoft Access and Visual Basic, but the concept is not. For example, if you are using Delphi, it has a similar structure called the Borland Database Engine (BDE). The BDE provides many of the same features as the Jet Engine to help isolate your programs from the specifics of the RDBMS.

Using the Professional Data Methods

The Professional edition of Visual Basic adds a number of features that the standard edition doesn't have. Among the added features are a number of methods that allow you to programmatically do what you can do with a data control. A method is a Visual Basic term that identifies a keyword (e.g., CreateDynaset) that performs an action on an object (in this case, a database). These methods let you create dynasets and snapshots, navigate through the data, and update records, among other actions.

Figure 6 contains an example of code used to connect to an AS/400, read through the records in a database, and load some of the data into a list box. The OpenDatabase method creates a connection to the AS/400. The parameters are set similar to the way the Connect parameters were set when I was using a data control.

The program then creates a Dynaset. It also uses MoveFirst and MoveNext to read through the records in the recordset. The program adds the customer names to a list box that the user will see. The results of the user clicking the Show Data button are shown in Figure 7.

This technique works well, but we can take some steps to improve performance, such as moving time-consuming tasks to points in the application where the user is less likely to notice them. When I looked at the performance of the application, I noticed that the statement that took the longest was the OpenDatabase method. It took up to 30 seconds to run that command on our D20.

I also knew, however, that opening the database didn't need to happen when the user clicked the Show Data button. Often, we just need to use the OpenDatabase method once in a program. By moving the OpenDatabase method to the start of the application, the user will gain better response time.

Many applications have what is called a splash screen. It's usually a very simple screen that just tells the user to wait because the application is starting. I created a splash screen for my application and hid the code that opens the database behind the splash screen.

That way, when the user clicks the Show Data button, the only steps to run are the CreateDynaset and the MoveFirst/MoveNext group. On our system, those steps take as little as three seconds. That's an acceptable response time for a D20.

I created both a separate module in my project and a procedure called Main. That way I could tell Visual Basic to start the application by calling the subroutine Main in the Project menu item. The code I put in the Main subroutine is in Figure 8. It shows my splash screen (called Startup), turns the mouse pointer into an hourglass, opens the database, hides Startup, and shows the screen that lets the user click the Show Data button.

Figure 9 shows how this change affected the code associated with the Click event on the Show Data button. If you compare it to the code in Figure 6, you can see that the only difference is the lack of an OpenDatabase command. With this new version of the application, users see the data in just a few seconds when they click on the Show Data button.

Performance Benefits

If you use the methods supplied with the Professional edition, you have much greater control over how the application executes. If you use data controls, the Jet Engine handles all of that with very little code. Now I want to throw one more monkey wrench into the works.

You can combine the first two techniques that I've talked about. The Professional methods I've described work in conjunction with a data control. When you use a data control, you are also creating a recordset. That recordset is one that the professional methods work against. The problem is that you still have less control over how the application executes. The benefits are that you get the ease of connecting to a database using a data control and the ease of use of a visual control so the user can navigate through the records.

By now you may have noticed that these techniques form three tiers of performance. The data controls are the easiest to use, but often the slowest. The Professional methods give you greater control over how the application runs, which can help you design faster applications. The final option I'm going to talk about is the fastest way to access data using ODBC: calling the API functions themselves.

Using the ODBC APIs

Using the ODBC APIs is the most difficult technique, but it often results in the best performance. If you are going to use the ODBC APIs, get the ODBC Software Development Kit (SDK) from Microsoft. You can develop applications using the APIs without the SDK, but it will make your job much harder than it needs to be.

The SDK provides you with a number of tools that help during development. One of the tools, ODBC Test (or Gator), allows you to run each of the API functions to see how it works. The Version 2.0 SDK also includes a sample Visual Basic program that provides you with prototype declarations for many of the API functions.

If you've used the Client Access APIs, you know that one of the most challenging tasks is figuring out how to convert the C data types to the equivalents in Visual Basic. "A DWORD C data type is a Visual Basic long data type unless it is a pointer, in which case the Visual Basic data type is an integer unless you're using...."

The sample program has the function declaration that you need for many of the APIs, so you don't have to convert from the C data types to their Visual Basic equivalents. The SDK also includes the Microsoft ODBC 2.0 Programmer's Reference and SDK Guide. This manual describes what each function call does, what order they need to happen in, and what the possible return values mean. It is a necessity when using the APIs as far as I'm concerned.

The program I'm going to show you is essentially the same as the one I created using the Professional methods, except this time I'll be using the APIs. I don't have any error trapping in these programs. The point of this article is performance, not handling errors in ODBC. That doesn't mean error handling isn't necessary, just that I didn't do it; I didn't want it to distract you.

Figure 10 contains the code in the Form Load subroutine. The program puts some values into variables and then makes three API calls: SQLAllocEnv, SQLAllocConnect, and SQLConnect. These three function calls are similar to the OpenDatabase method from the previous example.

The first two APIs used in this section of the program allocate memory so the program can connect to the AS/400. SQLConnect makes the connection. I could have used the SQLDriverConnect instead. Depending on how you set the parameters on the SQLDriverConnect function, that might let the user pick from a list of configured ODBC drivers. In this case, I'm specifically selecting a DSN of MCPGMR.

Again, a splash screen is displayed while the three APIs are running. This hides some of the processing time and will make the response time for getting records quite reasonable. The code used to get the data from the AS/400 is in Figure 11.

The first API call allocates memory for the SQL statement that I'm going to run. The next two API calls run the SQL statement on the AS/400. If you are going to run a statement multiple times, use SQLPrepare and then SQLExecute. That way, to run the SQL statement, you'll just need to run SQLExecute again. If you are going to run a statement just once, you'll get better performance by running the SQLExecDirect function, which combines the functionality of both SQLPrepare and SQLExecute.
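
A minimal sketch of the two approaches, using the variable names from Figure 11 and omitting return-code checking:

 ' Statement run many times: prepare once, then execute repeatedly
 nRtnCode = SQLPrepare(hStmt, sSQLString, nSQLStringSize)
 nRtnCode = SQLExecute(hStmt)   ' repeat this call for each later run

 ' Statement run only once: prepare and execute in a single call
 nRtnCode = SQLExecDirect(hStmt, sSQLString, nSQLStringSize)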

The next API function I use returns the number of result fields the SQL statement returned. The program will look through the description of all the fields returned for a field named LSTNAM. It loops through the number of fields returned and uses the SQLDescribeCol API to get the field name. Once the program finds the field called LSTNAM, it uses the counter as an index.

The program loops until no more records are found. It uses the SQLFetch and SQLGetData APIs to get a record and put the data for that record into a field in the program. The program then adds the name to the list box using the AddItem method until the program hits the end of the file.

That's a short description of one way to use the APIs. You should get the manual and the SDK if you are going to use this method. Simply using the APIs gave me quite a speed improvement and the ability to tune the performance of the application.

Performance Tips

If you have the Professional edition of Visual Basic, you can use a special value called DB_SQLPASSTHROUGH. There is a file called DATACONS.TXT included with Visual Basic which has a number of the values like DB_SQLPASSTHROUGH defined. The actual numeric value of DB_SQLPASSTHROUGH is 64, but Microsoft may change that at some time. The best coding practice is not to use the value 64, but declare a constant called DB_SQLPASSTHROUGH to use instead.
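
Assuming the value documented in DATACONS.TXT, the declaration is one line; if Microsoft ever changes the value, only the constant definition needs to change:

 Global Const DB_SQLPASSTHROUGH = 64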

You can use SQL Passthrough either on the data control or when you use the CreateDynaset or CreateSnapshot methods. It passes your SQL statement, uninterpreted, to the database server, in this case DB2/400. An example of that is in the OpenDatabase statement illustrated in Figure 12. For a data control, you need to alter the Options parameter.

By reducing the number of layers involved in the processing of the SQL statement, you can get better performance. The problem is that you lose some support when those layers are involved in the process. For example, some methods (e.g., FindNext) aren't available when you use SQL Passthrough.

Throughout the examples, I've intentionally and consistently done one thing wrong. The rule I broke is this: you should ask only for the data you need. Don't use a SELECT * statement unless you need all the fields. In each case, I should have coded the SQL statement the following way:

 SELECT LSTNAM FROM QCUSTCDT 

You'd be surprised how much of a difference that can make in performance. If the file has large text fields, it can cut a very significant amount of response time.
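
For example, the CreateDynaset call from Figure 9 would become:

 Set dsCustomers = dbMCPGMR.CreateDynaset("Select LSTNAM from QCUSTCDT")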

Index performance will be discussed in a future article, but for now keep in mind the indexes you have available and how they affect performance. For example, if you are using an ORDER BY clause, that may result in the SQL statement having to build an access path. If that is a statement the user will run often or that needs especially good performance, building a logical file in that order will help.

Use some of the experience you've gained writing RPG applications to guide you when writing client/server applications that use ODBC. For example, would you write an RPG program that loads 10,000 records into a subfile and expect good response time? Probably not.

You might write that kind of subfile program to load 250 records at a time as the user browses through the data. Do the same kinds of things when writing a client/server application. Using the Professional methods or the APIs gives you that level of control over how your programs access the database.
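
As an illustrative sketch (the 250-row cutoff and the More button are hypothetical, not from the article's figures), the loop from Figure 9 could stop after a block of rows and resume on demand:

 nCount = 0
 While (Not dsCustomers.EOF) And (nCount < 250)
     lbCustomers.AddItem dsCustomers("LSTNAM")
     dsCustomers.MoveNext
     nCount = nCount + 1
 Wend
 ' A More button's Click event runs this loop again,
 ' continuing from where the last MoveNext left off.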

Let ODBC Open the Door

I've shown you three different ways to access DB2/400 using ODBC: the data control, the Professional methods, and the ODBC APIs. Each technique has different performance levels, and you need to decide which to use based on the response time you need and the resources you have available to create the applications. As you will see in your own experiments, using the ODBC APIs will often give you the best response time, but it takes the longest to code.

Look for some upcoming MC articles to deal with such ODBC topics as optimizing SQL performance in an ODBC environment and the specifics of performance with the V3R1 Client Access ODBC driver.

Don't let rumor and myth guide you in the decision on whether to use ODBC in your client/server applications. You need to experiment with it and gain the same level of experience that you have in writing RPG programs. ODBC is a new technology that opens up the door to a whole world of databases.

Jim Hoopes is a senior technical editor for Midrange Computing.

Reference

Microsoft Corporation. Microsoft ODBC 2.0 Programmer's Reference and SDK Guide. Redmond, Wash.: Microsoft Press, 1992.


Figure 1: A Data Control



Figure 2: Setting the Properties for a Data Control



Figure 3: Prompting to Connect to a Data Source



Figure 4: The Client Access ODBC Configuration Window



Figure 5: The List of Files from a Data Control




Figure 6: Code Used to Connect to DB2/400

 Sub cmdShow_Click ()
     Dim dsCustomers As Dynaset
     Me.MousePointer = HOURGLASS
     Set dbMCPGMR = OpenDatabase("", False, False, "ODBC;DSN=MCPGMR")
     DoEvents
     Set dsCustomers = dbMCPGMR.CreateDynaset("Select * from QCUSTCDT")
     DoEvents
     dsCustomers.MoveFirst
     DoEvents
     While (Not dsCustomers.EOF)
         lbCustomers.AddItem dsCustomers("LSTNAM")
         dsCustomers.MoveNext
         DoEvents
     Wend
     Me.MousePointer = DEFAULT
 End Sub

Figure 7: Showing Data from Figure 6 Code




Figure 8: Code to Open the Database

 Option Explicit
 Global Const HOURGLASS = 11
 Global Const DEFAULT = 0
 Global dbMCPGMR As database

 Sub main ()
     Startup.Show
     DoEvents
     Screen.MousePointer = HOURGLASS
     Set dbMCPGMR = OpenDatabase("", False, False, "ODBC;DSN=MCPGMR")
     Startup.Hide
     Screen.MousePointer = DEFAULT
     Methods.Show
 End Sub

Figure 9: Code to Get the Records

 Sub cmdShow_Click ()
     Dim dsCustomers As Dynaset
     Me.MousePointer = HOURGLASS
     Set dsCustomers = dbMCPGMR.CreateDynaset("Select * from QCUSTCDT")
     DoEvents
     dsCustomers.MoveFirst
     DoEvents
     While Not dsCustomers.EOF
         lbCustomers.AddItem dsCustomers("LSTNAM")
         dsCustomers.MoveNext
         DoEvents
     Wend
     Me.MousePointer = DEFAULT
 End Sub

Figure 10: Using the ODBC APIs to Connect to DB2/400

 Sub Form_Load ()
     sDSN = "MCPGMR"
     nDSNSize = Len(sDSN)
     sUserID = ""
     nUserIDSize = Len(sUserID)
     sPassword = ""
     nPasswordSize = Len(sPassword)
     sSQLString = "Select * from QCUSTCDT"
     nSQLStringSize = Len(sSQLString)
     nNameLen = 10
     nMaxLen = 300&
     nDataType = SQL_C_CHAR
     Splash.Show
     Screen.MousePointer = HOURGLASS
     nRtnCode = SQLAllocEnv(hEnv)
     nRtnCode = SQLAllocConnect(hEnv, pConnH)
     nRtnCode = SQLConnect(pConnH, sDSN, nDSNSize, sUserID, nUserIDSize, sPassword, nPasswordSize)
     Splash.Hide
     Screen.MousePointer = DEFAULT
 End Sub

Figure 11: Using the ODBC APIs to Read Records

 Sub cmdShowData_Click ()
     nRtnCode = SQLAllocStmt(pConnH, hStmt)
     nRtnCode = SQLPrepare(hStmt, sSQLString, nSQLStringSize)
     nRtnCode = SQLExecute(hStmt)
     nRtnCode = SQLNumResultCols(hStmt, nNbrCols)
     If nNbrCols > 0 Then
         nCounter = 1
         nFound = False
         Do While nCounter <= nNbrCols
             nRtnCode = SQLDescribeCol(hStmt, nCounter, sColName, nNameLen, nLenRtn, nSQLType, nColDef, nScale, nNullable)
             If Left$(sColName, nLenRtn) = "LSTNAM" Then
                 nFound = True
                 Exit Do
             End If
             nCounter = nCounter + 1
         Loop
         If nFound Then
             nRtnCode = SQLFetch(hStmt)
             Do While Not nRtnCode = SQL_NO_DATA_FOUND
                 nRtnCode = SQLGetData(hStmt, nCounter, SQL_C_CHAR, ByVal sLastName, nMaxLen, nRtnBytes)
                 lbCustomers.AddItem Left$(sLastName, nRtnBytes - 1)
                 nRtnCode = SQLFetch(hStmt)
             Loop
         End If
     End If
     nRtnCode = SQLFreeStmt(hStmt, SQL_DROP)
     nRtnCode = SQLDisconnect(pConnH)
     nRtnCode = SQLFreeConnect(pConnH)
     nRtnCode = SQLFreeEnv(hEnv)
 End Sub

Figure 12: An Example Using SQL Passthrough

 Sub cmdShow_Click ()
     Dim dsCustomers As Dynaset, nCount As Integer
     Me.MousePointer = HOURGLASS
     Set dsCustomers = dbMCPGMR.CreateDynaset("Select * from QCUSTCDT", DB_SQLPASSTHROUGH)
     DoEvents
     dsCustomers.MoveFirst
     DoEvents
     nCount = 2
     While (Not dsCustomers.EOF)
         lbCustomers.AddItem dsCustomers("LSTNAM")
         dsCustomers.MoveNext
         DoEvents
         nCount = nCount + 1
     Wend
     Me.MousePointer = DEFAULT
 End Sub

  • FORTRAThe thought of printing, distributing, and storing iSeries reports manually may reduce you to tears. Paper and labor costs associated with report generation can spiral out of control. Mountains of paper threaten to swamp your files. Robot automates report bursting, distribution, bundling, and archiving, and offers secure, selective online report viewing. Manage your reports with the Robot Report Management Solution. Key features include:

  • FORTRAFor over 30 years, Robot has been a leader in systems management for IBM i. With batch job creation and scheduling at its core, the Robot Job Scheduling Solution reduces the opportunity for human error and helps you maintain service levels, automating even the biggest, most complex runbooks. Manage your job schedule with the Robot Job Scheduling Solution. Key features include:

  • LANSA Business users want new applications now. Market and regulatory pressures require faster application updates and delivery into production. Your IBM i developers may be approaching retirement, and you see no sure way to fill their positions with experienced developers. In addition, you may be caught between maintaining your existing applications and the uncertainty of moving to something new.

  • LANSAWhen it comes to creating your business applications, there are hundreds of coding platforms and programming languages to choose from. These options range from very complex traditional programming languages to Low-Code platforms where sometimes no traditional coding experience is needed. Download our whitepaper, The Power of Writing Code in a Low-Code Solution, and:

  • LANSASupply Chain is becoming increasingly complex and unpredictable. From raw materials for manufacturing to food supply chains, the journey from source to production to delivery to consumers is marred with inefficiencies, manual processes, shortages, recalls, counterfeits, and scandals. In this webinar, we discuss how:

  • The MC Resource Centers bring you the widest selection of white papers, trial software, and on-demand webcasts for you to choose from. >> Review the list of White Papers, Trial Software or On-Demand Webcast at the MC Press Resource Center. >> Add the items to yru Cart and complet he checkout process and submit

  • Profound Logic Have you been wondering about Node.js? Our free Node.js Webinar Series takes you from total beginner to creating a fully-functional IBM i Node.js business application.

  • SB Profound WC 5536Join us for this hour-long webcast that will explore:

  • Fortra IT managers hoping to find new IBM i talent are discovering that the pool of experienced RPG programmers and operators or administrators with intimate knowledge of the operating system and the applications that run on it is small. This begs the question: How will you manage the platform that supports such a big part of your business? This guide offers strategies and software suggestions to help you plan IT staffing and resources and smooth the transition after your AS/400 talent retires. Read on to learn: