
V6R1 Breathes New Life into DB2 for i5/OS .NET Provider


Learn how to exploit ADO.NET 2.0 enhancements, DAAB, distributed transactions, and more.

 

Over the last couple of years, I wrote several articles and white papers that dealt with Microsoft .NET Framework integration with DB2 for i5/OS. Judging from the readers' interest and feedback, this topic is very popular, especially among those System i users who rely on Microsoft tools for in-house software development. In the past, some of the developers expressed their concern that the DB2 for i5/OS .NET provider did not keep pace with Microsoft's quickly changing .NET specification. Well, I have good news to report: The so-called native DB2 for i5/OS .NET provider that ships with System i Access for Windows V6R1 (formerly known as Client Access) sports an impressive list of enhancements and improvements. In this article, I highlight the most important features of the new version of the provider and illustrate them with practical code examples.

What's New in DB2 for i5/OS .NET Provider?

The DB2 for i5/OS .NET provider relies on services provided by the .NET Framework; therefore, it is sometimes referred to as System i Managed Provider. The namespace for the provider is IBM.Data.DB2.iSeries, and it includes a full set of ADO.NET classes that enable your .NET application to access and manipulate DB2 for i5/OS objects.
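
Before digging into the new features, here is a minimal sketch of what direct use of the provider's concrete classes looks like. The host name, credentials, and the choice of querying the DB2 sample table EMPLOYEE are illustrative assumptions, not code from the sample application.

// A minimal sketch of direct provider usage; placeholder host name, user profile,
// password, and schema.
using System;
using IBM.Data.DB2.iSeries;

class ProviderSmokeTest
{
    static void Main()
    {
        string connectionString =
            "DataSource=mySystemi;UserID=MYUSER;Password=secret;DefaultCollection=DB2USER;";

        using (iDB2Connection con = new iDB2Connection(connectionString))
        using (iDB2Command cmd = new iDB2Command("SELECT EMPNO, LASTNAME FROM EMPLOYEE", con))
        {
            con.Open();
            using (iDB2DataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0} {1}", reader.GetString(0), reader.GetString(1));
                }
            }
        }
    }
}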

 

As mentioned, the new version of the native provider that is included in System i Access for Windows V6R1 contains an impressive list of enhancements:

 

  • Support for ADO.NET 2.0 abstract classes, which allows database-agnostic programming and basic integration with Visual Studio 2005
  • Support for 64-bit applications
  • Support for distributed transactions through the System.Transactions object model
  • Support for blocked (batch) inserts

 

In addition, the provider adds support for a number of DB2 for i5/OS-specific features, including these:

 

  • The decimal floating-point (DECFLOAT) and datalink data types
  • The Query Storage Limit property
  • The new special register values
  • The LobBlockSize and BlockSize properties, which enable better control over the transfer of data
  • The AutocommitIsolationLevel property, which controls the isolation level used during autocommit

The Sample Application Walkthrough

To illustrate the most important new features of the provider, I coded up a sample application called iDB2ProviderTester. The application allows a user to execute several tasks and then displays the results in a GUI. The main application dialog is shown in Figure 1:

 

031208JarekFigure1.gif

Figure 1: Here's the main dialog of the sample application. (Click images to enlarge.)

 

The iDB2ProviderTester application is written in C#, and I used Visual Studio 2005 with SP1 to facilitate the development process. The Additional Material section at the end of this article contains a link to a downloadable copy of the entire solution. I strongly encourage you to download the source code and study it further because it illustrates several programming techniques that I will not be able to cover in this article.

DB2 for i5/OS Connection Setup

The provider supports the ADO.NET abstract classes as defined in the ADO.NET 2.0 specification. One of the benefits of the abstract classes is the ability to snap the provider directly into Visual Studio's Server Explorer. This gives you the ability to navigate DB2 objects such as tables, views, and stored procedures directly from the IDE and explore their properties.

 

The following steps outline the process of creating a DB2 for i5/OS connection:

 

1. In the Visual Studio 2005 IDE, ensure that the Server Explorer panel is open and anchored. In the Server Explorer panel, right-click the Data Connections icon and select Add Connection. The Add Connection dialog appears. Click the Change button, which is located next to the Data Source text box. In the Change Data Source dialog that appears, select DB2 for i5/OS as the data source and IBM DB2 for i5/OS .NET Provider as the data provider. Click OK.

2. Specify the ADO.NET connection settings for the DB2 for i5/OS database (see Figure 2).

 

031208JarekFigure2.gif

       Figure 2: Add the DB2 for i5/OS connection.

 

Note: IBM DB2 for i5/OS Provider is selected in the Data Source text box. In the DataSource text box, the TCP/IP host name of the System i machine is specified.

 

3. The DB2 for i5/OS server can contain a large number of schemas (libraries). To limit the amount of metadata retrieved from the server, specify a default schema using the DefaultCollection property. You can find it by scrolling down to the SchemaKeywords section. In my case, I use a schema named DB2USER, which contains the DB2 sample database.

4. Specify the UserID and Password parameters. These can be found under the Security section.

5. Click the Test Connection button (see Figure 2) to ensure that the connection settings are correct. Then click the OK button to create the connection. The metadata is retrieved from the DB2 server, and the newly added connection node appears in the Server Explorer panel.

Accessing DB2 Using Data Access Application Block (DAAB)

The first task performed by the sample application is to retrieve a set of rows from DB2 using Microsoft's Data Access Application Block (DAAB), which is part of the Enterprise Library. The DAAB was created by the Microsoft Patterns and Practices group, and it benefits rapid application development by hiding redundant tasks and providing a layer of abstraction from the database. Over the last several years, DAAB has become quite popular among .NET developers. The DAAB relies on the ADO.NET abstract classes, so I thought that this application block should work just fine with the new provider. Before using DAAB, you need to download the Enterprise Library from MSDN and install it on your workstation. Then, in the Visual Studio IDE, you need to add the necessary references to your solution. In my case, I added the following references to the C# solution (the corresponding using directives are sketched after the list):

 

  • Microsoft.Practices.EnterpriseLibrary.AppSettings.Configuration.Design
  • Microsoft.Practices.EnterpriseLibrary.Common
  • Microsoft.Practices.EnterpriseLibrary.Data
  • Microsoft.Practices.EnterpriseLibrary.Data.Configuration.Design
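
With these references added, the code fragments shown later assume a handful of using directives. Here is a minimal sketch of the likely set; the exact list in the downloadable solution may differ.

// Namespaces assumed by the snippets in this article; adjust to match your project.
using System.Data;                                 // DataSet, DbType, CommandType, ParameterDirection
using System.Data.Common;                          // DbCommand, DbConnection, DbProviderFactory
using IBM.Data.DB2.iSeries;                        // iDB2Command, iDB2CommandBuilder, iDB2Connection
using Microsoft.Practices.EnterpriseLibrary.Data;  // Database, DatabaseFactory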

One of the main advantages of using application frameworks such as DAAB is the ability to decouple the application from any database- or server-specific settings. You can achieve this goal by creating an application configuration file that contains deployment-specific configuration data. To do so, in the Visual Studio IDE, right-click the solution node in the Solution Explorer panel (in my case, it is iDB2ProviderTester) and select Add > New Item from the context menu. In the Add New Item dialog, select Application Configuration File and click OK. A new file called App.config is added to the solution. Now, since you have the Enterprise Library installed on your workstation, you can edit this file using the Enterprise Library configuration editor. Right-click the newly created App.config file and select Edit Enterprise Library Configuration. The editor opens in the IDE's main panel, as shown in Figure 3.

 

031208JarekFigure3.gif

Figure 3: Edit the application configuration.

 

In the editor, right-click the Connection Strings node and select Add > Connection String. A new Connection String object is shown in the Properties panel. Edit it to look similar to the one shown in Figure 4.

 

031208JarekFigure4.gif

Figure 4: Add a connection string.

 

Note:  The ConnectionString can contain any of the provider-supported keywords, such as DefaultCollection, Naming, LobBlockSize, and so on. These are DB2 for i5/OS-specific attributes.
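
To make that note concrete, here is a sketch of a connection string that combines standard and provider-specific keywords; the host name, credentials, and keyword values are placeholders rather than settings taken from the article.

// Hypothetical connection string mixing standard and DB2 for i5/OS-specific keywords.
string connectionString =
    "DataSource=mySystemi;" +          // TCP/IP host name of the System i machine
    "UserID=MYUSER;Password=secret;" +
    "DefaultCollection=DB2USER;" +     // default schema, as set during connection setup
    "Naming=sql;" +                    // naming convention keyword; the value shown is an assumption
    "LobBlockSize=512;";               // LOB transfer block size; the value shown is an assumption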

 

Next, in the editor (Figure 3), click the Data Access Application Block node. In the Properties panel, set the DefaultDatabase property to the name of the newly created connection string; in this case, it is myDB2fori5OS. This ensures that the application uses this particular database connection as its default database. Save the configuration changes (Ctrl+S).

 

Now, with the basic configuration tasks out of the way, we can finally analyze the code snippet that the application uses to retrieve a set of rows, bind the results, and display them in a DataGridView control:

 

Database db = DatabaseFactory.CreateDatabase();                               [1]
string sSql = "SELECT e.EMPNO, FIRSTNME, LASTNAME, WORKDEPT, PICTURE " +
              "FROM EMPLOYEE e, EMP_PHOTO p " +
              "WHERE e.EMPNO = p.EMPNO AND (p.PHOTO_FORMAT = @format)";       [2]
DbCommand cmd = db.GetSqlStringCommand(sSql);                                 [3]
string format = "gif";
db.AddParameter(cmd, "@format", DbType.String, format.Length,
                ParameterDirection.Input, false, 0, 0, "",
                DataRowVersion.Current, format);                              [4]
DataSet ds = db.ExecuteDataSet(cmd);                                          [5]
this.displayResultsInDgv(ds);                                                 [6]

 

In the code sample above, at [1] the DAAB is used to instantiate a Database object. Note that the CreateDatabase method requires no parameters. If no database name is specified, the DAAB will use the default database connection string that was previously defined in the App.config configuration file.
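
If you prefer not to depend on the DefaultDatabase setting, the DAAB factory can also be handed a connection string name explicitly. Here is a one-line sketch, assuming the name registered earlier in App.config:

// Create the Database object from the named connection string instead of the default.
Database namedDb = DatabaseFactory.CreateDatabase("myDB2fori5OS");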

 

At [2], a parameterized SQL query is constructed. At run time, this query will expect one input parameter to specify the value of the PHOTO_FORMAT column. 

 

At [3], DAAB creates a concrete command class that is cast to the base, abstract DbCommand type, thereby making the code database-independent. Under the covers, cmd has been created by the native provider, so the actual type of this object is iDB2Command.

 

At [4], the required parameter is defined using the AddParameter method. Note that currently the parameter needs to be defined programmatically; the DAAB DiscoverParameters method does not support the native provider. This is similar to other providers, such as OleDb and Odbc. Additionally, as of this writing, you need to specify the length of a string parameter on the AddParameter invocation.

 

At [5], the SQL query is executed, and the resulting DataSet is created.

 

I encourage you to review the source code for the displayResultsInDgv method (called at [6]), where I illustrate how to use a BindingSource object to programmatically bind a DataSet to a DataGridView control.
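
For readers who want the general shape of that technique without opening the solution, here is a minimal sketch, assuming a Windows Forms form (using System.Windows.Forms) with a DataGridView named dataGridView1; the control name and method body are illustrative, not the actual code from iDB2ProviderTester.

// Bind the first result table of the DataSet to a DataGridView through a BindingSource.
private void displayResultsInDgv(DataSet ds)
{
    BindingSource bindingSource = new BindingSource();
    bindingSource.DataSource = ds.Tables[0];   // the table produced by ExecuteDataSet or Fill
    dataGridView1.DataSource = bindingSource;  // dataGridView1 is a placeholder control name
}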

Accessing DB2 Using the Base Abstract ADO.NET Classes

The provider object model in ADO.NET 2.0 defines a series of base classes in System.Data.Common. These classes provide a basic implementation of common functionality for all providers and allow you to write database-agnostic applications. In ADO.NET 2.0, each data provider registers a ProviderFactory class and a provider string with the .NET framework. The ProviderFactory contains static methods that are used to create the classes required by the ADO.NET object model, such as DbConnection, DbCommand, etc. I used this programming technique to implement the second task in the sample application. Here's the relevant code fragment:

 

DbProviderFactory provider =
    DbProviderFactories.GetFactory(DBHelper.ProviderName);                 [1]
DbConnection con = provider.CreateConnection();
con.ConnectionString = DBHelper.ConnectionString;
DbCommand com = provider.CreateCommand();                                  [2]
com.Connection = con;
com.CommandText = "SELECT e.EMPNO, FIRSTNME, LASTNAME, WORKDEPT, PICTURE " +
                  "FROM EMPLOYEE e, EMP_PHOTO p " +
                  "WHERE e.EMPNO = p.EMPNO AND (p.PHOTO_FORMAT = @format)"; [3]
com.CommandType = CommandType.Text;

discoverParameters(com);                                                   [4]
com.Parameters["@format"].Value = "gif";
DataSet ds = new DataSet();
DbDataAdapter ad = provider.CreateDataAdapter();                           [5]
ad.SelectCommand = com;
con.Open();
ad.Fill(ds);                                                               [6]
con.Close();
this.displayResultsInDgv(ds);

 

In the code sample above, at [1] the DBHelper class is used to retrieve the deployment-specific values for provider name and connection string. DBHelper relies on the information that I provided earlier in the App.config configuration file. The provider is then used at [2] to create a command object that is cast to the base DbCommand type. In this case, the concrete class created by the provider is iDB2Command.  
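
The DBHelper class itself is part of the downloadable solution. As a rough idea of how such a helper might read the provider name and connection string from App.config, here is a hypothetical sketch; the class body and the configuration key name are assumptions, not the article's actual code.

using System.Configuration;   // requires a reference to System.Configuration.dll

// Hypothetical helper that exposes the values stored in App.config.
internal static class DBHelper
{
    // "myDB2fori5OS" is the connection string name created earlier in the walkthrough.
    private static readonly ConnectionStringSettings Settings =
        ConfigurationManager.ConnectionStrings["myDB2fori5OS"];

    public static string ProviderName     { get { return Settings.ProviderName; } }
    public static string ConnectionString { get { return Settings.ConnectionString; } }
}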

 

At [3], a parameterized SQL query is constructed.

 

At [4], I invoke a helper method that I use to derive parameters from a command object. The helper method is needed because the DbCommandBuilder base class does not contain a DeriveParameters method. This method is, however, supported by a number of concrete command builder classes, such as iDB2CommandBuilder, OleDbCommandBuilder (implemented by the OleDb provider), etc. Here's pseudo code that shows how the iDB2CommandBuilder is used to invoke the provider-specific DeriveParameters method:

 

private static void discoverParameters(DbCommand command)
{
    ...
    DbConnection connection = command.Connection;
    connection.Open();
    if (connection is iDB2Connection)
    {
        iDB2CommandBuilder.DeriveParameters((iDB2Command)command);
    }
    // ... handling for other providers (OleDb, Odbc, etc.) and cleanup is elided ...
}

 

So the helper method allows me to derive parameters in a database-agnostic manner.

 

Let's continue now with the code analysis. At [5], the provider is used to create a data adapter class. The adapter is then used at [6] to fill the DataSet with data retrieved by the query I defined earlier at [3].

Batch Insert

Sometimes you may need to populate a table with initial data, say, at application startup. Single-row inserts can, in this case, be very slow and impractical. The native provider now lets a single execute request perform multiple INSERTs. Specifically, the concrete class iDB2Command implements the AddBatch method, which can be used to perform batch inserts. The following code snippet illustrates these concepts:

 

DbCommand com = provider.CreateCommand();                                  [1]
com.Connection = con;

com.CommandText = "INSERT INTO COFFEES (COF_NAME, SUP_ID, PRICE, SALES, TOTAL)" +
                  " VALUES (@COF_NAME, @SUP_ID, @PRICE, @SALES, @TOTAL)";  [2]
com.CommandType = CommandType.Text;
discoverParameters(com);                                                   [3]
con.Open();
for (int i = 1; i <= batchSize; i++)
{
    com.Parameters["@COF_NAME"].Value = "Super_Kona_" + i;
    com.Parameters["@SUP_ID"].Value = 150;
    com.Parameters["@PRICE"].Value = 9.95;
    com.Parameters["@SALES"].Value = 1000;
    com.Parameters["@TOTAL"].Value = 9950;
    batchInsert(com);                                                      [4]
}
com.ExecuteNonQuery();                                                     [5]
con.Close();

 

In the code above, at [1] a command class is instantiated. At [2], a parameterized INSERT statement is defined. At [3], I use the discoverParameters helper method to derive the statement's parameters. This helper method is described in the previous section (Accessing DB2 Using the Base Abstract ADO.NET Classes). At [4], another helper method, batchInsert, is called. I wrap the AddBatch method, which is specific to the native provider, in a helper method to keep the main application database-agnostic. Here's the batchInsert method's source code for your reference:

 

private static void batchInsert(DbCommand command)
{
    if (command == null) throw new ArgumentNullException("command");
    if (command is iDB2Command)
    {
        ((iDB2Command)command).AddBatch();
    }
    else
    {
        throw new Exception("AddBatch not supported by the provider");
    }
}

 

Finally, at [5], the ExecuteNonQuery method sends all batched requests as a single INSERT statement to DB2.

 

There are several things to keep in mind when using the AddBatch method:

 

  • Parameter markers need to be used to pass data to the command object. Mixing parameter data with constants or literals will result in an SQL error.
  • The statement must not contain LOB locators. You can force the provider to materialize LOBs by setting the connection attribute MaximumInlineLobSize to 15360 (see the connection-string sketch after this list).
  • Errors that occur during the batch insert may cause unpredictable results if the statement executes with autocommit on. Some rows may be written into the database and some may not.
  • System i has a limit of 32767 rows that can be sent in a single batch.
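
As a rough illustration of the LOB-related caveat, the MaximumInlineLobSize attribute can be supplied with the other connection keywords. Treat its placement on the connection string and the other values here as assumptions; only the 15360 figure comes from the caveat above.

// Hypothetical connection string that forces LOBs to be materialized inline
// so that batched INSERTs avoid LOB locators.
string batchFriendlyConnectionString =
    "DataSource=mySystemi;UserID=MYUSER;Password=secret;" +
    "DefaultCollection=DB2USER;" +
    "MaximumInlineLobSize=15360;";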

Distributed Transactions

One of the biggest changes in .NET 2.0 was the introduction of the System.Transactions namespace. System.Transactions offers an easy and straightforward way to implement transactional operations. It can be used to implement a transaction in a number of ways, but a typical implementation wraps transactional operations inside a TransactionScope class. The TransactionScope class facilitates automatic distributed transaction enlistment. Keep in mind, though, that the native provider works with the Microsoft Distributed Transaction Coordinator (DTC) to perform distributed transactions, even when only a single data source is being used. This obviously has performance consequences. The DTC service must be running and enabled for XA transactions before you try to start a distributed transaction using the provider. The following steps outline the process of setting up the DTC to work with the native provider:

 

  1. On your Windows workstation, open the Component Services dialog. In the left panel under Console Root, navigate to Component Services > Computers > My Computer. Right-click the My Computer icon and select Properties. The My Computer Properties dialog appears. Click the MSDTC tab.

 

  2. In the Transaction Configuration section, click the Security Configuration button. This opens the Security Configuration dialog. Set the MSDTC properties as shown in Figure 5.

 

031208JarekFigure5.gif

Figure 5: Set your MSDTC configuration properties.

 

Make sure that the Enable XA Transactions option is selected. Then start the DTC service. You may want to change the DTC service configuration so that it starts automatically when the workstation boots.

 

The programming technique that employs the TransactionScope class to implicitly enlist in a distributed transaction is pretty straightforward, as illustrated in the following code sample:

 

DbProviderFactory provider = DbProviderFactories.GetFactory(DBHelper.ProviderName);
using (TransactionScope scope = new TransactionScope())                    [1]
{
    using (DbConnection conn = provider.CreateConnection())                [2]
    using (DbConnection conn2 = provider.CreateConnection())               [3]
    {
        conn.ConnectionString = DBHelper.ConnectionString;
        conn.Open();
        using (DbCommand cmd = conn.CreateCommand())
        {
            cmd.CommandText =
                "INSERT INTO COFFEES (COF_NAME, SUP_ID, PRICE, SALES, TOTAL)" +
                " VALUES ('Colombia Supreme Select', 200, 19.95, 10, 199.50)";
            cmd.ExecuteNonQuery();                                          [4]
        }
        conn2.ConnectionString = DBHelper.ConnectionString;
        conn2.Open();
        using (DbCommand cmd = conn2.CreateCommand())
        {
            cmd.CommandText =
                "INSERT INTO COFFEES (COF_NAME, SUP_ID, PRICE, SALES, TOTAL)" +
                " VALUES ('Kenya Gold', 999, 14.95, 10, 149.50)";
            cmd.ExecuteNonQuery();                                          [5]
        }
    }
    scope.Complete();                                                       [6]
}

 

In the source listing above, at [1] an instance of a TransactionScope class is created. Since at this time no transaction context exists in the sample application, the .NET framework starts a new transaction. Now all operations within the using block will be performed in the context of this newly initiated transaction.

 

At [2] and [3], two separate connections to DB2 are created.

 

Then at [4] and [5], two INSERT statements are executed. Note that each statement runs through a separate connection. So, if it hadn't been for the distributed transaction, the two statements would have run in two separate database transactions, and I would have had no ability to commit both changes in a consistent manner. In this case, however, the Complete method call at [6] causes the TransactionScope, MSDTC, and DB2 to use the two-phase commit (2PC) protocol to complete the transaction and make both changes permanent.

 

The native provider supports the XaLockTimeout and XaTransactionTimeout properties that can be specified on the ConnectionString to give the application more control over distributed transactions.
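
As an illustration, both keywords can be supplied alongside the usual connection settings. The values below are placeholders, and their units are not stated in this article, so consult the provider documentation before relying on them.

// Hypothetical connection string with XA-related keywords for distributed transactions.
string xaConnectionString =
    "DataSource=mySystemi;UserID=MYUSER;Password=secret;" +
    "XaLockTimeout=60;" +          // placeholder value
    "XaTransactionTimeout=120;";   // placeholder value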

 

Actually, it is quite interesting to watch a distributed transaction running across all participating software components. As soon as any transactional operation is performed within the transaction scope, MS DTC starts a new global transaction. This is shown in Figure 6.

 

031208JarekFigure6.gif

Figure 6: The active global transaction is managed by MS DTC.

 

The global transaction depicted in Figure 6 gets propagated to the DB2 for i5/OS database server, where it shows up as two separate entries. Consider Figure 7 below:

 

031208JarekFigure7.gif

Figure 7: Here's a global transaction under DB2 for i5/OS control.

 

Note that the two entries reported by DB2 have the same global ID but different Branch Qualifiers. A transaction branch encapsulates transactional operations performed over a given database connection. The sample application opens two connections; hence, there are two transaction branches. 

 

One more caveat worth mentioning is that the two branches do not share locks. In other words, currently the native provider does not support loosely coupled transactions that share locks. The global transactions are said to be "loosely coupled" when the transaction identifiers (XIDs) of two transaction branches have the same global transaction identifier (GTRID) but different branch qualifiers (BQUALs). By default, loosely coupled transaction branches do not share locks.

Provider Trace Utility

The provider ships with a tracing utility that can be used to collect detailed client-side traces. The .NET provider tracing is turned off by default. It can be enabled by running the CWBMPTRC program on Windows. Here's an example of how to switch on tracing:

 

cwbmptrc +t

 

By default, the trace file is named IDB2TRACE.TXT and is stored in the C:\Documents and Settings\All Users\Documents\IBM\Client Access directory. Remember that with tracing turned on, performance may suffer, and the trace file will continue to grow until tracing is turned off.

 

Taking Advantage of the Native Provider

The IBM DB2 for i5/OS .NET provider has been vastly improved in V6R1, and it should be your primary choice if your application accesses mainly DB2 on System i. The native provider uses a highly optimized wire protocol to communicate with DB2, and the appropriate license is included in System i Access for Windows.

 

Additional Material

Download the source code that accompanies this article.

 

The following publications can be helpful to those who want to learn more about the topics covered in this article:

 

Integrating DB2 Universal Database for iSeries with Microsoft ADO .NET, SG24-6440

 

".NET Integration with DB2 UDB for iSeries," MC Press Online

 

"Cut Your Development Effort with DB2 Development Add-in for .NET," MC Press Online

 

Build Web Services in a Flash, IBM white paper

Jarek Miszczyk

Jarek Miszczyk is a Lead Technical Consultant for System x Virtualization and Cloud Computing at the IBM STG Global ISV Enablement organization. He is located in Rochester, Minnesota.
