
Programming with Make on the AS/400, Part I


Not so long ago, I found myself in an odd position: that of being simultaneously a newbie AS/400 programmer and a veteran UNIX programmer. Some things were familiar: source code, object code (called ILE modules on the AS/400), and programs. But right away, I started looking for a utility like UNIX Make. From my perspective, our AS/400’s PDM tools just didn’t cut it, and the available programming environment tools for the AS/400 didn’t fit my needs. Fortunately, with the help of an AS/400 guru, I discovered a version of my old buddy Make, and it just happens to be a free tool available on every AS/400!

"I Use XYZ. Why Bother with Make?"

UNIX, Windows/DOS, and OS/2 programmers have long enjoyed the benefits of the Make utility. Although it was developed in the UNIX/C world, all programmers and even system administrators can use Make to simplify modular program creation. And actually, with Make, you can effectively automate any multistep process that has a time-of-modification relationship between the objects being manipulated.

Programmers call Make a “recompilation” tool. They use it to build modules from source code, and programs from modules. Thus, Make fits the AS/400’s ILE programming model perfectly. The magic of Make is that it can help you optimize the build so that modules that don’t need to be recompiled aren’t. This can save a programmer a lot of time during development and maintenance, when often small incremental changes are being made to only a few modules. Optimizing the process with Make allows quick checkout and turnaround for enhancements and bug fixes.

Make solves a problem faced by all programmers using modular languages: Maintenance commands (compilation, program generation, and so forth) often need to be executed when an object (or one or more of a set of objects), called the “dependent,” has been updated more recently than another object (or one or more of another set of objects), called the “target.” Usually, as in the code development cycle, the target is updated via the maintenance commands, using the contents of the dependent. In the ILE development cycle, a common dependent is a source code member, and its target is the module generated from it.

So, of course, there’s no magic. Make works by comparing the creation date of an existing module against the date of the source code object that it is to be compiled or recompiled from. Instead of keeping a mental list of the modules affected by a change to source code, you maintain the dependency list in a Makefile, the input to Make.

Thus, a Make-based build can serve to formalize a software build procedure. The Makefile contains a set of rules, or description blocks, that you provide to perform a particular task, like creating a program. You could use a command script to formalize your build procedure, but you would have to write a lot of code to give your script the functionality built into Make. A well-formed Makefile represents a dependency tree for building a project, whether from scratch or from a mostly up-to-date set of objects. With Make and the right Makefile, a programmer can issue a single build command and be assured that the resulting program is up-to-date and built using the smallest number of steps.

But does Make really work for AS/400 projects? Suppose you have a really big RPG IV project with lots of modules and database files. If you properly configure the Makefile, you can make a simple source code change in one of the modules and just run the Make command; the source (for the single module) will be recompiled, and Create Program (CRTPGM) or Update Program (UPDPGM) will recreate the program. Similarly, you can change a database file’s DDS source and then run Make to recompile the DDS, recompile the DDS for any of the database file’s dependencies (e.g., logical or reference files), recompile any RPG modules dependent on the recompiled DDS, and finally recreate the program (or, for a multiprogram project, all the programs of the project)—all in one Make command step.

The real utility of Make is apparent in large, complex projects with many modules and scattered dependencies, when the modules you are changing happen to have few or no other objects dependent on them. In that case, only the modules you change are recompiled, and the larger part of the project is left untouched, saving you the time and resources required to recompile all the modules of your project. Make is also useful during development for finding the dependencies you may have forgotten about when changing code that is used in many different places. When you do forget, Make reminds you with compiler errors in the affected code.

TMKMAKE: Make for the AS/400

IBM supplies a version of Make for the AS/400 called TMKMAKE. It is one of the tools supplied in the Example Tools Library, QUSRTOOL (see the sidebar “Installing TMKMAKE” for information on how to install TMKMAKE). And this is probably the best reason for at least evaluating TMKMAKE: It’s free.

For those familiar with Make, TMKMAKE is most similar to the UNIX flavor of Make. The basic mechanics of TMKMAKE are identical to UNIX Make, but the developers of TMKMAKE needed to add special tags for handling the unique object-based layout of the AS/400’s QSYS file system. If you’ve never used Make, using these will not be a problem. But if you have used Make before, the tags can be confusing at first. Also, if you are used to the latest versions of Make on other platforms, TMKMAKE may be missing some of the newest bells and whistles in comparison.

In the following sections of this article, you will learn how to create Makefiles for use with TMKMAKE, with examples in RPG and CL code. Make can be a useful tool for programming in all languages but, in particular, modular compilation unit languages like the ILE languages.

What’s Included

TMKMAKE consists of several programs, but there is only one interface to run Make: the TMKMAKE command. You must add the library in which you installed TMKMAKE to the library list to use the TMKMAKE command. Otherwise, you’ll see odd-looking errors when you try to run it. In the following sections, I’ll explain the required and most-often-used parameters of TMKMAKE, and then I’ll explain some of the lesser-used parameters.
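For example, if you installed TMKMAKE in a library named MAKE (the library name used in the sidebar; substitute your own), one way to make the command available for the current job is:

```
ADDLIBLE LIB(MAKE)
```
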

Creating and Using Makefiles

Makefiles on the AS/400 are source physical file members. As mentioned previously, a Makefile contains a set of description blocks. Description blocks have the following syntax:

targets : dependents
    commands

The first line lists the targets and dependents to be built and built from. Again, a target is an object to be produced by your build process (alternatively, it can be a “pseudotarget,” which I’ll explain later). On the AS/400, target objects can be ILE modules, programs, files, or file members. Dependents are the objects that, if updated, must cause the target to be refreshed. For example, an ILE program object is dependent on one or more ILE modules (that is, the ILE modules must exist before the program can be created). The first character of the target must be in column one, and the targets are separated from the dependents by a colon character. The remaining lines of the description block are commands used to generate a target object. The commands are indented from column one by at least one “whitespace” (or blank) character. They are executed in the order they are listed. Make examines the return code for each command executed and, if an error is detected, aborts the build. A command list is ended with a blank line or the start of another description block.
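As an abstract sketch of these rules (the names here are placeholders, not objects from this article's figures), two description blocks might be laid out like this:

```
first : dep1 dep2
    command1
    command2

second : dep3
    command3
```

The blank line ends the first command list, and second begins a new description block in column one.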

TMKMAKE will send error messages to the job log when these syntax rules are violated. If the target description does not begin in column one but is separated from the prior description block by a blank line, TMKMAKE interprets the line as part of a command list without a target description, and it will report a nice error telling you so before aborting. If the target description does not begin in column one and is not separated from the prior description block by a blank line, TMKMAKE interprets the target description as part of the command list of the previous description block. In this case, TMKMAKE may not detect the error until the target is referenced (“target not found”) or until the command list of the previous target is executed (when the target description will be interpreted as an ill-formed command). Conversely, if a command line in a description block begins in column one, TMKMAKE will think it is a target description and will complain about an “invalid description rule” on the erroneous line before aborting.
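To illustrate, assuming the rules above, both of the following fragments are ill-formed. In the first, the command begins in column one, so TMKMAKE reads it as an invalid description rule; in the second, the indented target line (other) is swallowed into the preceding command list:

```
debug : debug.qrpgsrc
crtrpgmod module(*curlib/debug) srcfile(*curlib/qrpgsrc)

debug : debug.qrpgsrc
    crtrpgmod module(*curlib/debug) srcfile(*curlib/qrpgsrc)
    other : other.qrpgsrc
```
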

Consider the very simple example Makefile in Figure 1. This Makefile consists of just one description block, and it is used to compile an RPG source code member into an ILE module.

The target, an ILE module named DEBUG, is to be built by the description block. It has only one dependent, the source member DEBUG in the file QRPGSRC. The funny notation for files and members is a contrivance of the TMKMAKE implementation. It helps distinguish the various object types and file members in OS/400. The only command in the command list is Create RPG Module (CRTRPGMOD) with the appropriate parameters. Note that (assuming no coding errors) this command creates a module named by the target. To create the module with this Makefile, invoke TMKMAKE with a command that looks something like this:

TMKMAKE MAKETARG('debug') +
        SRCFILE(*CURLIB/QMAKSRC)

The SRCFILE parameter tells TMKMAKE which source physical file contains the Makefile (the member, in this case, defaults to *FIRST), and MAKETARG specifies the target to be evaluated by Make. Note that the target specification is case-sensitive. So far, this trivial Makefile is not worth the effort; it’s probably easier just to run the CRTRPGMOD command directly to create the module. Later, I’ll embellish this example to make it useful.

Dependents can be Makefile targets, too, but not in the same description block. This is one of the most powerful features of Make. When Make evaluates a description block, the dependents are scanned first to make the following determinations about their properties:

• Do all dependents exist as objects in the file system? If not, those dependents must exist as targets in other description blocks. Make will stop with an error if no Makefile target exists for a nonexistent dependent. When dependents exist elsewhere in the Makefile as targets, description blocks for those targets are evaluated before the evaluation of the current description block proceeds. Make always executes the command list of description blocks that contain dependents that are targets but not file system objects (these are special targets called “pseudotargets,” which I will discuss in detail later).

• Have the dependents been updated more recently than the target? If all dependents exist and are older than the target, Make considers the target to be “satisfied” and will not execute the command list. However, if any existing dependent is younger than the target or if the target simply doesn’t exist, Make executes the command list, ostensibly to update the target.
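A minimal sketch of these rules (with hypothetical names) chains two description blocks, where a dependent of the first is the target of the second:

```
mypgm : mymod
    crtpgm pgm(*curlib/mypgm) module(*curlib/mymod)

mymod : mymod.qrpgsrc
    crtrpgmod module(*curlib/mymod) srcfile(*curlib/qrpgsrc)
```

Evaluating mypgm first satisfies mymod (recompiling it only if its source member is younger); CRTPGM then runs only if the module ends up younger than the program.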

So, to have a useful Makefile, there should be more than one description block, among which some of the targets are dependents of other targets. In later, more realistic examples, this kind of Makefile structure will be used to create a program from multiple modules. First, however, it will be useful to examine a little more closely the odd notation used to express OS/400 objects as targets and dependents in TMKMAKE Makefiles.

Organizing a TMKMAKE-enabled Project

In the C world, where Make was developed, source code files are named with a basename and an extension, as in debug.c. The basename roughly describes the purpose of the code, and the extension describes the language or “class” of the code.

Make uses the file extension, also called the suffix, in “implicit” targets, a feature that I’ll describe later. TMKMAKE’s implementers decided to use the natural object boundaries of OS/400 to express the notion of extensions. So source member DEBUG of file QRPGSRC would be expressed as DEBUG.QRPGSRC or debug.qrpgsrc (unlike its MAKETARG parameter, TMKMAKE is case-neutral in resolving target and dependent names specified in the Makefile). Thus, source members are classified by the file that contains them.

To use TMKMAKE most effectively, it is best to keep all source members for a language or compiler in the same file, thereby giving all source members of a given language the same Make extension name. So, for example, the QRPGSRC file contains the RPG source members, QCLSRC contains the CL source members, and so forth. The files can have any name; however, to take advantage of TMKMAKE’s implicit targets, you should use the extensions for the various languages shown in Figure 2.

Source members will usually be specified as dependents in description blocks. Going back to the first example, then, see that the source member DEBUG of file QRPGSRC is specified as debug.qrpgsrc. The suffix specifies the kind of object it names, in this case, a database source physical file. TMKMAKE interprets a basename.extension name so that the extension names a source file in the current library and the basename names a member in that file.

Similarly, dedicated suffixes denote program and module objects of the given name. Note that the library in which the object resides may be specified as well, as in projlib/debug.qrpgsrc. In general, it is best to group the source code for a particular project in a single library, with files for each language compiler used to create the program for that project. Then, you can execute a Change Current Library (CHGCURLIB) command to that library prior to running TMKMAKE, allowing simpler and less cluttered object specifications in the Makefile.
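For instance, if the DDS source lived in a second library (REFLIB here is a hypothetical name), a dependent and its source file could be qualified like this:

```
debugdata : reflib/debugdata.qddssrc
    crtpf file(*curlib/debugdata) srcfile(reflib/qddssrc)
```
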

By default, TMKMAKE will use *LIBL/QMAKSRC(*FIRST) as the source member containing the Makefile. So, it is also advisable to have a QMAKSRC file in your project library containing the Makefile.
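With those defaults in place, a build session reduces to two commands (PROJLIB is a placeholder for your project library):

```
CHGCURLIB CURLIB(PROJLIB)
TMKMAKE MAKETARG('debug')
```

Because the current library is part of the library list, TMKMAKE finds *LIBL/QMAKSRC(*FIRST) without an explicit SRCFILE parameter.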

Putting Make to Work

Suppose you want to use TMKMAKE in a true RPG project consisting of three ILE modules and one program. You must build a program object containing the three ILE modules. Figure 3 illustrates a Makefile used to build such a project.

In this example, there are two RPG modules, debug01 and debug02, generated via the CRTRPGMOD command in the first two description blocks. The CL module, debug01c, is generated by the Create CL Module (CRTCLMOD) command in the third description block. The program will be built with the CRTPGM command with the three modules specified as parameters in the fourth and final description block.

The description block for target ‘debug’ has three dependents—the modules built by the other description blocks. To use this example Makefile, save it as the first member of the QMAKSRC source file in the project library and invoke TMKMAKE with the MAKETARG(‘debug’) parameter to build the program as follows:

TMKMAKE MAKETARG('debug') SRCFILE(*CURLIB/QMAKSRC)

In the Makefile, note the ‘\’ character at the end of the first line of the command section of the debug description block. Long lines in description blocks may span more than one line in a Makefile; to split a long line, use a ‘\’ character as the last non-whitespace character on the line to be continued on the following line.

Now, examine the mechanics of the Makefile. The debug target’s description block is evaluated first, as specified on the TMKMAKE command. Examining the target’s dependents left to right, Make finds that debug01 is the target of the Makefile’s first description block. The evaluation of the debug description block is suspended, and Make then begins evaluation of the debug01 description block. As in the previous Makefile example, the update times of the module and the source member are compared, and, if necessary, the CRTRPGMOD command is run to update the module. Once the target of the debug01 description block is satisfied, the evaluation of the debug description block resumes, and the dependents following debug01 are examined similarly until each is also satisfied. At that point, the evaluation of the debug description block continues by comparing the update times of the modules to the update time of the program. If any one of the three dependent modules is younger than the program, the program is bound with CRTPGM.

In this still-simple example, the number of commands entered by the programmer to create a program from scratch has been reduced from four to one. Additionally, this Make-driven build can now help conserve system resources because it correctly compares target/dependent update times and updates only the objects that are out of date with respect to their dependent objects.

Representing Other Modular Dependencies

Sometimes, an ILE module can have dependencies other than just the source code from which it is generated. Notably, RPG code may be written to access a specific database file or format. To ensure that the code affected by a change to a database format is updated, and thus avoid runtime level-check errors, the database file containing the format is listed as a dependent in the description block that creates the module, as shown in Figure 4.

Here, the description block for the DEBUG01 module target has been changed to add a second dependent, debugdata. In this example, debugdata is a physical file referenced in the RPG source member QRPGSRC(DEBUG01). A description block to build it as a target, using the Create Physical File (CRTPF) command, is added as well. Now, when a change is made to the format of DEBUGDATA—described by the DDS source in QDDSSRC(DEBUGDATA)—TMKMAKE will detect and rebuild the database file before recompiling the DEBUG01 module.

Note that any change to the database file will force the recompilation of the RPG module when using this Makefile. Be aware that changing the record format of a database file may require that the logic of a program that uses the file also be changed so the program will continue to operate correctly. This is something you, the programmer, will have to investigate. TMKMAKE cannot determine the need for logic changes or make source code changes for you.

Pseudotargets

Sometimes, it is useful to have targets that do not represent objects. These are called pseudotargets. One of the most common uses of pseudotargets is to create more than one program object in a single Makefile. For example, suppose the project has evolved and now includes three programs. Instead of invoking Make three times to build the three targets, you can define a pseudotarget in which the three targets are listed as dependents, as in the following description block:

all : debug test ship

The most striking feature of this description block is the lack of a command list. No commands are necessary because this pseudotarget is never built! It is simply a placeholder that forces Make to evaluate the dependency list. With this pseudotarget in the Makefile, you can invoke the TMKMAKE command with a MAKETARG(‘all’) parameter, and all three programs—DEBUG, TEST, and SHIP—will be updated. Note that, if a command list were specified in this description block, the commands would always be executed when the target was invoked, since no object named ‘all’ exists in the current library. This can actually be a helpful feature. For example, suppose the TEST and SHIP programs were to be built in a different library than DEBUG. Another pseudotarget to invoke a Change Current Library (CHGCURLIB) command to the new library, as in the following, could be added:

all : debug otherlib test ship

otherlib :
    chgcurlib projlib2

The otherlib target has no dependents. Because of that and the fact that it is a pseudotarget, its command list is always executed by Make. In the “all” description block, the otherlib pseudotarget is listed as a dependent (note that, unlike the other dependents, it does not have an extension, since it doesn’t represent a file system object). Since dependents are evaluated left to right, the DEBUG program will be updated in the current library before the current library is changed to PROJLIB2 by the otherlib target. Finally, the other two programs will be updated in the new current library.

Other Makefile Features

The features I’ve shown you will help you get started using Make effectively. However, to wring the most functionality out of Make, you may have to use some of the more esoteric features. I’ll explain those in another article.

debug : debug.qrpgsrc
    crtrpgmod module(*curlib/debug) srcfile(*curlib/qrpgsrc)

Figure 1: A very simple Makefile

Language/Compiler              File (Extension) Name
CL (CRTCLMOD, CRTCLPGM)        QCLSRC
RPG (CRTRPGMOD, CRTRPGPGM)     QRPGSRC
COBOL (CRTCBLMOD, CRTCBLPGM)   QCBLSRC
FORTRAN (CRTFTNPGM)            QFTNSRC
C (CRTCMOD, CRTCPGM)           QCSRC
Pascal (CRTPASPGM)             QPASSRC

Figure 2: Recommended source code file names

debug01 : debug01.qrpgsrc
    crtrpgmod module(*curlib/debug01) srcfile(*curlib/qrpgsrc)

debug02 : debug02.qrpgsrc
    crtrpgmod module(*curlib/debug02) srcfile(*curlib/qrpgsrc)

debug01c : debug01c.qclsrc
    crtclmod module(*curlib/debug01c) srcfile(*curlib/qclsrc)

debug : debug01 debug02 debug01c
    crtpgm pgm(*curlib/debug) module(*curlib/debug01 \
        *curlib/debug02 *curlib/debug01c)

Figure 3: Makefile to build a program object containing ILE modules

debug01 : debug01.qrpgsrc debugdata
    crtrpgmod module(*curlib/debug01) srcfile(*curlib/qrpgsrc)

debugdata : debugdata.qddssrc
    crtpf file(*curlib/debugdata) srcfile(*curlib/qddssrc) mbr(*none)

Figure 4: The debug01 target has two dependencies

Installing TMKMAKE

You can get TMKMAKE in two ways: You can download the compiled version from the Midrange Computing Web site, or you can compile it from the source code IBM provides in QUSRTOOL.

Downloading TMKMAKE from the Midrange Computing Web Site

Compiled versions of TMKMAKE for V3R2 and V4R2 are available as zip files on the Midrange Computing Web site. Point your Web browser to http://www.midrangecomputing.com/mc/prog/98/ and click on the link for one of the versions.

Once you have downloaded the zip file to your PC, use a zipfile utility to unzip it. This will extract a larger file.

Create a save file on the AS/400.

CRTSAVF FILE(xxx/MAKECODE) +
        TEXT('TMKMAKE object code')

Upload the extracted file to the save file. When you upload, be sure to use a binary file transfer. That is, don’t let the transfer software convert from ASCII to EBCDIC. There are probably several ways to do this. FTP is easy and effective.

If desired, create a library (e.g., MAKE) in which to store the TMKMAKE utility.

CRTLIB LIB(MAKE) TEXT(‘TMKMAKE utility’)

Restore the objects from the save file into the desired library.

/* V3R2 */
RSTOBJ OBJ(*ALL) SAVLIB(MAKEV3R2) +
       DEV(*SAVF) SAVF(xxx/MAKECODE) +
       RSTLIB(MAKE)

/* V4R2 */
RSTOBJ OBJ(*ALL) SAVLIB(MAKE) +
       DEV(*SAVF) SAVF(xxx/MAKECODE) +
       RSTLIB(MAKE)

Building TMKMAKE from Scratch with a C Compiler

If you don’t have Internet access but you do have a C compiler, you can install the utility from the sources supplied by IBM. TMKMAKE is one of the “unsupported tools” provided in the QUSRTOOL set. Don’t let the “unsupported” tag scare you—I’ve used TMKMAKE for fairly complicated builds on the AS/400 with no problems. Even though it is unsupported, IBM has provided the source code to TMKMAKE, which could be useful in the unlikely event that you do get into a jam. And it installs easily, as detailed below.

If you’ve installed one of the QUSRTOOL packages before, you can probably skip this and the following paragraph. The first step to install the tool is to make sure you have the QUSRTOOL library installed on your AS/400. If the QUSRTOOL library does not exist or is empty, use the OPTION(7) parameter of the Restore Licensed Program (RSTLICPGM) command to install the Example Tools Library.

All of the tools in QUSRTOOL are contained in compressed save files. In the QUSRTOOL library are programs named PACKAGE and UNPACKAGE, and, as you might expect, these programs are used to compress and decompress, respectively, the objects in QUSRTOOL save files. To get the TMKMAKE source code, you need to run UNPACKAGE on three save files in QUSRTOOL by entering the following commands:

CALL PGM(QUSRTOOL/UNPACKAGE) +
     PARM('QATTCL' 1)
CALL PGM(QUSRTOOL/UNPACKAGE) +
     PARM('QATTSYSC' 1)
CALL PGM(QUSRTOOL/UNPACKAGE) +
     PARM('QATTCMD' 1)

For more information on using PACKAGE/UNPACKAGE, check out the documentation in the QUSRTOOL/QATTINFO file, in the TTTINFO member.

Next, create a library for TMKMAKE to reside in, and create the installation program from the CL source in the QUSRTOOL/QATTCL file. Finally, run the installation program to install TMKMAKE. In the example below, I’ve called the target library TMKMAKE:

CRTLIB TMKMAKE

CRTCLPGM PGM(TMKMAKE/TMKINST) +
         SRCFILE(QUSRTOOL/QATTCL)

CALL PGM(TMKMAKE/TMKINST) PARM(TMKMAKE)

CL program TMKINST compiles the C source code of TMKMAKE and installs it in your installation library. After it completes, you are ready to write some Makefiles!
