
Evaluating Appropriate Workloads for the AS/400e Dedicated Server for Domino


Lotus Domino for AS/400, announced in January of 1998, has achieved tremendous success in the marketplace. It offers the rock-solid reliability that customers seek as their email, collaborative, and Web-enabled applications become mission-critical. Through its ease of management and its ability to run a mixed workload on a single server footprint reliably, Domino for AS/400 also delivers a low total cost of ownership. Many customers and Business Partners asked for these same characteristics for pure Domino environments. In response, IBM and Lotus announced the AS/400e Dedicated Server for Domino (DSD).

The simplest and safest implementation of the DSD is a “pure Domino” implementation, such as email and applications that use “out of the box” Domino templates and capabilities. However, many customers and Business Partners want to go beyond these limits, or at least understand what the limits truly are. This article explores the behavior of the DSD and provides guidance on predicting how well various Domino applications will perform on the DSD. For an in-depth discussion of workloads for the DSD, please refer to the Dedicated Server for Domino white paper “Evaluating Appropriate Workloads for the AS/400e,” which can be found at www.as400.ibm.com/domino/dsd.htm.

Classifying Workloads

Potential workloads for the DSD fall into three areas: appropriate, not appropriate, and “it depends.”

Appropriate

Pure Domino applications and most Domino applications that serve as a front-end to legacy data on another AS/400 system fall within the design guidelines of the DSD. You can expect that they will take full advantage of the price/performance characteristics of the DSD. Examples of these applications include the following:

• Domino email

• Domino applications or agents using only Domino databases

• Domino.Doc and other Domino “add-in” applications that are predominantly Domino

• Domino applications that use normal integration methods (e.g., @DB, LotusScript Data Object [LS:DO], and Domino Enterprise Connection Services [DECS]) to access DB2/400 databases on another AS/400 server. (Note that these integration methods take advantage of the AS/400 Distributed Relational Database Architecture [DRDA] support.)

Not Appropriate

Some applications fall primarily outside the realm of Domino, use excessive non-Domino capacity, and generally will not perform well on the DSD. These applications include the following:

• Interactive 5250 applications (although sufficient capacity exists for systems administration functions using a few terminals)

• Standalone Java applications

• Applications that access Domino databases without going through a Domino server (e.g., using Domino APIs)

• Domino email integration with other email systems (e.g., Internet mail) via the AS/400 AnyMail framework (For email integration on the DSD, you should use the built-in Domino R5 Simple Mail Transfer Protocol [SMTP] capabilities.)

It Depends

The design of the DSD includes a limited amount of non-Domino capacity that is intended for complementary work such as light database integration or file and print serving. When your workload includes this type of non-Domino work, the performance results you experience will depend on both the mix of work on your DSD and your application design. The majority of this article is devoted to assisting you in evaluating and analyzing whether the level of non-Domino work you anticipate will allow you to take full advantage of the price/performance characteristics of the DSD. The bottom-line recommendation for intensive data integration is either to use a DSD as a front-end to another AS/400 or to run Domino applications on a suitably configured non-DSD server that houses your DB2/400 data.

Processing Capacity

Performance capacities for Domino processing and non-Domino processing on the DSD are defined using three metrics.

• Simple Mail Users is a commonly used measurement for comparing Domino capacity across different Domino server platforms and represents the “Domino” capacity of the server. (Note: A typical rule of thumb when equating simple mail users to real-world, or typical, mail users is to divide by three.)

• Processor Commercial Processing Workload (CPW) defines the amount of non-Domino capacity. It is intended to support a limited amount of system resource activity (AS/400 Integrated File System [AS/400 IFS], communications, storage management, backup and recovery, etc.) and Domino application integration functions (DB2 UDB access, external program calls, Java applications, etc.) in support of Domino applications running on the server.

• Interactive CPW is designed to support system administration functions using a 5250 session. You should plan for very minimal amounts of interactive CPW processing on a DSD.

You can think of a DSD’s non-Domino capacity in two ways. First, non-Domino work should not exceed 10 to 15 percent of the total capacity of the processor. In addition, non-Domino work should not exceed 25 percent of the work currently being performed on the server. For example, assume that your DSD is currently running at 50 percent of the CPU’s capacity. The non-Domino work should be no more than 12.5 percent of the total CPU capacity (25 percent of 50 percent). Expressed another way, for your DSD to achieve its full price/performance potential, your ratio of Domino work to non-Domino work should always be 3-to-1 or greater.
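If it helps to see those two guidelines together, here is a minimal Java sketch of the arithmetic (the class and method names are our own and not part of any IBM tool):

// A minimal sketch of the DSD capacity guidelines described above.
public class DsdCapacityCheck {

    // Returns the maximum non-Domino load, as a percentage of total processor
    // capacity, that satisfies both guidelines:
    //   1) non-Domino work no more than roughly 15 percent of total capacity
    //   2) non-Domino work no more than 25 percent of the work currently on the
    //      server (a Domino-to-non-Domino ratio of at least 3-to-1)
    static double maxNonDominoPercent(double currentCpuPercent) {
        double absoluteCeiling = 15.0;                  // guideline 1
        double shareCeiling = 0.25 * currentCpuPercent; // guideline 2
        return Math.min(absoluteCeiling, shareCeiling);
    }

    public static void main(String[] args) {
        // The example above: at 50 percent CPU, non-Domino work should stay at
        // or below 12.5 percent of total capacity (25 percent of 50 percent).
        System.out.println(maxNonDominoPercent(50.0)); // prints 12.5
        // At 80 percent CPU, the 15 percent absolute ceiling is the binding limit.
        System.out.println(maxNonDominoPercent(80.0)); // prints 15.0
    }
}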

The DSD treats processing as Domino work when the processing runs within the Domino R5 code. Processing that runs any function other than Domino R5 code should be considered non-Domino work. Non-Domino work includes functions that a Domino thread or process calls, such as a LotusScript call to SQL. The processing time in the SQL code is non-Domino work. In other words, when an application goes “outside Domino,” it is performing non-Domino work. We’ll explore examples of Domino processing vs. non-Domino processing for application integration in the following section.

Evaluating AS/400 Application Integration on the DSD

Most of the questions that we receive about appropriate workloads for the DSD focus on application and database integration. The only definitive method for ensuring that your proposed application integration will fall within the desired performance guidelines for the DSD is to test it. However, in this section, we’ll provide general guidance and analysis to help you determine whether your proposed workload mix is reasonable and worth expending the effort to test.

Figure 1 shows the common techniques for integrating Domino applications with DB2 UDB files. Methods are classified by whether they are generally considered Domino processing or non-Domino processing on the DSD. From a programming standpoint, these integration techniques are the same on the AS/400 as they are on other Domino platforms. All of the methods shown use the AS/400 SQL Call Level Interface (CLI) to access DB2 data. (A possible exception would be some custom-written connectors.) In addition, when the data being accessed resides on a different server in the network, OS/400 uses the DRDA layer to provide the access. Again, this happens “under the covers” without extra coding by the Domino programmer.

In the sections that follow, we’ll discuss in more detail each of these integration methods and their likely performance implications on the DSD.

Domino Enterprise Connection Services

Domino Enterprise Connection Services (DECS) runs as part of the Domino server (see “Create a Real-time AS/400-Domino Connection with DECS” in the May 2000 issue of MC). A DECS connector is an interface that lets Domino users access back-end databases in real time. The data that a Domino user sees looks as if it were coming from a Domino database, but, in fact, it is coming from a relational database outside Domino. DECS connectors are available for specific databases, such as DB2, Oracle, and Sybase, and for applications. The amount of non-Domino work that a connector runs varies based on the design of the connector.

In our testing, the DB2 connector has generally worked acceptably on the DSD because it performs relatively simple DB2 work outside Domino that does not require significant processor resources. The following is an example of integration using the DECS DB2 connector (note which processing steps are Domino vs. which are non-Domino):

1. (Domino) Domino user opens document.

2. (Domino) DECS connector to DB2 sends request to find fields in DB2.

3. (Non-Domino) DB2 receives request.

4. (Non-Domino) DB2 finds row.

5. (Non-Domino) DB2 returns fields to DECS.

6. (Domino) DECS receives fields from DB2.

7. (Domino) DECS sends fields to Domino document.

Because DB2 UDB is integrated on the AS/400 and because this example involves simply fetching or inserting a single row (step 4), the DECS DB2 connector will most likely perform well on the DSD. However, if a DECS connector performs complex queries or involved calculations as part of step 4, it can generate a large amount of non-Domino processing and may not perform well on the DSD.
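To make that distinction concrete, the hypothetical statements below contrast a simple keyed lookup (the kind of work in step 4) with a processor-intensive aggregate query. DECS generates its own database access, so this sketch is for illustration only, and the library, table, and column names are invented:

// Illustrative only: these hypothetical SQL statements contrast light and heavy
// DB2 work performed outside Domino on behalf of a Domino application.
public class DecsWorkloadContrast {

    // A keyed, single-row fetch (like step 4 above): little non-Domino work.
    static final String SIMPLE_LOOKUP =
        "SELECT CUSTNAME, CREDITLIMIT FROM APPLIB.CUSTOMER WHERE CUSTNO = ?";

    // An aggregate over a join: the kind of query that can push a DSD past its
    // non-Domino guideline.
    static final String COMPLEX_QUERY =
        "SELECT C.REGION, SUM(O.ORDERTOTAL) " +
        "FROM APPLIB.CUSTOMER C JOIN APPLIB.ORDERS O ON C.CUSTNO = O.CUSTNO " +
        "GROUP BY C.REGION";
}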

Lotus Enterprise Integrator

Lotus Enterprise Integrator (LEI), previously known as NotesPump (see “Lotus NotesPump: A High-speed Data Pump for Your Enterprise” in the April 1999 issue of MC), is a method for transferring the contents of one database to another. The source and target database can be any combination of supported formats, including Domino, DB2, and SAP. Like DECS, LEI uses a connector to interface the databases. Based on tests and analysis, we have the following recommendations for using LEI:

• Generally, when at least one of the databases in the transfer is a Domino database, Domino processing will be sufficient to maintain a 3-to-1 ratio of Domino work to non-Domino work, and you can expect good server throughput.

• When both the source and target databases are non-Domino databases, you are likely to see enough non-Domino processing to push the Domino-to-non-Domino ratio below the recommended 3-to-1. Consequently, the performance of the data transfer will fall short of the full performance capabilities of your DSD.

• Many customers want to perform large-scale LEI data transfers infrequently, during off-shift hours. In these cases, the reduced performance that might result from exceeding non-Domino performance limits during the data transfer is acceptable. In other words, the data transfer will run successfully. It simply will not be able to take full advantage of the performance capabilities of the DSD.

(Note: The real-time component of LEI has the same characteristics as DECS.)

Accessing External Databases on the Same AS/400 Server

An external (non-Domino) database is a database that is not a Domino .NSF file, such as DB2 UDB for AS/400. From Domino, you can access an external database by using @DB functions or agents written in LotusScript or in Java using Java Database Connectivity (JDBC) (see “A Melting Pot of Programming Languages” in the February 2000 issue of MC).

Domino provides a set of classes known as the LotusScript Data Object (LS:DO), which allows easy programming access to external databases. On AS/400, the underlying access from LS:DO occurs in two ways, both of which are considered non-Domino processing. First, when the Domino LS:DO processing runs on the server (e.g., as a server agent), the AS/400 provides direct access to DB2 UDB using the SQL CLI. For other LS:DO access to external databases (e.g., processing running on the client), LS:DO uses an ODBC connection.

Another method is JDBC access. JDBC is a defined interface in Java that allows Java programs to access external data, such as DB2 UDB for AS/400. The Java code, the JDBC driver classes, and the database access they perform are all classified as non-Domino processing.

How much external database access is OK? Tests show that simple queries from a Domino application to DB2 databases on the same DSD should work acceptably. Factors that could push this SQL CLI workload above the non-Domino limit include running complex, processor-intensive queries; running many queries simultaneously; running queries against extremely large DB2 databases; and performing other forms of complex database access. Accessing DB2 through Java classes and JDBC is roughly equivalent to running a complex query because the Java classes, JDBC, and the access itself are all non-Domino work.
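As a point of reference, here is a minimal sketch of the kind of light JDBC access that should stay within the guideline: a single keyed query. The driver class and URL shown are for the AS/400 Toolbox for Java; the system, library, table, column names, and credentials are assumptions for illustration:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// A minimal sketch of light external database access from Java.
public class SimpleDb2Lookup {
    public static void main(String[] args) throws Exception {
        // AS/400 Toolbox for Java JDBC driver (adjust for your environment).
        Class.forName("com.ibm.as400.access.AS400JDBCDriver");

        // Pointing the URL at another AS/400 keeps the heavier DB2 work on that
        // server; pointing it at the local system runs it on the DSD itself.
        try (Connection con = DriverManager.getConnection(
                 "jdbc:as400://MYAS400;naming=sql", "USER", "PASSWORD");
             PreparedStatement ps = con.prepareStatement(
                 "SELECT CUSTNAME, CREDITLIMIT FROM APPLIB.CUSTOMER WHERE CUSTNO = ?")) {
            ps.setInt(1, 10001);  // one keyed, single-row fetch: light non-Domino work
            try (ResultSet rs = ps.executeQuery()) {
                if (rs.next()) {
                    System.out.println(rs.getString(1) + " " + rs.getBigDecimal(2));
                }
            }
        }
    }
}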

Accessing External Databases on a Different AS/400 Server

When a Domino application accesses an external database on another server, it makes use of the AS/400 DRDA support. From a programming perspective, this uses one or more of the normal Domino techniques for accessing relational data, such as @DB, LS:DO, DECS, or JDBC. Under the covers, these techniques make use of AS/400 DRDA support when the relational data resides on another server. The DRDA call is considered non-Domino work. However, all of the DB2 work to access the database and return the results runs on the connected server that houses the database. Therefore, the processor load for sending DRDA database requests to another machine is light and should generally perform acceptably on the DSD.

Although standard DRDA access will perform acceptably on the DSD, several factors have the potential to increase SQL CLI processing on the DRDA source system and push your non-Domino processing above the recommended ratio. These include large returned data streams, character translation, two-phase commit, and large numbers of SQL DRDA requests per Domino transaction. The buffer format can also affect processing if the remote system is not an AS/400 or if it is an AS/400 but the data contains a large number of NULLABLE or VARCHAR fields.

Java Agents or Servlets

Java can be used in many ways on the AS/400. Java is essentially non-Domino because it runs most of its code independently of the Domino server, but it can work well on the DSD when the Java processing is light and interacts with the Domino server. For Java to run well on the DSD, two things are required. First, the Java code must be run from Domino; this includes agents written in Java and servlets run through Domino’s HTTP task. Second, the Java application must use the Domino functions found in the lotus.domino.* or lotus.notes.* classes, which contain methods to interface with the Domino server and provide access to Domino databases. If there is no interaction with Domino, the Java code will generate significant non-Domino processing and will likely push the Domino-to-non-Domino ratio below 3-to-1. Remember, too, that external database access using JDBC is non-Domino; keeping that database access light will increase the chances that your Java will perform well on the DSD.
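The following minimal sketch shows the shape of a server-based Java agent that stays mostly inside Domino: it extends AgentBase and works through the lotus.domino.* classes. The agent name, field name, and what it does to each document are assumptions for illustration:

import lotus.domino.AgentBase;
import lotus.domino.AgentContext;
import lotus.domino.Document;
import lotus.domino.DocumentCollection;
import lotus.domino.NotesException;
import lotus.domino.Session;

// A minimal sketch of a Domino Java agent that performs mostly Domino work.
public class FlagDocumentsAgent extends AgentBase {
    public void NotesMain() {
        try {
            // The lotus.domino.* calls below do their work inside the Domino server.
            Session session = getSession();
            AgentContext context = session.getAgentContext();

            // Walk the documents this agent was scheduled to process and update
            // a field on each one.
            DocumentCollection docs = context.getUnprocessedDocuments();
            Document doc = docs.getFirstDocument();
            while (doc != null) {
                doc.replaceItemValue("ReviewStatus", "Checked");
                doc.save(true, false);
                doc = docs.getNextDocument(doc);
            }
            // Any JDBC call added here would count as non-Domino work, so keep
            // such access to simple, keyed lookups.
        } catch (NotesException e) {
            e.printStackTrace();
        }
    }
}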

Predicting Whether a Domino Application Is Appropriate for the DSD

System commands, such as Work with Active Jobs (WRKACTJOB), can be used to get a high-level view of how much processing time is Domino vs. non-Domino. However, non-Domino processing can occur within job structures that appear to be Domino. Therefore, the most accurate technique for determining the exact amount of Domino processing is to use the AS/400 Performance Explorer (PEX). The PEX tool can provide detailed information on what percentage of CPU cycles are running Domino code and what percentage are running non-Domino code. Using the PEX tool for DSD application analysis requires OS/400 V4R4 or higher, Domino R5, and the AS/400 Performance Tools licensed program. (A DSD is not required for this analysis.)

The basic technique for using PEX to determine whether a Domino application is a good fit for a DSD is to collect data using the following commands (which can be reached from the GO CMDPEX menu):

• Start PEX using the Start Performance Explorer (STRPEX) command.

• End PEX using the End Performance Explorer (ENDPEX) command.

• When you are done, generate a report using the Print PEX Report (PRTPEXRPT) command.

To create the needed PEX definition for the DSD assessment, type the following command:

ADDPEXDFN DFN(TPROFRC5) TYPE(*TRACE) JOB(*ALL) TASK(*ALL) MAXSTG(100000)
INTERVAL(5) TRCTYPE(*SLTEVT) SLTEVT(*YES) BASEVT(*PMCO)
TEXT('5 millisecond profile based on run cycles')

System CPU utilization should be above 50 percent during a 30-minute data collection to ensure a high degree of confidence in the results. For PRTPEXRPT, use the following parameters:

TYPE(*PROFILE) PROFILEOPT(*SAMPLECOUNT *PROCEDURE)

To analyze the PEX report, start on page 3 of the report, which is shown in Figure 2. Look for the line with QNOTES in the Name column, and note the value in the Hit % column (63.4% in the example report shown in Figure 2). If this value is less than 60%, the amount of Domino processing may not be large enough to maintain a 3-to-1 ratio with non-Domino processing. If the value is above 60%, the application (based on the workload that was active during the PEX data collection) is a good candidate for the DSD and will take full advantage of the price/performance characteristics offered by the DSD.

References and Related Materials

• “A Melting Pot of Programming Languages,” Richard Shaler, MC, February 2000

• “Create a Real-time AS/400-Domino Connection with DECS,” Richard Shaler, MC, May 2000

• Domino for AS/400 home page: www.as400.ibm.com

• “Lotus NotesPump: A High-speed Data Pump for Your Enterprise,” Daniel Green, MC, April 1999

• The Dedicated Server for Domino home page: www.as400.ibm.com/domino/dsd.htm


[Figure 1 diagram: Domino applications integrating with DB2 UDB through LEI, the LSX LC, LS:DO, @DB, DECS, the Domino DB2 connector, JDBC with the Java Toolbox for the AS/400, the SQL CLI, and DRDA, with each layer labeled as Domino code or non-Domino code.]

Figure 1: Common techniques for integrating Domino applications with DB2 UDB files are classified as Domino processing or non-Domino processing.


Performance Explorer Report    4/1/00 05:34:20
Profile Information            Page 3

Library . . : DOMINO501
Member. . . : TPROF01
Description : T&T application - 30 users over DRDA to Qbert

Hit Cnt   Hit %   Cum %   Start Addr         Name
1080       2.0    19.4    FFFFFFFF8036CF84   QU LIC
1          0.0    83.0    263ED2E5D1004D34   QNOTESINT
34712     63.4    83.0    0743B72BD507B960   QNOTES
11         0.0    83.0    FFFFFFFFC10BE400   QM LIC
48         0.1    83.1    FFFFFFFFC1208E38   PX LIC
4          0.0    99.7    FFFFFFFFC1574E3C   M2 LIC

Figure 2: Analyzing data from a PEX report will help you determine whether an application is a good candidate for the DSD.
