TechTip: MERGE DB2 Data to SQL Server 2008

Data replication chores are a snap with the new MERGE T-SQL statement.

 

Released in August 2008, SQL Server 2008 is the latest incarnation of Microsoft's flagship database platform. While there are some cool new features for developers, one deserves special mention for DB2 developers: the new Transact-SQL (T-SQL) MERGE statement. MERGE will be useful to any IBM i (aka AS/400, iSeries, System i) shop that needs to replicate DB2 for i (formerly known as DB2 for i5/OS and DB2/400) data to SQL Server. This includes shops that use SQL Server for reporting, data warehousing, and application data sharing.

What Is MERGE?

The MERGE statement is a single SQL statement that allows data to be synchronized between a "source" data source and a "target" data source. In this case, we'll consider a DB2 for i database table as the source and a SQL Server database table as the target. 

Traditionally, when writing a SQL Server routine to synchronize data between these two sources, a developer has a few possible approaches:

  1. Clear the target table and reload it entirely from the source table.
  2. For source tables that are "append only," simply copy the latest rows to the destination table.
  3. Write a combination of INSERT, UPDATE, and DELETE statements that keep the data between the two tables up to date.

Option 1 is fine for small tables, such as code tables, but not so good for large tables. Option 2 works well for tables that only have new rows inserted, never updated (such as an inventory transaction history table). But what about larger tables that are not append only? Reloading an entire large table is often time-consuming on both servers, so option 3 is typically used to synchronize a large amount of data between disparate data sources.
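To make the logic of option 3 concrete, here is a small illustrative sketch in Python (not T-SQL) that applies the same update/insert/delete reasoning to two dictionaries keyed by primary key; the function and data names are invented for the example:

```python
def sync(source: dict, target: dict) -> dict:
    """Make target match source; return a count of each action taken."""
    actions = {"insert": 0, "update": 0, "delete": 0}
    # UPDATE: keys present in both, but with differing row data
    for key in source.keys() & target.keys():
        if target[key] != source[key]:
            target[key] = source[key]
            actions["update"] += 1
    # INSERT: keys only in the source
    for key in source.keys() - target.keys():
        target[key] = source[key]
        actions["insert"] += 1
    # DELETE: keys only in the target
    for key in target.keys() - source.keys():
        del target[key]
        actions["delete"] += 1
    return actions

source = {1: "Jones", 2: "Smith", 3: "Lee"}
target = {1: "Jones", 2: "Smyth", 4: "Stale"}
counts = sync(source, target)
print(counts)             # {'insert': 1, 'update': 1, 'delete': 1}
print(target == source)   # True
```

Each loop corresponds to one of the three statements a developer would traditionally write.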

 

Let's look at an example of this last scenario that uses SQL Server's linked server facility to communicate with DB2. If you're not up to speed on linked servers and how to define them, see the references at the end of this tip. For this demonstration, I defined a linked server using the ODBC driver that comes with System i Access V6R1 (formerly known as iSeries Access) at service pack level SI31390, connecting to a V6R1 host. For those who prefer the IBM OLE DB providers over ODBC, I attempted to use the IBMDASQL provider but was not successful in getting it to work at this level.

 

For simplicity, say we have table QCUSTCDT with identical columns that exist on both DB2 for i and SQL Server. The primary key on both tables is CUSNUM. Further, we simply need to replicate all changes from DB2 to SQL Server and not the other way around. Using a linked server called DB2_ODBC (and utilizing the SQL naming convention in the ODBC setup), our T-SQL code would look like the following:

 

/* Update existing rows from DB2 */
Update Target
   Set LSTNAM = Source.LSTNAM,
       INIT   = Source.INIT,
       STREET = Source.STREET,
       CITY   = Source.CITY,
       STATE  = Source.STATE,
       ZIPCOD = Source.ZIPCOD,
       CDTLMT = Source.CDTLMT,
       CHGCOD = Source.CHGCOD,
       BALDUE = Source.BALDUE,
       CDTDUE = Source.CDTDUE
  From QCUSTCDT Target
  Join OpenQuery(DB2_ODBC,'Select * From QIWS.QCUSTCDT') Source
    On Target.CusNum=Source.CusNum

/* Insert new rows from DB2 */
Insert Into QCUSTCDT
Select *
  From OpenQuery(DB2_ODBC,'Select * From QIWS.QCUSTCDT') Source
 Where Not Exists
     (Select *
        From QCUSTCDT Target
       Where Target.CUSNUM=Source.CusNum)

/* Delete rows that are not in DB2 */
Delete From QCUSTCDT
 Where Not Exists
     (Select *
        From OpenQuery(DB2_ODBC,'Select * From QIWS.QCUSTCDT') Source
       Where QCUSTCDT.CUSNUM=Source.CusNum)

 

If this were a large table with additional timestamp columns recording when each row was added and updated, we'd probably want to add criteria to the INSERT and UPDATE statements based on those values to limit the number of rows being processed.
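As an illustration of that idea, here is a hypothetical Python sketch (the CHGTS change-timestamp column is invented for the example) that keeps only rows modified since the last synchronization:

```python
from datetime import datetime

# Hypothetical rows; CHGTS stands in for a "last changed" timestamp column.
last_sync = datetime(2008, 10, 1)
rows = [
    {"CUSNUM": 938472, "CHGTS": datetime(2008, 9, 15)},   # unchanged since last sync
    {"CUSNUM": 839283, "CHGTS": datetime(2008, 10, 5)},   # changed after last sync
]

# Only rows touched after the last sync need to be processed
recent = [r for r in rows if r["CHGTS"] > last_sync]
print([r["CUSNUM"] for r in recent])  # [839283]
```

In T-SQL, the equivalent filter would simply be an extra predicate in the WHERE clause of each statement.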

 

MERGE is essentially a single statement that lets a developer specify a source (which can be a query, table, or view) and a target (which can be a table or updateable view) and define any combination of INSERT, UPDATE, and DELETE operations against the target, all rolled into one. Below is an example MERGE statement that replaces the three statements shown above:

 

MERGE dbo.QCUSTCDT AS Customer_Target
USING OpenQuery(DB2_ODBC,
      'Select * From QIWS.QCUSTCDT') AS Customer_Source
ON (Customer_Target.CusNum = Customer_Source.CusNum)
WHEN MATCHED THEN
UPDATE
   SET Customer_Target.LstNam=Customer_Source.LstNam,
       Customer_Target.Init=Customer_Source.Init,
       Customer_Target.Street=Customer_Source.Street,
       Customer_Target.City=Customer_Source.City,
       Customer_Target.State=Customer_Source.State,
       Customer_Target.ZipCod=Customer_Source.ZipCod,
       Customer_Target.CdtLmt=Customer_Source.CdtLmt,
       Customer_Target.ChgCod=Customer_Source.ChgCod,
       Customer_Target.BalDue=Customer_Source.BalDue,
       Customer_Target.CdtDue=Customer_Source.CdtDue
WHEN NOT MATCHED BY TARGET THEN
INSERT (CUSNUM,LSTNAM,INIT,STREET,CITY,STATE,ZIPCOD,
        CDTLMT,CHGCOD,BALDUE,CDTDUE)
VALUES (Customer_Source.CUSNUM,Customer_Source.LSTNAM,
        Customer_Source.INIT,
        Customer_Source.STREET,Customer_Source.CITY,
        Customer_Source.STATE,Customer_Source.ZIPCOD,
        Customer_Source.CDTLMT,Customer_Source.CHGCOD,
        Customer_Source.BALDUE,Customer_Source.CDTDUE)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE
OUTPUT $action,   /* Reserved keyword */
       inserted.CusNum AS SourceCusNum,
       deleted.CusNum AS TargetCusNum
;      /* MERGE needs a semicolon terminator! */

Let's go over the MERGE statement's clauses:

• MERGE specifies the name of a table or an updateable view as the target of the merge operation. Unfortunately, remote data sources are not allowed here.

• USING specifies the data source for the merge: a table, a view, or a valid T-SQL query. In this case, the OpenQuery table function is used to retrieve data from DB2 for i.

• ON is the search condition that correlates the data in the source and target tables. Generally, this should specify the column name(s) that uniquely identify rows between the data sources. The documentation warns against adding additional filtering criteria to this clause; save that task for the WHEN clauses!

• WHEN MATCHED THEN is an optional clause that instructs MERGE how to handle a row in the target table that matches the current row in the source table. The options here are to issue an abbreviated UPDATE or DELETE command against the target.

• WHEN NOT MATCHED BY TARGET THEN is an optional clause that instructs MERGE how to handle the situation when the source has a row that the target does not. Generally, this is when an INSERT is done against the target. Note the abbreviated SQL syntax used for the INSERT in the example.

• WHEN NOT MATCHED BY SOURCE THEN is an optional clause that instructs MERGE how to handle the case when there is a row in the target that is not in the source. Usually, if anything, a DELETE is issued against the target at this time. Note that a search condition may be added to any of the WHEN clauses. Also, the abbreviated INSERT, UPDATE, and DELETE statements lack the table or view name (already specified in the MERGE clause) and lack criteria (handled in the ON and WHEN clauses).

• OUTPUT is an optional clause that is useful for debugging or auditing. When OUTPUT is specified, each row that is modified by MERGE is added to a result set returned by the statement. As with T-SQL triggers, the "inserted" and "deleted" tables are special temporary tables used to track data from INSERT and UPDATE operations and from UPDATE and DELETE operations, respectively. "$action" is a system function that returns INSERT, UPDATE, or DELETE, depending on the action taken for the row. Further, OUTPUT can direct merge results to a table for auditing purposes.
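To see how the WHEN branches and the OUTPUT $action value relate, here is an illustrative Python model (not T-SQL; all names are invented) that applies the three WHEN rules and emits one (action, source key, target key) row per modification, loosely mirroring $action, inserted, and deleted:

```python
def merge_with_output(source: dict, target: dict) -> list:
    """Apply MERGE-style rules to target; return an OUTPUT-like action log."""
    output = []
    for key in sorted(source.keys() | target.keys()):
        if key in source and key in target:       # WHEN MATCHED
            if target[key] != source[key]:
                target[key] = source[key]
                output.append(("UPDATE", key, key))   # inserted and deleted both set
        elif key in source:                       # WHEN NOT MATCHED BY TARGET
            target[key] = source[key]
            output.append(("INSERT", key, None))      # deleted side is NULL
        else:                                     # WHEN NOT MATCHED BY SOURCE
            del target[key]
            output.append(("DELETE", None, key))      # inserted side is NULL
    return output

source = {10: "A", 20: "B"}
target = {20: "X", 30: "C"}
log = merge_with_output(source, target)
print(log)  # [('INSERT', 10, None), ('UPDATE', 20, 20), ('DELETE', None, 30)]
```

Unmatched rows on either side get a NULL on the side that has no row, just as the inserted and deleted pseudo-tables behave in the real OUTPUT clause.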

 

MERGE is extremely versatile in that the source columns on the INSERT and UPDATE statements can contain expressions. Also, additional criteria can be specified before doing any of the data modification operations. For example, if a timestamp column is available, we can require that a target row is over one year old before deleting it. Or, before updating a row, SQL Server can be made to check for a change in the data before using resources to perform an update:

 

WHEN MATCHED AND
    (Customer_Target.LstNam<>Customer_Source.LstNam
  Or Customer_Target.Street<>Customer_Source.Street
  Or Customer_Target.ZipCod<>Customer_Source.ZipCod) THEN

 

In this example, data is updated only if the last name, street, or ZIP code changes.
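The same "update only what changed" test is easy to model outside of SQL. This hypothetical Python sketch compares just the watched columns, mirroring the WHEN MATCHED AND predicate above (column names follow the QCUSTCDT example; the row values are made up):

```python
# Columns whose changes should trigger an update
WATCHED = ("LstNam", "Street", "ZipCod")

def needs_update(target_row: dict, source_row: dict) -> bool:
    """True only if a watched column differs between the two rows."""
    return any(target_row[c] != source_row[c] for c in WATCHED)

old = {"LstNam": "Henning", "Street": "4859 Elm Ave", "ZipCod": "75217", "BalDue": 37.0}
print(needs_update(old, dict(old, BalDue=0.0)))      # False: only a non-watched column changed
print(needs_update(old, dict(old, ZipCod="75218")))  # True: a watched column changed
```

Skipping the unchanged rows is what saves SQL Server the cost of needless writes.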

 

Unfortunately, when working with linked servers, MERGE is unidirectional. That is, it can merge data only to a local SQL Server table. Merging from a SQL Server source to a linked server target is not allowed. Therefore, MERGE should be used only when moving data in one direction--from DB2 to SQL Server.

Performance

To get an idea of how MERGE performs against individually written statements, I decided to export data from the SQL Server Adventure Works sample database to our trusty AS/400. I picked a single table called SalesOrderDetail that contains 121,317 rows. The table has an 8-byte composite primary key and contains about 120 bytes per row. To give the MERGE statement some work to do, I did the following:

• added about 30K bogus records that it would have to delete

• changed data in about 30K rows that it would have to update

• deleted about 30K rows that it would have to insert

• left about 30K rows alone that it should ignore

 

All in all, MERGE had about 90K rows to process for this test. Over three trials, MERGE averaged about 39 seconds (although neither the test workstation nor the AS/400 has impressive specs!). For comparison, I ran three trials using individual INSERT, UPDATE, and DELETE statements, which averaged 49 seconds. So in this case, MERGE offered about a 20 percent performance improvement when accessing a remote data source.
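For the record, the quoted improvement is simple arithmetic on the two trial averages:

```python
# Average elapsed times from the three trials of each approach
merge_avg = 39.0        # seconds, MERGE statement
individual_avg = 49.0   # seconds, individual INSERT/UPDATE/DELETE

improvement = (individual_avg - merge_avg) / individual_avg * 100
print(round(improvement, 1))  # 20.4
```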

 

The T-SQL statements used in the trial are shown at the end of this tip.

MERGE Ahead!


MERGE statements should prove easier to maintain and debug. If a column is added to the primary key of your merge tables, for instance, you only need to change the correlation once in the ON clause instead of in multiple statements. Also, since MERGE is T-SQL-based, you don't have to worry about the complexity of adding an Integration Services package to move the data. Finally, MERGE offers better performance than individual distributed query SQL statements. When moving data from DB2 to SQL Server, MERGE is an efficient, easy way to go.

References

Trial T-SQL Statements

--T-SQL Script comparing MERGE vs distributed query statement performance
--
-- Step 1 - Mangle the data to give MERGE/statements
--          something to do.
--          (assumes SQL Server table is sync'd with
--          AS/400 table to begin with)
--
-- Delete every fourth row to give Merge
-- something to Insert
--
Delete
  From Sales.SalesOrderDetail
 Where (SalesOrderDetailID % 4)=0
GO
--
-- Update a few columns on every 3rd row
-- to force Merge to update existing data
--
Update Sales.SalesOrderDetail
   Set OrderQty=OrderQty+5,
       CarrierTrackingNumber=CarrierTrackingNumber+''
 Where (SalesOrderDetailID % 3)=0
GO
--
-- Insert some bad rows that will have to be
-- deleted by Merge
--
SET IDENTITY_INSERT Sales.SalesOrderDetail ON
GO
INSERT INTO Sales.SalesOrderDetail
(SalesOrderID,SalesOrderDetailId,CarrierTrackingNumber,
 OrderQty,ProductID,SpecialOfferID,
 UnitPrice,UnitPriceDiscount,rowguid,
 ModifiedDate)
Select -SalesOrderID,SalesOrderDetailId,CarrierTrackingNumber,
       OrderQty,ProductID,SpecialOfferID,
       UnitPrice,UnitPriceDiscount,rowguid,
       ModifiedDate
  From Sales.SalesOrderDetail
 Where SalesOrderDetailID % 2=0
GO
SET IDENTITY_INSERT Sales.SalesOrderDetail OFF
GO

--
-- Step 2 -- OK Merge, do your stuff!
--
-- Change the updated rows, delete the new bad rows
-- and re-insert the deleted rows
--
SET IDENTITY_INSERT Sales.SalesOrderDetail ON
GO
MERGE Sales.SalesOrderDetail AS Target
USING OpenQuery(DB2_ODBC,
      'Select * From AdvWorks.SalesOrderDetail') AS Source
ON (Target.SalesOrderId=Source.SalesOrderId
And Target.SalesOrderDetailId=Source.SalesOrderDetailId)
WHEN MATCHED AND
      (Target.CarrierTrackingNumber<>Source.CarrierTrackingNumber
    Or Target.OrderQty<>Source.OrderQty
    Or Target.ProductID<>Source.ProductID
    Or Target.SpecialOfferID<>Source.SpecialOfferID
    Or Target.UnitPrice<>Source.UnitPrice
    Or Target.UnitPriceDiscount<>Source.UnitPriceDiscount) THEN
UPDATE
   SET CarrierTrackingNumber = Source.CarrierTrackingNumber,
       OrderQty = Source.OrderQty,
       ProductID = Source.ProductID,
       SpecialOfferID = Source.SpecialOfferID,
       UnitPrice = Source.UnitPrice,
       UnitPriceDiscount = Source.UnitPriceDiscount,
       rowguid = Source.rowguid,
       ModifiedDate = Source.ModifiedDate
WHEN NOT MATCHED BY TARGET THEN
INSERT
(SalesOrderID,SalesOrderDetailId,CarrierTrackingNumber,
 OrderQty,ProductID,SpecialOfferID,
 UnitPrice,UnitPriceDiscount,rowguid,
 ModifiedDate)
VALUES
(Source.SalesOrderID,Source.SalesOrderDetailId,
 Source.CarrierTrackingNumber,Source.OrderQty,
 Source.ProductID,Source.SpecialOfferID,
 Source.UnitPrice,Source.UnitPriceDiscount,
 Source.rowguid,Source.ModifiedDate)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE
;
SET IDENTITY_INSERT Sales.SalesOrderDetail OFF
GO
-- Trial 1 - 41 seconds
-- Trial 2 - 36 seconds
-- Trial 3 - 39 seconds
-- Avg - ~39 seconds

--
-- Step 3 - Use individual statements to
--          synchronize data
--
-- (The data in the SQL Server table needs to be
--  reset again - run Step 1)
--
-- Using individual statements, change the updated rows,
-- delete the new bad rows and insert the deleted rows
--
UPDATE Target
   SET CarrierTrackingNumber = Source.CarrierTrackingNumber,
       OrderQty = Source.OrderQty,
       ProductID = Source.ProductID,
       SpecialOfferID = Source.SpecialOfferID,
       UnitPrice = Source.UnitPrice,
       UnitPriceDiscount = Source.UnitPriceDiscount,
       rowguid = Source.rowguid,
       ModifiedDate = Source.ModifiedDate
  From Sales.SalesOrderDetail Target
  Join OpenQuery(DB2_ODBC,'Select * From ADVWORKS.SalesOrderDetail') Source
    On Target.SalesOrderDetailID=Source.SalesOrderDetailID
   And Target.SalesOrderID=Source.SalesOrderID
 Where (Target.CarrierTrackingNumber<>Source.CarrierTrackingNumber
    Or Target.OrderQty<>Source.OrderQty
    Or Target.ProductID<>Source.ProductID
    Or Target.SpecialOfferID<>Source.SpecialOfferID
    Or Target.UnitPrice<>Source.UnitPrice
    Or Target.UnitPriceDiscount<>Source.UnitPriceDiscount)
SET IDENTITY_INSERT Sales.SalesOrderDetail ON
GO
INSERT INTO Sales.SalesOrderDetail
(SalesOrderID,SalesOrderDetailId,CarrierTrackingNumber,
 OrderQty,ProductID,SpecialOfferID,
 UnitPrice,UnitPriceDiscount,rowguid,
 ModifiedDate)
Select SalesOrderID,SalesOrderDetailId,CarrierTrackingNumber,
       OrderQty,ProductID,SpecialOfferID,
       UnitPrice,UnitPriceDiscount,rowguid,
       ModifiedDate
  From OpenQuery(DB2_ODBC,'Select * From ADVWORKS.SalesOrderDetail ORDER BY SalesOrderDetailId') Source
 Where Not Exists
     (Select *
        From Sales.SalesOrderDetail Target
       Where Target.SalesOrderDetailID=Source.SalesOrderDetailID
         And Target.SalesOrderID=Source.SalesOrderID)
  Order By SalesOrderDetailId
GO
SET IDENTITY_INSERT Sales.SalesOrderDetail OFF
GO
Delete From Sales.SalesOrderDetail
 Where Not Exists
     (Select *
        From OpenQuery(DB2_ODBC,'Select SalesOrderId,SalesOrderDetailId
                                 From ADVWORKS.SalesOrderDetail') Source
       Where Sales.SalesOrderDetail.SalesOrderDetailID=Source.SalesOrderDetailID
         And Sales.SalesOrderDetail.SalesOrderID=Source.SalesOrderID)
-- Trial 1 - 47 seconds
-- Trial 2 - 51 seconds
-- Trial 3 - 49 seconds
-- Avg - 49 seconds

 

 

Michael Sansoterra is a DBA for Broadway Systems in Grand Rapids, Michigan. He can be contacted at sqlsleuth@gmail.com.

