
DB2 Integration with SQL Server 2005, Part 2: CLR Integration


The introduction of SQL Server 2005 has brought some exciting new features for developers. For instance, the ability to integrate heterogeneous systems with SQL Server in real time keeps getting easier. Part 1 of this series covered the power and ease of using linked servers as a means of programmatically providing real-time linkage to a remote database via ODBC or OLE DB. Although the options are limitless, this article demonstrates the power of using the .NET Common Language Runtime (CLR) as a means of integrating with remote databases (in particular, DB2 UDB for the iSeries).

What Is CLR?

CLR is the controlled environment used to execute code written in a .NET-compatible language such as C#, VB.NET, or COBOL.NET. All .NET programs—regardless of language—are compiled into something called the Microsoft Intermediate Language (MSIL). It is this intermediate language code that is actually run in the CLR. The concept employed is similar to the byte code that is run by the Java Virtual Machine (JVM).

How Does SQL Server 2005 Use the CLR?

Simply put, SQL Server 2005 (SQL Server hereafter) has the ability to execute compiled code written in any .NET language. This means SQL routines (usually written in T-SQL) such as stored procedures, scalar user-defined functions, and table-valued user-defined functions can be written in a .NET language.

Writing database server code in a .NET language has the following advantages:

  • Allows code reuse—The database logic and application logic now have access to the exact same code. In the past, an application would have code written in a high-level language, and the database server would often have duplicated code written in SQL.
  • Removes SQL limitations—A SQL dialect works well with structured database data within the confines of its own server. However, database logic often demands data from other sources, such as external databases on other platforms, various text files, LDAP queries, etc. Allowing the use of .NET code gives the database developer the ability to overcome these traditional SQL limitations.

Many of you by now realize that Microsoft is playing catch-up to something DB2 on the iSeries has been able to do for a long time: creating additional database routines using high-level language logic written in RPG, COBOL, C, Java, and others. Nevertheless, this is a welcome addition to SQL Server's capabilities.

The Need for CLR

Recall that linked servers blur the dividing line between SQL Server and other databases by allowing SQL Server to issue Data Definition Language (DDL) and Data Manipulation Language (DML) statements against remote database tables as though they were part of SQL Server. All of this can be done within the comfort of easy-to-understand T-SQL statements.
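
As a quick refresher from Part 1, linked server access boils down to plain T-SQL. The following is a minimal sketch (not from the article): it assumes a linked server named MYAS400 has already been defined and uses a pass-through query against the same QIWS.QCUSTCDT sample table used later in this article.

-- Hedged sketch: a pass-through query against an assumed linked server MYAS400
SELECT *
FROM OPENQUERY(MYAS400, 'SELECT * FROM QIWS.QCUSTCDT');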

With linked servers offering flexibility and ease of programming, why would we need to write a CLR routine? The answer lies in a few areas where linked server capabilities fall short:

  • Error-handling tasks, such as writing detailed info to a log, can be done more gracefully and thoroughly using .NET routines.
  • Dynamically changing remote database environments can be a chore with linked servers. For instance, pointing a linked server to a different DB2 database or a different machine altogether requires writing ugly dynamic SQL, using synonyms, or dropping the linked server and re-creating it with different attributes. A connection string, on the other hand, can be changed easily in .NET.
  • Caching the results of parameterized queries or stored procedures often requires the somewhat clumsy Insert-Exec T-SQL construct in order to save results from the remote database in a table (usually temporary) that T-SQL can use (see the sketch after this list). This step is unnecessary with .NET routines.
  • Compared with a CLR routine, a linked server offers little ability to massage data from a remote data source before handing it to SQL Server.
  • Remote data access routines may need the benefit of business logic or other routines only available within the .NET realm.
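
To illustrate the caching point above, here is a hedged sketch of the Insert-Exec pattern (the linked server MYAS400, the DB2 procedure QGPL.SPCUSTLIST, and the column list are all hypothetical): the remote results must be staged in a temporary table before T-SQL can filter, join, or otherwise reuse them.

-- Hedged sketch of the Insert-Exec workaround (all names are hypothetical)
CREATE TABLE #CustWork
   (CUSNUM NUMERIC(6,0), LSTNAM NCHAR(8), BALDUE NUMERIC(6,2));

INSERT INTO #CustWork (CUSNUM, LSTNAM, BALDUE)
EXEC ('CALL QGPL.SPCUSTLIST(''BALDUE>0'')') AT MYAS400;

SELECT * FROM #CustWork WHERE BALDUE > 100;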

To demonstrate CLR integration to a remote database, I'll create a table-valued user-defined function and a stored procedure written in VB.NET. These routines will call a DB2 query and return the results to SQL Server as though the data came from a local SQL Server table. (Note that both of these concepts can be accomplished similarly in DB2 on the iSeries using Java as demonstrated in "Query Remote Database Tables from the iSeries Using SQL and Java" and "Execute Remote Stored Procedures from the iSeries").

Software Requirements

The examples require iSeries Access V5R3 or higher (with the latest service pack) to be installed, along with the DB2 UDB for iSeries .NET managed provider component (the appropriate ODBC or OLE DB providers can be substituted as well). These components must be installed on the same machine as SQL Server.

Visual Studio 2005 (hereafter VS) is also required (not necessarily on the same machine) along with the SQL Server client tools (which will provide VS the templates for creating SQL Server routines and deploying them automatically). Understand that these procedures can be created outside of VS using Notepad, but the deployment and compilation instructions would be a chore to describe!

Setting Up SQL Server 2005

Since .NET code can do just about anything, including destructive tasks, for security reasons you must flip a switch in order to enable SQL Server 2005 to run .NET code. This feature can be enabled by starting the SQL Server Surface Area Configuration utility. Click on Surface Area Configuration for Features, expand the database engine node, click on CLR Integration, and then check the Enable CLR Integration box.

Alternatively, you can execute the following T-SQL code:

EXEC sp_configure 'show advanced options', 1
GO
RECONFIGURE
GO
sp_configure 'clr enabled', 1
GO
RECONFIGURE
GO

Next, you'll need to choose an existing database or create a new one to use for this exercise. I simply used the AdventureWorks sample database. Start a SQL Server Management Studio session (this unified tool replaces the old Enterprise Manager and Query Analyzer) and open a new query window connected to the database you've chosen.

Once you've decided on a database, mark it as "trustworthy" using the ALTER DATABASE statement so that it can reference "unsafe" .NET code (more on this in a minute):

Alter Database AdventureWorks
Set TRUSTWORTHY On

For these examples, we'll use the DB2 UDB for iSeries .NET managed provider, so we'll need to add a reference to this assembly. In a SQL Server CLR project, references to external assemblies are limited to the assemblies registered within SQL Server, so you cannot add an arbitrary assembly reference from Visual Studio as you would in other project types. To register an assembly that SQL Server can use, you must issue the CREATE ASSEMBLY SQL statement (this is done in SQL Server Management Studio within the database you've chosen).

In case you're new to .NET, an "assembly" refers to executable .NET code stored as a DLL file (somewhat similar in concept to an ILE service program). In the example below, IBM.Data.DB2.iSeries.dll is a reusable .NET assembly distributed by IBM.

CREATE ASSEMBLY IBMDataDB2iSeries
FROM 'C:\Program Files\IBM\Client Access\IBM.Data.DB2.iSeries.dll'
WITH PERMISSION_SET = unsafe;

This misleading SQL doesn't really create or compile any code; it only registers existing code for use by SQL Server. The CREATE/DROP ASSEMBLY syntax was implemented for consistency with other SQL statements. The FROM clause contains the fully qualified path to the actual assembly (which may vary on your computer, depending on your iSeries Access installation directory).

Finally, PERMISSION_SET allows three values: SAFE, EXTERNAL_ACCESS, and UNSAFE. SAFE indicates that the code requires no access outside of SQL Server. EXTERNAL_ACCESS allows the code to access resources such as files, networks, environment variables, and the registry. UNSAFE allows calls to unmanaged code (e.g., COM objects) and other code that is outside the control of the .NET Framework. The current DB2 provider assembly requires this setting. As the Microsoft documentation notes, grant UNSAFE only to highly trusted assemblies; otherwise, your system's security may be compromised or your system may become unstable. After executing this statement, SQL Server will issue a warning about mismatched .NET Framework versions (because the IBM DB2 for iSeries assembly was written for version 1.0 of the framework). You can ignore this warning.

Fire Up Visual Studio 2005

After starting a new VS session, create a new project. Select VB.NET as the language (expand this node) and choose the Database project type. Click on SQL Server Project in the templates window and assign a project name (such as MCPressDemo) and a solution name. The Add Database Reference window will appear. Select your database from the list or click the Add New Reference button to select your SQL Server and database name (again, I used AdventureWorks). If you're prompted with a message about debugging CLR code, choose Yes if this is a server you can tinker with (as debugging can impact performance).

If the SQL Server Project template isn't available in Visual Studio, the SQL Server client components may not be installed correctly.

Once the project is open, choose Properties from the Project menu and then the Database tab. Set the permission level to Unsafe because the project will contain a reference to the IBM DB2 assembly. Next, choose Add Reference from the Project menu. The list of references will be limited to some default .NET Framework assemblies and any assemblies registered to the database with the CREATE ASSEMBLY command. If you successfully ran the above CREATE ASSEMBLY statement, IBMDataDB2iSeries should appear in the list. Select this assembly and click OK. Your project code can now reference this library.

Stored Procedure Example

To create a .NET stored procedure, choose Add Stored Procedure from the Project menu and assign a name. VS will create stub code for the stored procedure including default framework references, an attribute indicating that the code will be used as a stored procedure, and a shared (aka static) method with the same name as the stored procedure. Simply fill in .NET logic in the stub code and then build and deploy the project. VS will take care of registering the stored procedure in SQL Server!

Stored procedures can do many things: execute logic, accept and return parameters, and return one or more result sets. Figure 1 shows sample stored procedure spDB2Demo that demonstrates these features. This procedure issues a basic SELECT query to DB2 and passes the DB2 result set back to SQL Server (the caller will not know that .NET code is actually going to DB2 to get the results). It also features an output parameter that returns the number of rows retrieved from DB2.

Imports System
Imports System.Data
Imports System.Data.Sql
Imports System.Data.SqlClient
Imports System.Data.SqlTypes
Imports Microsoft.SqlServer.Server
'
' NOTES:
'
' This project's assembly (because it uses the "UNSAFE"
' DB2 for iSeries assembly) must also be declared as unsafe
'
' You must register the DB2 UDB for iSeries .NET managed
' provider with the CREATE ASSEMBLY command
'
' CREATE ASSEMBLY IBMDataDB2iSeries
' FROM 'C:\Program Files\IBM\Client Access\IBM.Data.DB2.iSeries.dll'
' WITH PERMISSION_SET = unsafe;
'
' Further, once the assembly is available, you must add a
' reference to it in the .NET project so that it can be
' used in the code.
'
Imports IBM.Data.DB2.iSeries
'
' CLR Stored Procedure example that returns a resultset
' and an output parameter.
'
Partial Public Class StoredProcedures
    <Microsoft.SqlServer.Server.SqlProcedure()> _
    Public Shared Sub SPDB2Demo(ByRef parmRowCount As SqlInt32)
        '
        ' Number of rows returned
        '
        Dim intRowCount As Integer = 0
        '
        ' The SQL MetaData objects are used to define the attributes
        ' of the columns being returned in the result set.  In this
        ' example, the column definitions here match the column
        ' definitions in the DB2 QCUSTCDT table.
        '
        Dim sqlMd(10) As SqlMetaData
        sqlMd(0) = New SqlMetaData("CUSNUM", SqlDbType.Decimal, 6, 0)
        sqlMd(1) = New SqlMetaData("LSTNAM", SqlDbType.Char, 8)
        sqlMd(2) = New SqlMetaData("INIT", SqlDbType.Char, 3)
        sqlMd(3) = New SqlMetaData("STREET", SqlDbType.Char, 13)
        sqlMd(4) = New SqlMetaData("CITY", SqlDbType.Char, 6)
        sqlMd(5) = New SqlMetaData("STATE", SqlDbType.Char, 2)
        sqlMd(6) = New SqlMetaData("ZIPCOD", SqlDbType.Decimal, 5, 0)
        sqlMd(7) = New SqlMetaData("CDTLMT", SqlDbType.Decimal, 4, 0)
        sqlMd(8) = New SqlMetaData("CHGCOD", SqlDbType.Decimal, 1, 0)
        sqlMd(9) = New SqlMetaData("BALDUE", SqlDbType.Decimal, 6, 2)
        sqlMd(10) = New SqlMetaData("CDTDUE", SqlDbType.Decimal, 6, 2)
        '
        ' The column information is passed to an SQLDataRecord
        ' which will define the result set as a whole
        '
        Dim sqlR As SqlDataRecord = New SqlDataRecord(sqlMd)
        '
        ' The SqlContext object is used to identify the caller's
        ' context (or connection.)
        '
        ' Result sets are passed back to SQL Server via a 
        ' database connection.  Since this code will run
        ' in the context of an existing SQL Server connection
        ' we'll just use the existing connection.
        '
        ' Pass the result set structure to SQL Server
        '
        SqlContext.Pipe.SendResultsStart(sqlR)
        '
        ' Load the result set with DB2 data
        '
        Try
            '
            ' Connect to the iSeries, System i5, AS/400
            '
        Dim connDB2 As iDB2Connection = New _
            iDB2Connection("DATASOURCE=MyAS400;USERID=userid;PASSWORD=password")
            connDB2.Open()
            '
            ' Execute a SELECT statement and retrieve the results
            ' in a data reader
            '
            Dim cmdDB2 As iDB2Command = connDB2.CreateCommand()
            cmdDB2.CommandText = "Select * From QIWS.QCUSTCDT"
            Dim drDB2 As iDB2DataReader = cmdDB2.ExecuteReader()
            '
            ' Populate result set rows (in this example, the
            ' stored procedure result set is a representation
            ' of the data in the QIWS/QCUSTCDT sample table
            ' on the iSeries.)
            '
            While drDB2.Read
                sqlR.SetDecimal(0, drDB2.GetiDB2Decimal(0))
                sqlR.SetString(1, drDB2.GetString(1))
                sqlR.SetString(2, drDB2.GetString(2))
                sqlR.SetString(3, drDB2.GetString(3))
                sqlR.SetString(4, drDB2.GetString(4))
                sqlR.SetString(5, drDB2.GetString(5))
                sqlR.SetDecimal(6, drDB2.GetiDB2Decimal(6))
                sqlR.SetDecimal(7, drDB2.GetiDB2Decimal(7))
                sqlR.SetDecimal(8, drDB2.GetiDB2Decimal(8))
                sqlR.SetDecimal(9, drDB2.GetiDB2Decimal(9))
                sqlR.SetDecimal(10, drDB2.GetiDB2Decimal(10))
                SqlContext.Pipe.SendResultsRow(sqlR)
                intRowCount += 1
            End While
            '
            ' Close the DB2 Connection and pass the number 
            ' of rows retrieved as an output variable
            '
            drDB2.Close()
            parmRowCount = intRowCount

#If SendSecondResult Then
            '
            ' Signal end of the result set
            '
            SqlContext.Pipe.SendResultsEnd()
            '
            ' Theoretically, we can create a second result set
            ' and return it
            '
            Dim sqlResultSet2(2) As SqlMetaData
            sqlResultSet2(0) = New SqlMetaData("Field1", SqlDbType.Char, 10)
            sqlResultSet2(1) = New SqlMetaData("Field2", SqlDbType.Char, 50)
            sqlResultSet2(2) = New SqlMetaData("Field3", SqlDbType.Char, 20)
            sqlR = New SqlDataRecord(sqlResultSet2)

            SqlContext.Pipe.SendResultsStart(sqlR)
            drDB2 = cmdDB2.ExecuteReader()
            '
            ' Populate result set rows
            '
            While drDB2.Read
                sqlR.SetString(0, drDB2.GetString(0))
                sqlR.SetString(1, drDB2.GetString(1))
                sqlR.SetString(2, drDB2.GetString(2))
                SqlContext.Pipe.SendResultsRow(sqlR)
            End While
            drDB2.Close()
#End If
            cmdDB2.Dispose()
            connDB2.Close()
            connDB2.Dispose()
        Catch ex As iDB2Exception

            '
            ' Exceptions can be handled in almost any way.
            ' For simplicity, the error message is sent 
            ' back through a character variable for ease
            ' of debugging.
            '
            sqlR.SetString(1, ex.Message)
            SqlContext.Pipe.SendResultsRow(sqlR)
        End Try
        '
        ' Signal end of the result set
        '
        SqlContext.Pipe.SendResultsEnd()

    End Sub
End Class

Figure 1: Stored procedures can execute logic, accept and return parameters, and return one or more result sets.

Adding parameters to a .NET stored procedure is as easy as adding parameters to the method's signature. Passing parameters by value (ByVal keyword) causes SQL Server to recognize these parameters as input-only. Passing parameters by reference causes SQL Server to treat them as input/output. When you change your method's parameter signature, the stored procedure signature registered in the database will change as well when the latest project changes are deployed to SQL Server. Parameters passed between SQL Server and .NET code should use data types available in the System.Data.SqlTypes library to allow for things like null compatibility.
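
For example, here is a minimal sketch (with a hypothetical procedure name, not one of the article's figures) showing how the two parameter styles surface in SQL Server once the project is deployed:

Imports System.Data.SqlTypes
Imports Microsoft.SqlServer.Server

Partial Public Class StoredProcedures
    '
    ' Hedged sketch: parmState (ByVal) becomes an input parameter;
    ' parmRowCount (ByRef) becomes an input/output parameter.
    '
    <Microsoft.SqlServer.Server.SqlProcedure()> _
    Public Shared Sub SPParameterDemo(ByVal parmState As SqlString, _
                                      ByRef parmRowCount As SqlInt32)
        ' SqlTypes such as SqlString and SqlInt32 allow NULLs to flow
        ' between T-SQL and .NET code
        parmRowCount = 0
    End Sub
End Class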

Our next task is to make the .NET code return a result set to SQL Server. Passing back information as a result set involves three main steps: obtaining a connection to SQL Server, defining the result set's metadata, and writing the data one row at a time. (Incidentally, there are other ways to pass back information.)

Generally, since this CLR code is running in the context of a database connection, the existing connection is used as the pipe for returning results to SQL Server. Access to the existing connection's context is provided by the SqlContext object.

To indicate the structure of the result set, create an array of SqlMetaData objects. Each element of the array corresponds to a column name and its attributes in the result set. Next, create a variable of type Microsoft.SqlServer.Server.SqlDataRecord and pass the SqlMetaData array to its constructor. This record object is passed to SQL Server via the SqlContext.Pipe.SendResultsStart method and indicates to SQL Server that result set data will be arriving in the specified format.

At this point in the sample, a DB2 data reader object is opened and iterated. Each row from DB2 is copied column by column into the data record object and then sent to SQL Server using the SqlContext.Pipe.SendResultsRow method. Oddly enough, this is very similar to how external DB2 user-defined table functions work. After all the rows are processed, the SqlContext.Pipe.SendResultsEnd method is called to let SQL Server know the end of the result set has come.

As illustrated in the sample code, additional result sets can be returned by repeating the process with a new or existing data record definition and executing the SqlContext.Pipe.SendResultsStart method.

When finished, sample T-SQL code to run the stored procedure will look like this:

Declare @NoRows Int
Exec SPDB2Demo @NoRows OUTPUT
Print @NoRows

While this stored procedure is a somewhat trivial example of getting data from DB2, the point is that coding a stored procedure in a .NET language allows almost any conceivable programming function to be done through SQL Server, including the most difficult data integration tasks.

Table-Valued User-Defined Function Example

A table-valued user-defined function can be thought of as a "virtual table." Instead of querying data from a database table, table function code supplies the database server data in a tabular format. Similar to stored procedures, table functions can receive input parameters and perform logic, but they return only a single result set and have no output parameter capability.

In the next example, we'll code a CLR table function to get its data from DB2. When finished, the following T-SQL statement will get its data from the DB2 QIWS/QCUSTCDT table on the iSeries:

select * from dbo.DB2QCustCdt()

Didn't we already do this with the stored procedure? Yes, but it's important to realize that table functions have advantages over stored procedures. In particular, if a result set needs to participate in a join to another table or be sorted dynamically with an Order By, then a table function is usually a better tool. In other words, the result set of a stored procedure can't be modified or easily used in a subsequent query, but the result of a table function can.
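
For example, once the table function is deployed, a statement like the following becomes possible (the local CustomerNotes table and its columns are hypothetical, used only to illustrate the join); the equivalent is awkward at best with a stored procedure result set:

-- Hedged sketch: join the DB2-backed table function to a local SQL Server table
-- (dbo.CustomerNotes is hypothetical) and sort the combined result
SELECT c.CUSNUM, c.LSTNAM, c.BALDUE, n.NoteText
FROM dbo.DB2QCustCdt() AS c
LEFT JOIN dbo.CustomerNotes AS n ON n.CustomerNumber = c.CUSNUM
ORDER BY c.BALDUE DESC;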

To create your own .NET-coded table function in Visual Studio, choose Add User-Defined Function from the Project menu. Unfortunately, the supplied template code is for a scalar (single-value) user-defined function, and several modifications must be made to convert it to a table function. Notice that stub code is generated and that the defined method name matches the object name of the table function in SQL Server (once it's deployed). Additionally, adding ByVal parameters to the method signature equates to adding input parameters to the table function.

Figure 2 contains function DB2QCustCdt. The SqlFunction attribute is specified along with several properties needed to inform SQL Server of how the table function will be implemented. The table definition property, for instance, consists of the column names and SQL Server data types that the table function will return when executed. See the SQL Server help for a list of all the available properties and their roles.

Imports System
Imports System.Data
Imports System.Data.Sql
Imports System.Data.SqlTypes
Imports Microsoft.SqlServer.Server
Imports IBM.Data.DB2.iSeries            '.NET Managed Provider for iSeries
Partial Public Class UserDefinedFunctions
    '
    ' A shared method with the same name as the 
    ' table valued function is required as an
    ' entry point to the SQL routine.  This
    ' method must return an IEnumerable
    ' object.
    '
    ' The SqlFunction attribute is used to define the
    ' characteristics of the function. This information
    ' allows Visual Studio to issue the appropriate
    ' CREATE FUNCTION or ALTER FUNCTION statement to 
    ' SQL Server when the project is deployed.
    '
    '
    ' The FillRowMethodName must be unique within the class
    '
    <SqlFunction(FillRowMethodName:="FillRowQCustCdt", _
                 IsDeterministic:=False, _
                 IsPrecise:=True, _
                 TableDefinition:="CUSNUM NUMERIC(6, 0), " + _
                                  "LSTNAM NCHAR(8), " + _
                                  "INIT   NCHAR(3), " + _
                                  "STREET NCHAR(13), " + _
                                  "CITY   NCHAR(6), " + _
                                  "STATE  NCHAR(2), " + _
                                  "ZIPCOD NUMERIC(5, 0), " + _
                                  "CDTLMT NUMERIC(4, 0), " + _
                                  "CHGCOD NUMERIC(1, 0), " + _
                                  "BALDUE NUMERIC(6, 2), " + _
                                  "CDTDUE NUMERIC(6, 2)")> _
    Public Shared Function DB2QCustCdt() As IEnumerable
        '
        ' Don't forget to add error checking
        ' 
        ' Fill in your system name, user id & password
        ' (of course you normally wouldn't store hard coded
        '  credentials in this manner)
        '
        ' This sample requires the DB2 for iSeries .NET managed provider
        ' (available with iSeries Access V5R3)
        '
        ' 1) Connect to the iSeries, System i5, AS/400
        '
        Dim connDB2 As iDB2Connection = New _
            iDB2Connection("DATASOURCE=My_iSeries;USERID=userid;PASSWORD=password")
        connDB2.Open()
        '
        ' 2) Execute an SQL command and return the results as a data reader
        '
        Dim cmdDB2 As iDB2Command = connDB2.CreateCommand()
        cmdDB2.CommandText = "Select * From QIWS.QCUSTCDT"
        Dim drDB2 As iDB2DataReader = _
            cmdDB2.ExecuteReader(CommandBehavior.CloseConnection)
        Return drDB2

    End Function
    '
    ' This method is called frequently as the IEnumerable object
    ' returned from DB2QCustCdt is enumerated. Each enumeration
    ' represents a row to be returned to SQL Server.
    '
    Public Shared Sub FillRowQCustCdt( _
                       ByVal row As Object, _
                       ByRef CUSNUM As SqlDecimal, _
                       ByRef LSTNAM As SqlString, _
                       ByRef INIT As SqlString, _
                       ByRef STREET As SqlString, _
                       ByRef CITY As SqlString, _
                       ByRef STATE As SqlString, _
                       ByRef ZIPCOD As SqlDecimal, _
                       ByRef CDTLMT As SqlDecimal, _
                       ByRef CHGCOD As SqlDecimal, _
                       ByRef BALDUE As SqlDecimal, _
                       ByRef CDTDUE As SqlDecimal)
        '
        ' SQL Reader objects pass type System.Data.Common.DbDataRecord
        ' in the row object
        '
        Dim columns As Common.DbDataRecord = CType(row, Common.DbDataRecord)
        CUSNUM = columns.GetDecimal(0)
        LSTNAM = columns.GetString(1)
        INIT = columns.GetString(2)
        STREET = columns.GetString(3)
        CITY = columns.GetString(4)
        STATE = columns.GetString(5)
        ZIPCOD = columns.GetDecimal(6)
        CDTLMT = columns.GetDecimal(7)
        CHGCOD = columns.GetDecimal(8)
        BALDUE = columns.GetDecimal(9)
        CDTDUE = columns.GetDecimal(10)
    End Sub
End Class

Figure 2: In function DB2QCustCdt, the SqlFunction attribute is specified along with several properties needed to inform SQL Server of how the table function will be implemented.

Notice that method DB2QCustCdt returns an object that implements the IEnumerable interface. This is a requirement for all table functions written in .NET code. Many .NET classes use this important interface, including arrays, collections, and data readers. When SQL Server gets a request to run the DB2QCustCdt table function, it will call the DB2QCustCdt method and expect to retrieve an "enumerable" object. Each enumerated object represents a row to be returned in the result set. In this case, the sample code will return an iDB2DataReader.

When SQL Server enumerates (i.e., processes item by item) each row returned by the data reader, it will need a little more help to map data from the IEnumerable object to parameters that represent the table function's columns. The FillRowMethodName property of the SqlFunction attribute defines the method SQL Server will call when breaking up each enumerated row object into distinct data columns. The signature of the method will be a row object (as an input parameter) followed by an output parameter for each column in the table function. Please note that the project will not deploy if the signature for this FillRowMethodName method does not match the table columns defined in the TableDefinition property.

In this example, method FillRowQCustCdt will be called for each row returned by the data reader. In the case of the data reader class, the row parameter will be an object of type System.Data.Common.DbDataRecord, which is fortunate, because it offers methods to extract individual data columns from the DataReader. Each output parameter is filled in from the data reader before the method ends. Each call to FillRowQCustCdt represents one row being parsed and returned to SQL Server.

A Step Further

Figure 3 shows an enhanced example of the table function. It calls a DB2 stored procedure with a criteria parameter instead of a simple Select. This figure contains the DB2 stored procedure, the .NET code, and T-SQL usage examples.

-- Enhanced CLR UDTF example using
-- DB2 stored procedure and
-- parameterized criteria
--
--
-- Here is how the CLR function is
-- used in T-SQL to return DB2
-- data.
--
select * from dbo.DB2QCustCdt('BALDUE>0')


--
-- DB2 Stored Procedure Definition
-- 
-- Build the SQL statement dynamically
--
Create Procedure xxxxx/spCustList(parmWhere In VarChar(512))  
Language SQL                                               
Dynamic Result Sets 1                                      
Set Option Commit=*None,DynUsrPrf=*Owner
Begin                                                      
    Declare SQLText VarChar(1024) Default '' Not Null;     
    Declare CustomerList Cursor With Return to Client      
        For CustomerListStmt;                              
                                                           
    Set SQLText='Select * From QIWS/QCustCdt ';            
    If parmWhere>'' Then                                   
        If UCase(Left(LTrim(parmWhere),5))='WHERE' Then    
            Set SQLText=SQLText||parmWhere;                
        Else                                               
            Set SQLText=SQLText||' Where '||parmWhere;     
        End If;                                            
    End If;                                                
                                                 
    Prepare CustomerListStmt From SQLText;       
                                                 
    Open CustomerList;                           
End            


--
-- Modified CLR Code for stored procedure 
-- access instead of simple SELECT
--
Imports System
Imports System.Data
Imports System.Data.Sql
Imports System.Data.SqlTypes
Imports Microsoft.SqlServer.Server
Imports IBM.Data.DB2.iSeries

Partial Public Class UserDefinedFunctions
    '
    ' The FillRowMethodName must be unique within the assembly
    '
    <SqlFunction(FillRowMethodName:="FillRowQCustCdtSP", _
                 IsDeterministic:=False, _
                 IsPrecise:=True, _
                 TableDefinition:="CUSNUM NUMERIC(6, 0), " + _
                                  "LSTNAM NCHAR(8), " + _
                                  "INIT   NCHAR(3), " + _
                                  "STREET NCHAR(13), " + _
                                  "CITY   NCHAR(6), " + _
                                  "STATE  NCHAR(2), " + _
                                  "ZIPCOD NUMERIC(5, 0), " + _
                                  "CDTLMT NUMERIC(4, 0), " + _
                                  "CHGCOD NUMERIC(1, 0), " + _
                                  "BALDUE NUMERIC(6, 2), " + _
                                  "CDTDUE NUMERIC(6, 2)")> _
    Public Shared Function DB2QCustCdtSP(ByVal parmWhere As SqlString) As IEnumerable
        '
        ' Add error checking
        ' 
        ' Fill in your system name, user id & password
        '
        ' This sample requires the DB2 for iSeries .NET managed provider
        ' (available with iSeries Access V5R3)
        '
        Dim connDB2 As iDB2Connection = New _
            iDB2Connection("DATASOURCE=My_AS400;USERID=user;PASSWORD=password")
        Dim cmdDB2 As iDB2Command = connDB2.CreateCommand()
        connDB2.Open()
        ' 
        ' Replace your schema name in the stored procedure call
        '
        cmdDB2.CommandText = "Call xxxxx.spCustList('" + parmWhere.ToString() + "')"
        cmdDB2.CommandType = CommandType.Text

        Dim drDB2 As iDB2DataReader = _
            cmdDB2.ExecuteReader(CommandBehavior.CloseConnection)
        Return drDB2
    End Function

    Public Shared Sub FillRowQCustCdtSP( _
                       ByVal row As Object, _
                       ByRef CUSNUM As SqlDecimal, _
                       ByRef LSTNAM As SqlString, _
                       ByRef INIT As SqlString, _
                       ByRef STREET As SqlString, _
                       ByRef CITY As SqlString, _
                       ByRef STATE As SqlString, _
                       ByRef ZIPCOD As SqlDecimal, _
                       ByRef CDTLMT As SqlDecimal, _
                       ByRef CHGCOD As SqlDecimal, _
                       ByRef BALDUE As SqlDecimal, _
                       ByRef CDTDUE As SqlDecimal)
        '
        ' SQL Reader objects return a type of System.Data.Common.DbDataRecord
        ' 
        Dim columns As Common.DbDataRecord = CType(row, Common.DbDataRecord)
        CUSNUM = columns.GetDecimal(0)          'CType(columns(0), SqlDecimal)
        LSTNAM = columns.GetString(1)           'CType(columns(1), SqlString)
        INIT = columns.GetString(2)             'CType(columns(2), SqlString)
        STREET = columns.GetString(3)           'CType(columns(3), SqlString)
        CITY = columns.GetString(4)             'CType(columns(4), SqlString)
        STATE = columns.GetString(5)            'CType(columns(5), SqlString)
        ZIPCOD = columns.GetDecimal(6)          'CType(columns(6), SqlDecimal)
        CDTLMT = columns.GetDecimal(7)          'CType(columns(7), SqlDecimal)
        CHGCOD = columns.GetDecimal(8)          'CType(columns(8), SqlDecimal)
        BALDUE = columns.GetDecimal(9)          'CType(columns(9), SqlDecimal)
        CDTDUE = columns.GetDecimal(10)         'CType(columns(10), SqlDecimal)
    End Sub
End Class

Figure 3: This enhanced example of the table function calls a DB2 stored procedure with a criteria parameter instead of a simple Select.

Data Type Cross-Reference

One thing to be aware of is the need for data type compatibility between .NET, DB2, and SQL Server. The table below shows how to map DB2 data types to the equivalent .NET types and the resulting SQL Server types:

Data Type Compatibility Between .NET, DB2, and SQL Server

DB2 Data Type    .NET SQL Data Type    SQL Server Data Type
-------------    ------------------    --------------------
CHAR             SqlString             NCHAR
VARCHAR          SqlString             NVARCHAR
INTEGER          SqlInt32              INTEGER
SMALLINT         SqlInt16              SMALLINT
BIGINT           SqlInt64              BIGINT
DECIMAL          SqlDecimal            DECIMAL
NUMERIC          SqlDecimal            DECIMAL (or NUMERIC)
DATE             SqlDateTime           DATETIME
TIMESTAMP        SqlDateTime           DATETIME
DOUBLE           SqlDouble             FLOAT(53)


Notice that the DB2 FLOAT and TIME types are missing. I found DB2 Float to be problematic unless I mapped it to SqlDouble and then to Float(53) in SQL Server (although the expected type to use is REAL). Also, I had problems with the DB2 TIME data type. Mapping the TIME type to SqlDateTime implicitly converts to a time stamp without a date portion (which means the date portion is '01/01/0001'). An error occurs because, unlike DB2 time stamps, SQL Server's DATETIME cannot store this date value. It would probably be best to bring the TIME type into .NET as a string and then cast it back to a DATETIME value later.
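
As a rough sketch of that workaround (the SHIPTIME column name and its presence in the table function are hypothetical), the TableDefinition property could declare the column as NCHAR(8), the FillRow method could assign it from the data reader as a string, and T-SQL could then perform the conversion:

-- Hedged sketch: converting a DB2 TIME value returned as a character column
-- (SHIPTIME is a hypothetical column); SQL Server supplies the default
-- 1900-01-01 date portion during the cast.
SELECT CUSNUM, CAST(SHIPTIME AS DATETIME) AS ShipDateTime
FROM dbo.DB2QCustCdt();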

Deploying the Project

When your .NET routines have been written, press F5 to deploy the project to SQL Server (or choose Build and then Deploy from the VS Build menu). The deployment will register the project's assembly with SQL Server. Also, the appropriate stored procedure and table function definitions based on the method signatures in the code will be registered in the database. If signatures change, VS will alter the SQL routines appropriately! With all the .NET coding required, at least we don't have to manually write CREATE PROCEDURE and CREATE FUNCTION statements!
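
If you're curious about what deployment generates (or ever need to register a routine by hand), the statements look roughly like the following sketch. It assumes the assembly was registered under the project name MCPressDemo and uses the class and method names from the figures; if your VB project has a root namespace, the class name portion would need to include it (in brackets).

-- Hedged sketch of the registration statements Visual Studio issues on deployment
CREATE PROCEDURE dbo.SPDB2Demo @parmRowCount INT OUTPUT
AS EXTERNAL NAME MCPressDemo.StoredProcedures.SPDB2Demo;

CREATE FUNCTION dbo.DB2QCustCdt()
RETURNS TABLE (CUSNUM NUMERIC(6,0), LSTNAM NCHAR(8), INIT NCHAR(3),
               STREET NCHAR(13), CITY NCHAR(6), STATE NCHAR(2),
               ZIPCOD NUMERIC(5,0), CDTLMT NUMERIC(4,0), CHGCOD NUMERIC(1,0),
               BALDUE NUMERIC(6,2), CDTDUE NUMERIC(6,2))
AS EXTERNAL NAME MCPressDemo.UserDefinedFunctions.DB2QCustCdt;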

.NET Routines or Linked Server Access?

Writing .NET routines requires more work than writing linked server routines. When deciding to use .NET code vs. a linked server, consider whether .NET offers a function (custom business logic, etc.) that is not accessible to the linked server. For instance, you may need to join data from a remote Web service with DB2 data before passing it to SQL Server—something that .NET will allow you to do.

However, this article only touches on the capabilities of CLR routines with respect to real-time DB2 integration. Aggregates and triggers can also be written in .NET code. The granularity and flexibility afforded to the programmer when writing .NET routines is unsurpassed. If your company uses any of the myriad products that run on SQL Server 2005, you now have additional tools to create real-time, seamless integration routines with your flagship iSeries applications...and all with standard tools!

Michael Sansoterra is a DBA for Broadway Systems in Grand Rapids, Michigan.

