
Exploit IBM DB2 for i Database Development with JDBC 4.0


Learn how to exploit enhanced BLOB/CLOB support, SQL XML data type, and better connection management.

 

If, like me, you have been watching the evolution of the JDBC specification closely, you have probably noticed how it morphed from a rudimentary data-access object model into a mature, function-rich programming interface. Typically, the Rochester development lab does a pretty good job of providing robust, JDBC-compliant drivers that you can use to access DB2 for i from Java applications. This tradition continues with the latest version of the DB2 for i drivers, which implement a host of JDBC 4.0 enhancements.

 

In this article, I provide a number of practical examples that illustrate how you can improve your productivity by tapping into the new APIs, such as enhanced BLOB/CLOB support, SQL XML data type, and better connection management.

Prerequisites

Currently, JDBC 4.0 is implemented in the following DB2 for i JDBC drivers:

  • JTOpen, which is the open-source version of IBM Toolbox for Java
  • IBM Developer Kit for Java (a.k.a. native driver)

 

The JDBC 4.0 API Specification is part of Java SE 6. This means that the JDBC 4.0-compliant drivers do not work with earlier versions of Java Virtual Machine (JVM).

 

In the case of JTOpen, the Rochester development lab ships two versions of this driver:

 

  • jtopen_6_2.zip, the JDBC 3.0-compliant driver that supports earlier versions of Java
  • jtopen_6_2_jdbc40.zip, the JDBC 4.0 driver that requires Java 6 runtime

 

So, to take advantage of the new functionality, you need to install a Java 6 runtime and the JDBC 4.0 version of the JTOpen driver.

 

You may be wondering when the IBM Toolbox for Java licensed product will support the JDBC 4.0 specification. Here, the development lab decided to keep the same jt400.jar name so that you will not need to update your environments, or juggle multiple jars in multiple directories, when you move to JDBC 4.0. The existing JDBC 3.0 jar in the licensed Toolbox for Java LPP will be replaced with the JDBC 4.0 (Java 1.6-compiled) version once Java 1.6 becomes the lowest supported Java level on IBM i. The native JDBC driver developers adopted a slightly different approach: if your job uses one of the Java 6 JVMs, the system automatically switches to the JDBC 4.0-compliant version of the driver.
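
With a JDBC 4.0 driver jar on the classpath, automatic driver loading means you no longer need the familiar Class.forName call before asking the DriverManager for a connection. The following is a minimal sketch using the JTOpen driver; the host name, user ID, and password are placeholders you would replace with your own values:

import java.sql.Connection;
import java.sql.DriverManager;

public class ConnectDemo {
    public static void main(String[] args) throws Exception {
        // No Class.forName("com.ibm.as400.access.AS400JDBCDriver") is required;
        // the JDBC 4.0 jar advertises the driver through META-INF/services,
        // so the DriverManager loads it implicitly.
        Connection con = DriverManager.getConnection(
                "jdbc:as400://myibmi", "myuser", "mypassword");
        System.out.println("Connected to " +
                con.getMetaData().getDatabaseProductName());
        con.close();
    }
}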

What's New in the DB2 for i JDBC Drivers

The list of JDBC 4.0 enhancements was finalized very shortly before Java SE 6 became generally available. In fact, Sun pulled some proposed enhancements from the spec just before it was published; for example, annotation-based SQL queries didn't make it into the final version. So let's start with a quick overview of the features that are part of the spec and see how they're supported in the DB2 for i drivers:

 

Feature Overview

Each feature below is listed with its support status in the native and JTOpen drivers, plus any caveats:

  • Automatic Driver Loading (Native: Y, JTOpen: Y). The DriverManager implicitly loads the driver so that you don't have to remember the driver class name. This feature requires Service Refresh 2 or later of the IBM J9 JDK 1.6 to work with the native driver; it works with the GA version of the classic JVM.

  • Enhanced BLOB/CLOB Support (Native: Y, JTOpen: Y). New methods for creating, inserting, and releasing BLOBs and CLOBs.

  • Client Info Support (Native: Y, JTOpen: Y). The application can associate client-specific information, such as application name or client host name, with the connection object.

  • Enhanced Exception Management (Native: Y, JTOpen: Y). Support for chained exceptions and use of the enhanced for-each loop to process them; a short sketch follows this list.

  • National Character Set Support (Native: Y, JTOpen: Y). Support for the NClob interface, plus setter and update methods added to the PreparedStatement, CallableStatement, and ResultSet interfaces to facilitate National Character Set conversions.

  • ROWID Support (Native: Y, JTOpen: Y). Support for the new RowId interface, which can be used to access the SQL ROWID data type directly from a Java class.

  • SQL XML Data Type (Native: Y, JTOpen: Y). Support for the SQLXML interface and methods for creating, inserting, and releasing SQLXML objects. DB2 for i does not support the XML data type yet; the driver automatically maps the Java SQLXML object to an appropriate DB2 data type, such as CLOB.

  • Wrapper Pattern (Native: Partial, JTOpen: Partial). Support for the Wrapper interface, which allows applications to exploit vendor-specific behavior encapsulated in JDBC objects such as Connection. Wrapper support exists in the driver class hierarchy, but the class hierarchy does not have a single parent interface other than Sun's interface, which limits the feature's usefulness.
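
To see the enhanced exception management in action: in JDBC 4.0, SQLException implements Iterable<Throwable>, so the whole chain of exceptions and their causes can be walked with the enhanced for loop. Here is a minimal sketch, where stmt and sql are placeholders for whatever Statement and SQL text you are executing:

try {
    stmt.executeUpdate(sql);
} catch (SQLException ex) {
    // The for-each loop walks the chained SQLExceptions and their causes.
    for (Throwable t : ex) {
        System.out.println("Error encountered: " + t.getMessage());
    }
}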

 

Sample Code Walkthrough

As a software consultant, I often address questions on how to efficiently process BLOB/CLOB and XML data. So, to illustrate how new JDBC 4.0 features can be used to streamline the manipulation of such data types in Java, I coded up a couple of simple classes (see Taking Advantage of JDBC 4.0).

 


Enhanced Blob Support Sample

 

The EnhancedBlobDemo class swaps a picture for one of the employee records stored in the EMP_PHOTO table, which is part of the DB2 sample database. You can create this sample database on your system by calling the following stored procedure:

 

call qsys.create_sql_sample('SAMPLE')

 

The stored procedure needs to be called from an SQL utility such as Run SQL Scripts in System i Navigator's GUI. Make sure the schema name that is passed as the input parameter is uppercased.
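
If you prefer to create the sample database programmatically, the same procedure can also be called through JDBC. Here's a minimal sketch, assuming con is an open Connection to the system:

CallableStatement cs = con.prepareCall("CALL QSYS.CREATE_SQL_SAMPLE(?)");
cs.setString(1, "SAMPLE");   // the schema name must be passed in uppercase
cs.execute();
cs.close();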

 

The business logic implemented in the class is quite straightforward: I retrieve the original picture as a BLOB, write this object into a binary stream file in the local file system (IFS if run on IBM i), read a new picture from a stream file into a BLOB, and finally update the current row with the new image. The following code snippet illustrates these actions:

 

query = "select empno, picture from sample.emp_photo " +

        "where empno ='000130' and photo_format = 'gif'";

pstmt = con.prepareStatement(query, ResultSet.TYPE_FORWARD_ONLY,

        ResultSet.CONCUR_UPDATABLE);                                 [1]   

rs = pstmt.executeQuery();                                           [2]

System.out.println("resultset retrieved");            

if (rs.next()) {

    Blob oldPict = rs.getBlob(2);                                    [3]

    this.writeBlobToFile(oldPict, "./OriginalPictureFor" +

            rs.getString(1).trim() + ".gif");                        [4]

    Blob newPict = this.readBlobFromFile("./NewPictureFor" +

            rs.getString(1).trim() + ".gif");                        [5]

    rs.updateBlob(2, newPict);

    rs.updateRow();                                                  [6]

    oldPict.free();                                                  [7]

    newPict.free();                                                 

}

 

At [1], a PreparedStatement object is created. Note that the statement is set up so that it produces an updateable result set (at [2]). At [3], the value of the PICTURE column is retrieved into a Blob object. Keep in mind that, to optimize BLOB processing, the driver associates a locator with the BLOB value; typically, the binary stream representing the picture is not materialized in the Java runtime at this point. That happens at [4], when the writeBlobToFile method is called to persist the picture in the local file system. At [5], the new picture is read from a stream file into a Blob object. I'll explain the workings of the readBlobFromFile method in the next section. At [6], the image is used to update the PICTURE column of the current row in the result set. This works because I defined the ResultSet as updateable (at [1]). Finally, at [7], the resources held by the BLOBs are released by invoking the free method on the Blob interface, which was added in JDBC 4.0.
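
The writeBlobToFile method itself ships with the downloadable source. A minimal sketch of how such a method might look (the exact code in the download may differ) simply copies the Blob's binary stream to a FileOutputStream:

private void writeBlobToFile(Blob blob, String outputFileName) throws Exception {
    InputStream in = blob.getBinaryStream();               // materializes the BLOB data as a stream
    OutputStream out = new FileOutputStream(outputFileName);
    byte[] buffer = new byte[8192];
    int bytesRead;
    while ((bytesRead = in.read(buffer)) != -1) {           // copy until end of stream
        out.write(buffer, 0, bytesRead);
    }
    out.close();
    in.close();
}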

 

Let's now quickly glance over the source code of the readBlobFromFile method:

 

private Blob readBlobFromFile(String inputFileName) throws Exception {
    Blob blob = con.createBlob();                                    [1]
    File binFile = new File(inputFileName);
    InputStream bin = new FileInputStream(binFile);                  [2]
    int availableLength = bin.available();
    byte[] totalBytes = new byte[availableLength];
    int bytedata = bin.read(totalBytes);                             [3]
    bin.close();
    blob.setBytes(1, totalBytes);                                    [4]
    return blob;
}

 

The purpose of this method is to read a BLOB from a binary stream file and materialize it as an object in the JVM. The createBlob() method invoked at [1] was added to the Connection interface in the JDBC 4.0 specification; in the past, the lack of such a method was perceived by some developers as a painful omission in the spec. This new method returns an empty object whose class implements the java.sql.Blob interface. At [2], an InputStream is instantiated; this stream is used to read the bytes from the file. At [3], the picture's binary image is loaded into a byte array. Then, at [4], the byte array is written into the Blob object instantiated at [1].

 

I encourage you to download and review the source code since it contains other programming techniques, such as automatic driver loading, that I will not cover in this article.

 

SQL XML Data Type

 

Currently, DB2 for i does not natively support the XML data type. However, you can still use DB2 as an XML repository. One possible approach is to use IBM middleware to facilitate XML-to-relational mapping; I refer here to the DB2 XML Extender licensed product, which has been available on the platform for several years (see Additional Material for more info).

 

Another approach is to programmatically process XML and use the database as a data store. In fact, Java has a number of APIs that allow you to efficiently process XML documents: DOM, SAX, and XPath, to mention just a few. The support for the SQL XML data type in the JDBC driver allows you to tap into the capabilities of these powerful Java APIs, so you can manipulate XML objects in Java and then use the driver to manage the objects' persistence. In the case of DB2 for i, the driver automatically converts the Java SQLXML data type into appropriate DB2 data types, such as CLOB or NCLOB. To illustrate these concepts, I coded up a Java class called EnhancedSQLXMLDemo. The class reads an XML document from a local file system and instantiates it as an SQLXML object. The document represents hardware tools sales data for one country, grouped by region and brand. Here's a short excerpt to show the XML data hierarchy:

 

<CountrySalesByRegion Date="2006-04-07"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:noNamespaceSchemaLocation="CountrySalesByRegion.xsd">
  <CountryInfo>
    <Name>USA</Name>
  </CountryInfo>
  <Regions>
    <Region>
      <Name>East</Name>
      <Brand>
        <Name>Bosch</Name>
        <Sales>
          <Currency></Currency>
          <Amount></Amount>
        </Sales>
        <Returns>
          <Currency>USD</Currency>
          <Amount>39.95</Amount>
        </Returns>
      </Brand>
...

 

Once the document is materialized as an SQLXML object, it is stored in a DB2 table called JTEST2. The DDL for this table is shown below:

 

CREATE TABLE SAMPLE.JTEST2 (
    ID VARCHAR(10) CCSID 37 DEFAULT NULL,
    XMLDATA CLOB(100M) CCSID 1208 DEFAULT NULL)

 

Note that the CCSID 1208 keyword is used to indicate that the data needs to be stored in UTF-8 format.
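
The class materializes the document as an SQLXML object and inserts it into JTEST2; with JDBC 4.0, this can be done with the createSQLXML method added to the Connection interface and the setSQLXML setter on PreparedStatement. The following is a minimal sketch of that step (my own illustration, not necessarily the exact code in the download), assuming con is an open Connection and xmlFileName points at the document:

SQLXML sqlxml = con.createSQLXML();                       // empty SQLXML object (new in JDBC 4.0)
Writer writer = sqlxml.setCharacterStream();              // stream the document into the object
Reader reader = new BufferedReader(new FileReader(xmlFileName));
char[] buffer = new char[4096];
int charsRead;
while ((charsRead = reader.read(buffer)) != -1) {
    writer.write(buffer, 0, charsRead);
}
reader.close();
writer.close();

PreparedStatement ins = con.prepareStatement(
        "INSERT INTO sample.jtest2 (id, xmldata) VALUES (?, ?)");
ins.setString(1, "1000");
ins.setSQLXML(2, sqlxml);                                 // the driver maps SQLXML to the CLOB column
ins.executeUpdate();
ins.close();
sqlxml.free();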

 

In the next step, the data is retrieved from the table and materialized as an SQLXML object. The following code fragment shows how to perform these actions:

 

String psx = "SELECT xmldata FROM " + tableName +
             " WHERE id = ? ";
pstmt = con.prepareStatement(psx);                                   [1]
pstmt.setString(1, "1000");
ResultSet rs = pstmt.executeQuery();
if (rs.next()) {
    SQLXML sx = rs.getSQLXML(1);                                     [2]
}

 

At [1], a PreparedStatement is created. This statement is used to retrieve the XML document from the DB2 table. At [2], an SQLXML object is associated with a value stored in the table and retrieved through the ResultSet.

 

As mentioned, the driver will convert the CLOB data returned from DB2 into the SQLXML data type. Keep in mind, though, that the data will not be materialized in JVM until you explicitly read the data stream. This task is accomplished in the evaluateXPath method. This method allows you to use XPath expressions to retrieve relevant parts (nodes) from the XML document encapsulated in the SQLXML data type. For example, I can retrieve a list of all Sales Amounts from the XML document I described above using the following simple XPath expression:

 

//Sales/Amount/text()

 

Personally, I'm always amazed by the terse nature and incredible power of XPath. You can experiment with XPath using a sample class XPathTester that I included in the downloadable image. Let's quickly analyze the most important parts of the evaluateXPath method:

 

private void evaluateXPath(String query, SQLXML sx) {
    try {
        InputSource inputSource = new InputSource();                 [1]
        inputSource.setByteStream(sx.getBinaryStream());             [2]
        try {
            XPath xpath = XPathFactory.newInstance().newXPath();     [3]
            NodeList nodes = (NodeList) xpath.evaluate(query,
                    inputSource, XPathConstants.NODESET);            [4]
            if (nodes == null) {
                System.out.println("XPath query must return a node-set");
            } else {
                for (int i = 0; i < nodes.getLength(); i++) {        [5]
                    System.out.println("Name: " + nodes.item(i).getNodeName());
                    System.out.println("Value: " + nodes.item(i).getNodeValue());
                }
            }
        } catch (XPathExpressionException e) {
            System.out.println("Invalid XPath query: " + query);
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}

 

At [1], an InputSource is instantiated. At [2], its byte stream is set to the binary stream obtained from the SQLXML object, so the XML data is read into the InputSource directly from the SQLXML object. At [3], the XPathFactory is used to create an XPath object. At [4], the XPath object is used to evaluate the XPath expression passed as an input parameter; it returns a list of zero, one, or more text nodes that meet the search criteria. The results are displayed at [5].
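
Putting the pieces together, the SQLXML object retrieved earlier can be handed straight to this method; for example, a call along these lines lists every sales amount in the stored document:

evaluateXPath("//Sales/Amount/text()", sx);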

 

Client Info

 

The monitoring and performance tuning of multi-tier applications is a non-trivial task. In a typical environment, an application server uses a connection pool manager to assign database connections to the various applications running under its control. Although advantageous from a performance point of view, connection pooling makes it difficult to associate a given application with the connections it is using, so when you monitor the database it is hard to tell which application is hogging the system resources.

 

Luckily, the JDBC 4.0 spec introduces new APIs that you can use to set the connection's client-specific information, such as application name or host name of the client computer. Let's see how these new APIs work by analyzing another sample class called ClientInfoDemo. The class obtains a connection, reads the metadata to see which client info properties are supported by the database server, sets a number of these properties, and then reads them through a set of special database registers. The following code fragment shows the most important parts of the class' source code:

 

DatabaseMetaData dbmd = con.getMetaData();
ResultSet cipRS = dbmd.getClientInfoProperties();                    [1]
while (cipRS.next()) {                                               [2]
    System.out.println(cipRS.getString("NAME") + " " +
            cipRS.getString("MAX_LEN") + " "
            + cipRS.getString("DEFAULT_VALUE").trim() +
            " " + cipRS.getString("DESCRIPTION").trim());
}
Properties clientInfoProps = new Properties();                       [3]
clientInfoProps.setProperty("ApplicationName", "JDBCTester.ClientInfoDemo");
clientInfoProps.setProperty("ClientUser", "Jarek Miszczyk");
clientInfoProps.setProperty("ClientHostname", "dingo.rochestermn.ibm.com");
clientInfoProps.setProperty("ClientAccounting", "025-B105");
clientInfoProps.setProperty("ClientProgramID", "2008.07.11-ClientInfoDemo.java");
con.setClientInfo(clientInfoProps);                                  [4]
String query = "SELECT "                                             [5]
        + "trim(CURRENT CLIENT_APPLNAME) AS \"ApplicationName\","
        + "trim(CURRENT CLIENT_USERID) AS \"ClientUser\","
        + "trim(CURRENT CLIENT_WRKSTNNAME) AS \"ClientHostName\","
        + "trim(CURRENT CLIENT_ACCTNG) AS \"ClientAccounting\","
        + "trim(CURRENT CLIENT_PROGRAMID) AS \"ClientProgramID\""
        + " FROM sysibm.sysdummy1";
pstmt = con.prepareStatement(query);
rs = pstmt.executeQuery();
if (rs.next()) {                                                     [6]
    System.out.println("ApplicationName  : " + rs.getString(1).trim());
    System.out.println("ClientUser       : " + rs.getString(2).trim());
    System.out.println("ClientHostname   : " + rs.getString(3).trim());
    System.out.println("ClientAccounting : " + rs.getString(4).trim());
    System.out.println("ClientProgramID  : " + rs.getString(5).trim());
}

 

At [1], the database metadata is used to retrieve the list of client info properties that are supported by the back-end database. The list is displayed on the standard output device at [2]. At [3], a Properties object is created, and then five client-specific properties are set by the application. At [4], the setClientInfo method is invoked to associate the properties with the current connection. At [5], an SQL statement is constructed to test whether the properties were, in fact, properly set. This time, I use the database's special registers to read back the client info values. Note that the names of the special registers are similar to the JDBC properties but not identical; these names are database-vendor-specific.
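
Incidentally, if you only need to tag a single attribute, the Connection interface also offers a two-argument setClientInfo(String name, String value) method, so a one-property sketch could be as simple as this:

con.setClientInfo("ApplicationName", "JDBCTester.ClientInfoDemo");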

 

Once the client info is set for a connection, it can be used to easily identify workloads in the server-side monitoring tools. For example, the DB2 for i database monitor collects and stores the client info for each SQL statement that is executed through a given connection. Figure 1 illustrates the content of the database monitor record for the SELECT statement executed by the ClientInfoDemo class to retrieve the special registers' content.

 

Figure 1: Client info is recorded in the database monitor traces.

 

With the application-specific info recorded in the database monitor, it should be fairly easy to scope the performance analysis to only those statements that were submitted by a given application.

Taking Advantage of JDBC 4.0

The IBM DB2 for i drivers have been recently enhanced with a host of JDBC 4.0 features, and they should be your primary choice if your Java applications access DB2 on IBM i. The drivers are optimized and tuned for DB2 access, and the appropriate licenses are available on the system free of additional charges (included in IBM i software).

Additional Material

Download the source code and readme file that accompany this article.

 

The following publications can be helpful to those who want to learn more about the topics covered in this article:

 

The Ins and Outs of XML and DB2 for i5/OS, IBM ITSO Redbook

"DB2 Locking and Concurrency for Java Developers," MC Press article

Jarek Miszczyk

Jarek Miszczyk is a Lead Technical Consultant for System x Virtualization and Cloud Computing at the IBM STG Global ISV Enablement organization. He is located in Rochester, Minnesota.
