Learn how the latest DB2 for i features can reduce development time when interacting with the popular JSON data format.
In my prior tip about using JSON with DB2 for i, I described a method of accessing JSON with SQL using Java external user-defined functions (UDFs) that convert JSON to XML and XML to JSON. Starting in IBM i 7.1 TR10 (Database Group PTF Level 34) and IBM i 7.2 TR2 (Database Group PTF Level 5), IBM has added native JSON support to the DB2 for i database engine. Keep in mind that, in this first release, the JSON support is still considered a "technology preview," so be sure to test any JSON implementations carefully before using them in production. Also, based on the available documentation, some of these features and JSON functions are not yet officially documented by IBM for IBM i and may be subject to change.
The new JSON support comes in the form of a command-line processor, a Java API, and SQL functions. In this tip, I will discuss the SQL functions that can be used to store and parse JSON. Creating a JSON document from relational data will not be discussed at this time as it is not available via the SQL interface, but interested readers can review the Java API for information on how to do this (see the references section for where to learn more about the command line and Java API features).
Why JSON?
JSON is a popular data format, with many modern APIs requiring JSON input and returning JSON output. Having easy-to-learn tools in the toolbox to work with JSON is definitely a boon to developers. You can learn more about JSON at JSON.org.
Getting Started
In order to make the JSON SQL functions accessible on your IBM i partition, the following operation must be executed once within QSHELL:
/QIBM/ProdData/OS/SQLLIB/bin/db2nosql -setup enable
This step will install various JSON-related SQL functions in the schema SYSTOOLS. To use these functions, be sure to qualify them with the SYSTOOLS schema or ensure your SQL path includes SYSTOOLS by using the SET PATH statement.
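For example, one way to put SYSTOOLS at the front of the path for the current connection (a minimal sketch) is:
SET PATH = SYSTOOLS, CURRENT PATH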
Understanding JSON (CLOB) and BSON (BLOB)
It's important to note that, when you're using the native JSON support, DB2 for i will store JSON (character) data in a BLOB (binary) column in a special format IBM calls BSON. So if you're building a table to store a JSON document and intend to use the native DB2 support for storage and retrieval, choose the BLOB data type for the JSON column, rather than a CLOB.
There are new scalar functions called BSON2JSON and JSON2BSON that are used to cast JSON data to BSON and vice versa (more about this in a moment). Note the current maximum size of a JSON document is 16MB (far less than the DB2 for i 2GB large object limit).
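A quick way to see the two casting functions in action is to round-trip a small document through BSON and back; the sample document below is purely illustrative:
SELECT SYSTOOLS.BSON2JSON(SYSTOOLS.JSON2BSON('{"greeting": "Hello, IBM i"}')) AS ROUND_TRIP
FROM SYSIBM.SYSDUMMY1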
Storing JSON Data in a Table
Say you intend to build a DB2 table that stores information about flat files that are stored in Microsoft's OneDrive (formerly SkyDrive) cloud service. Further, a local copy of each OneDrive file is kept on the IFS of your IBM i. Moreover, information about each file is supplied by OneDrive as a JSON document. The table definition may look something like this:
CREATE TABLE MYLIB.FILE_INFO (
FILE_NAME VARCHAR(64) NOT NULL,
IFS_COPY_PATH VARCHAR(256) NOT NULL,
JSON_INFO BLOB(32K) NOT NULL)
Data, including a JSON document, can be inserted into this simple table as follows:
-- OneDrive Sample JSON Object (abridged)
-- https://msdn.microsoft.com/en-us/library/office/dn631834.aspx
INSERT INTO MYLIB.FILE_INFO
SELECT 'Processing.docx','/OneDrive/Processing.docx',
SYSTOOLS.JSON2BSON('{
"data": [
{
"id": "file.22688711f5410e6c.22688711F5410E6C!942",
"from": {
"name": "Mike Sansoterra",
"id": "22688711f5410e6c"
},
"name": "Processing.docx",
"description": null,
"parent_id": "folder.22688711f5410e6c.22688711F5410E6C!479",
"size": 12692,
"comments_count": 0,
"comments_enabled": true,
"is_embeddable": false,
"type": "file",
"shared_with": {
"access": "Everyone (public)"
},
"created_time": "2011-10-12T23:18:23+0000",
"updated_time": "2011-10-12T23:18:23+0000"
}
]
}
')
FROM SYSIBM.SYSDUMMY1
Notice that the JSON2BSON function will validate the JSON data and CAST it to the binary format (BSON) that DB2 requires for internal storage and processing. Once the JSON data is in the BLOB column, it will no longer be human-readable. To CAST the JSON data back from BSON to character, use the BSON2JSON function as shown here:
SELECT FILE_NAME,SYSTOOLS.BSON2JSON(JSON_INFO) AS JSON_INFO
FROM MYLIB.FILE_INFO
For the record, instead of being hard-coded as in the example, the JSON data from OneDrive would normally be retrieved via an API call. When working in DB2, this is most likely done with the HTTPGETCLOB function, which can be used to retrieve the file information from the Microsoft cloud-based REST API.
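As a rough sketch only (the URL below is illustrative and not a working OneDrive endpoint), retrieving the JSON over HTTP and casting it to BSON in one statement might look like this:
INSERT INTO MYLIB.FILE_INFO
SELECT 'Processing.docx','/OneDrive/Processing.docx',
SYSTOOLS.JSON2BSON(
   SYSTOOLS.HTTPGETCLOB('https://example.com/onedrive/file-info', '')) -- second argument holds optional HTTP headers
FROM SYSIBM.SYSDUMMY1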
Validating BSON Data
Please note you can dump anything into the BLOB column intended to hold JSON data, but don't expect the DB2 for i JSON functions to work if you do not convert the content to BSON first! Using the JSON functions against invalid data will generate an error that looks something like this:
SQL State: 22546
Vendor Code: -171
Message: [SQL0171] Argument 11 of function JSONVAL not valid.
If you suspect something is amiss with your JSON data, you can test the validity of a particular column using the BSON_VALIDATE function. This query returns all rows where column JSON_INFO does not contain valid BSON information:
SELECT *
FROM MYLIB.FILE_INFO
WHERE SYSTOOLS.BSON_VALIDATE(JSON_INFO)=0 /* 1=Valid 0=Invalid */
Extracting Elements from JSON Data
The JSON_VAL scalar function is used to extract individual elements from a JSON document. It accepts three arguments: a BLOB (BSON) column or variable, a path to the desired element, and a result data type (i.e., the DB2 data type to use when extracting the JSON data). Valid values for the result data type argument are shown in the following table:
Result Type   Function Return Type
'n'           DECFLOAT(34)
'i'           INTEGER
'l'           BIGINT
'f'           DOUBLE
'd'           DATE
'ts'          TIMESTAMP
't'           TIME
's'           VARCHAR
Consider the following query that extracts elements from the JSON_INFO column in the sample FILE_INFO table (refer to the sample INSERT statement to view the JSON doc):
SELECT JSON_VAL(JSON_INFO,'data.id','s') AS Id,
JSON_VAL(JSON_INFO,'data.name','s') AS FileName,
JSON_VAL(JSON_INFO,'data.from.name','s') AS FromName,
JSON_VAL(JSON_INFO,'data.shared_with.access','s') AS Access,
JSON_VAL(JSON_INFO,'data.size','l') AS Size,
TIMESTAMP(LEFT(REPLACE(
JSON_VAL(JSON_INFO,'data.created_time','s')
,'T',' '),19)) AS Created_Time -- Timestamp format doesn't work
FROM MYLIB.FILE_INFO
WHERE JSON_VAL(JSON_INFO,'data.name','s')='Processing.docx'
The above query returns the following:
ID           | file.22688711f5410e6c.22688711F5410E6C!942
FILENAME     | Processing.docx
FROMNAME     | Mike Sansoterra
ACCESS       | Everyone (public)
SIZE         | 12692
CREATED_TIME | 2011-10-12 23:18:23.000000
To understand how JSON_VAL works, consider the first column expression of the query. JSON_VAL(JSON_INFO,'data.id','s') is used to extract element data.id from the JSON data stored in column JSON_INFO and return it as a string. The data type DB2 assigns for a string is VARCHAR(2046). In other versions of DB2, a length can be specified (e.g., s:32), but DB2 for i currently doesn't appear to support this (which may be a bug).
For a second example, expression JSON_VAL(JSON_INFO,'data.size','l') extracts the size element and returns it as a BIGINT data type.
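Likewise, a smaller numeric element such as comments_count could be returned as an INTEGER with the 'i' result type (a hypothetical variation on the query above):
SELECT JSON_VAL(JSON_INFO,'data.comments_count','i') AS Comments_Count
FROM MYLIB.FILE_INFO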
You may have noticed in the query that I used some trickery to extract the created_time as a string and then convert it to a DB2 timestamp. The timestamp specified in the JSON document is in ISO 8601 format, but JSON_VAL doesn't seem to accept it; in fact, I couldn't get the timestamp result type (ts) to work at all after trying several different formats. Perhaps IBM is still working on this, or I simply need better documentation!
What About Arrays?
The previous JSON parsing example is somewhat simple in that all of the data can be extracted in a single row. In the next TechTip, I will introduce the JSON_TABLE and JSON_TABLE_BINARY table functions, which can be used to extract array elements as rows in a result set.
Jamming with JSON
DB2 for i now includes the ability to store and parse JSON documents. IBM is doing a great job; it used to be that Microsoft SQL Server would get new features long before IBM i did. I've been pleased to see that features like row and column access control (RCAC) and JSON are only now arriving in SQL Server 2016 (and in Azure). Great job, IBM! Because JSON is a popular format, especially in REST APIs, it behooves the IBM i developer to leverage the JSON processing capabilities of DB2 for i. Finally, when coupled with the DB2 HTTP functions, it becomes a cinch to contact a JSON REST service in real time, return the results, and parse the data in just a few SQL statements.
References
IBM DB2 for i: JSON Store Technology Preview—A comprehensive overview of the JSON features available for IBM i
Four-Part Series on DB2 JSON capabilities (Note: not all of the described JSON capabilities are available in DB2 for i. Article 3 of this series describes the Java API that can be used to compose JSON documents.)