Category Archives: IBM i

Better date conversion with timestamp_format

Some time ago, I discussed the never-ending problem of converting a six digit number into an ISO date. I have since found a better way of doing this and, in order to save myself a search through the IBM Knowledge Center every time I need to do this, I’m putting it up here.

So here’s the problem: I have a table in which the date is represented as a 7 digit number. The first digit is the century (0 indicates 20th century, 1 indicates 21st century) and the next six digits represent the date in YYMMDD format. So 1150831, for example, is 31st August 2015.

The request in hand is for a report, so I want to convert the number into a date so that it looks sensible once exported into Excel. Timestamp_format is your friend.

The TIMESTAMP_FORMAT function returns a timestamp that is based on the interpretation of the input string using the specified format.

It’s that “interpretation of the input string” that makes this so handy. Put simply: you tell the function how to interpret the string and it will do the rest.
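
To see what this means in practice, here’s a minimal illustration (a sketch for an interactive SQL session; sysibm/sysdummy1 is just a convenient one-row table). The format string tells the function how to read the input, so both of these produce the same timestamp:

select timestamp_format('150901', 'YYMMDD'),
       timestamp_format('2015-09-01', 'YYYY-MM-DD')
from sysibm/sysdummy1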

Here’s an example. I have a contracts table (CONTRACTS) with three columns: Contract Number (CTNR), Start Date (STDT) and End Date (ENDT). The start and end date are both seven digit numbers as described previously. I need to list all of the currently active contracts (Start Date is before Today and End Date is after Today).

And the SQL looks like this:

select CTNR,
       date(timestamp_format(substr(digits(STDT), 2, 6), 'YYMMDD')),
       date(timestamp_format(substr(digits(ENDT), 2, 6), 'YYMMDD'))
from CONTRACTS
where date(timestamp_format(substr(digits(STDT), 2, 6), 'YYMMDD')) <= current_date
and date(timestamp_format(substr(digits(ENDT), 2, 6), 'YYMMDD')) >= current_date

Current_date is a special register that returns the current date, but that’s not important right now.

The date conversion part involves several nested functions to get to a final date, so here’s the breakdown using a date of 1st September 2015 (1150901):

  • digits(STDT) converts the 7 digit number into a 7 character string. So 1150901 is converted to ‘1150901’.
  • The IBM i can figure out which century it’s in by looking at the year, so I don’t need the century flag, and substr(digits(STDT), 2, 6) strips it out to give me a value of ‘150901’.
  • The timestamp_format function takes the date string and uses the format string of ‘YYMMDD’ to generate a timestamp of 2015-09-01-00.00.00.000000. Since I only passed it a date, the hours, minutes and seconds are all zeroes.
  • And finally, I can use the date function to retrieve the date from the generated timestamp.
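
If you want to sanity-check the whole chain without touching a real table, you can run it against a literal (a quick sketch; the dec() cast is only there to ensure that digits() returns exactly seven characters):

select date(timestamp_format(substr(digits(dec(1150901, 7, 0)), 2, 6), 'YYMMDD'))
from sysibm/sysdummy1
-- returns 2015-09-01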

And here’s the result:

CTNR      DATE      DATE
C1000001  01/09/15  31/08/16

Pretty, and portable.


Using QShell and CL to work with Stream files in the IFS

It turns out that there is no simple way of generating a list of files currently residing in a folder on the IBM i IFS. Simple, in this case, would be a command like DSPLNK OUTPUT(*FILE). An API does exist, but a combination of not enough time and too much lazy proved to be quite a disincentive for me going down that route.

The issue was that we receive a number of very big files and, to save a bit of bandwidth, these files were being zipped before being sent to the IBM i. Dropping the file into the IFS and unzipping it was easy enough but then I found myself with an archive folder containing one or more files. While I can set the name of the folder into which the files should be extracted, I have no way of determining beforehand what the file names will be.

Here’s a solution:

/* -------------------------------------------------------------------------- */
/* Program     : EXTRACTZIP                                                   */
/* Description : Retrieve and unpack a zipped archive                         */
/* Written by  : Paul Pritchard                                               */
/* Date        : 27/05/2015                                                   */
/* -------------------------------------------------------------------------- */
pgm
dcl &library *char 10 value('MYLIB')
dcl &fromfile *char 50
dcl &tombr *char 50 value('/qsys.lib/qtemp.lib/IMPORTP.file/IMPORTP.mbr')
dcl &dltfile *char 50
/* EXTRACTP drives the read loop below and must exist at compile time        */
dclf qtemp/EXTRACTP

/* Retrieve the zipped file and unzip it.                                    */
/* I won't bore you with the details here, but the production program is     */
/* retrieving the ZIP file from an FTP server and using PKZIP to unzip it.   */
/* The extract directory is /home/EXPATPAUL/EXTRACTFLR which now contains one */
/* or more stream files                                                       */

/* Retrieve a list of extracted files                                         */
/* First, I use QShell to list the files in EXTRACTFLR. The output of this is */
/* redirected to ExtractedFiles.TXT.                                          */
/* In order to use this information, I copy the ExtractedFiles.TXT stream    */
/* file to an ad-hoc physical file (QTEMP/EXTRACTP)                          */
qsh cmd('ls /home/EXPATPAUL/EXTRACTFLR/ > /home/EXPATPAUL/ExtractedFiles.TXT')
crtpf file(QTEMP/EXTRACTP) rcdlen(20)
cpyfrmstmf fromstmf('/home/EXPATPAUL/ExtractedFiles.TXT') +
           tombr('/qsys.lib/qtemp.lib/EXTRACTP.file/EXTRACTP.mbr') +
           mbropt(*replace)

/* And now I can use QTEMP/EXTRACTP to drive my way through EXTRACTFLR and    */
/* copy each of the files in the archive into the IMPORTP physical file.      */
dowhile '1'
    /* Read the name of the next extracted file; leave at end of file        */
    rcvf
    monmsg msgid(CPF0864) exec(LEAVE)

    /* Copy the next stream file from the archive                            */
    chgvar &fromfile value('/home/EXPATPAUL/EXTRACTFLR/' *tcat &EXTRACTP)
    cpyfrmstmf fromstmf(&fromfile) tombr(&tombr) mbropt(*add)

    /* and then delete the stream file                                       */
    chgvar &dltfile value('rm /home/EXPATPAUL/EXTRACTFLR/' *tcat &EXTRACTP)
    qsh cmd(&dltfile)
enddo

/* Clean up and exit                                                          */
qsh cmd('rm /home/EXPATPAUL/ExtractedFiles.TXT')
dltf qtemp/EXTRACTP
endpgm


It should go without saying that some of the names have been changed and that the above program should be treated as a sample only.

Being able to move information between the QShell/IFS and traditional i5/OS environments is both useful and (in my experience) increasingly important. Although it does take a bit of thinking about, it isn’t difficult, which is why I find that the oft-seen solution of “buy this tool” is both disappointing and (often) overkill.


It is done

I mentioned last month that I was in the process of consolidating the mess of repositories that I have created over the past few years. This is now done and, hopefully, will be a bit more manageable going forward.

I now have three consolidated repositories: silliness, utilities and utils-on-power. The Configurate repository is already organised logically enough, and Squirt is large enough to justify its own repo.

I have also removed the static pages referring to these repos on this site. They were creating overhead without adding any value.

And I now know a lot more about Git than I knew this time last month.


Adding variables to ad-hoc SQL queries with REXX

It’s incredible how easily I can be distracted. All I needed was a quick and dirty way of capturing the input into an interface, and now I’m writing a blog post.

Everything starts with an issue and, in this case, the issue is with an interface not behaving as expected in production even though everything works in the test environment. My suspicion is that the incoming data is not correctly formatted, causing transactions to not meet the selection criteria. But to confirm this, I need to see exactly what is waiting to be processed when the interface runs.

Since this interface runs after the end of business, I want to be able to submit an SQL query to capture the input data into a separate table so that I can see what was processed when I come in tomorrow morning. And, because much of this data is not destined to pass through the interface in question (this is why we have selection criteria), I want to be able to select today’s transactions for whatever the current value of today happens to be.

In a sane world, this would be a simple case of using the current date, but in the real world there are still people using eight digit numbers to represent the date. This leads to some unwieldy date calculations in SQL, which got me wondering whether I could capture the calculation in a variable to make things a bit more readable. It turns out I can, but not on the version of the IBM i operating system that I am currently using.
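
For the record, the sort of expression I was trying to make more readable looks something like this (a sketch; dhdate is the eight digit date column used in the script below, and the arithmetic just rebuilds today's date as a yyyymmdd number):

select * from PRODLIB/INPUTDATA
where dhdate = year(current_date) * 10000
             + month(current_date) * 100
             + day(current_date)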

What really caught my eye at the above link, however, was this:

but if “ad hoc sql + variables” is the thing you need, you should really try Rexx.

And I thought: “Rexx. That’s something I haven’t used since last century.”

Any excuse to download a couple of manuals is reasonable as far as I’m concerned, so off I went.

And here’s the script (library and table names have been changed):

datenow = DATE('S')
statement = 'insert into MYLIB/INPUTDATA' ,
            'select * from PRODLIB/INPUTDATA where dhdate =' datenow
address 'EXECSQL'
EXECSQL statement

The DATE function returns the current date, and the 'S' option handily reformats it into an eight digit number (yyyymmdd). The address 'EXECSQL' statement directs the subsequent commands to the SQL command environment, and then it’s just a case of executing the statements I need to execute.

It all works very nicely indeed.

On a tangential note, I noticed that Rexx is also available in Arch. I may well install this sometime soon just to have a play around with the language.


IBM i Access Client Solutions… on Arch

This is a follow up to my earlier post about connecting to a cloudy AS/400 (yes, an actual AS/400 running V5R3M0).

As I mentioned at the time, I had run into problems installing iSeries Access because IBM had removed the RPMs from their site. I asked about this, and the package maintainer very helpfully provided me with a collection of links to the various versions. However, he also mentioned that the source files were removed from the IBM website because they want everyone to use IBM i Access Client Solutions. This is in the AUR as iacs, so I thought I’d try it first.

It works… beautifully.

And I do love a snappy application name.


AS/400… In the cloud

I never thought I would be able to use that as a post title but, as reported by The Register, German hosting company Rechenzentrum Kreuznach has popped an AS/400 into the cloud, and anyone can use it for free. I’m anyone, so I signed up.

Of course, there isn’t much point in having an account if I don’t have a terminal. Fortunately, Arch has everything.

I first tried tn5250 which proved to be a nice little package that can be started from the terminal. It certainly works and achieves exactly what it attempts. The only problems I encountered were that some of the key mappings were a bit odd (probably as a result of me using the wrong character map) and (more seriously) that running one terminal inside another can cause a little command key confusion.

It was at this point that I noticed that the AUR actually includes iSeries Access. Unfortunately, this is proving to be a bit of a struggle – the package maintainer appears to have assumed that I’ve already downloaded the RPM, which I have been unable to find. I’ve left a comment on the package asking about this and will come back to it if I am able to find the RPM somewhere… anywhere.

(Tangentially: How does IBM manage to continually build such awful websites? Every time I have to negotiate Big Blue’s labyrinthine online presence, I find myself faced with sites that are slow, clunky, painful to navigate and – all too often – completely inconsistent.)

So I turned to the TN5250 Java Edition. Installing and configuring this turned out to be a completely painless process, and I’m in.

To tell the truth, I’m not sure what – if anything – I will do with this. But it’s always fun to poke around an older bit of kit, if only to remind myself how far things have progressed over the past decade.


The title of this post was changed at the request of Source Data, who own the trademark for Cloud/400 and have asked me to avoid causing any confusion.


Keeping Qshell sessions alive

Not a lot of people realise this, but the IBM i has a POSIX-compliant shell environment, known as Qshell. It’s relatively basic (compared to both Bash and the native i command line) but it can be quite handy when I need (for example) to grep a source file.

One thing that has always annoyed me about Qshell, however, is that it doesn’t retain any history between sessions. Given that my workflow will involve starting at the i command line, performing a task in Qshell, and then returning to the command line, the lack of a history leads either to unnecessary typing or to copying and pasting commands into and out of a text editor.

Today I noticed that the F12 key can be used to disconnect a Qshell session without actually ending it. And when I next enter the QSH command, I find myself back in the same session with my history intact.

This isn’t going to help with finding commands I typed yesterday, but it will allow me to avoid unnecessary retyping within the same day.


Why use grep to search a source file rather than the more usual FNDSTRPDM command?

Incompetent contractors is the short answer. Incompetent contractors who introduced an unknown number of divide by zero errors is the slightly longer answer.

In RPG, the division operator is / and the comment symbol is //. I could use FNDSTRPDM to search for all the slashes and then manually scroll past all the comment lines. Or I could shortcut this process with the following piped grep:

grep -in '/' /qsys.lib/sourcelib.lib/qrpglesrc.file/program.mbr | grep -iv '//'

I’m lazy. I grep.


Using SQL to update one file from another

With a recent interface change, I was asked to go back and fix all of the affected historical data so that it matched the new requirements. Updating a field in one file with a value from a different file is something I have done several times in the past (far too many ad-hoc queries have been launched, if truth be told) and, while you do need to take a bit of care, the approach is pretty simple.

So here is an example (some names have been changed to protect the proprietary):

update target_file upd
set target_field = (select source_field from source_file
                    where source_key_1 = substr(upd.target_key, 1, 16)
                    and digits(source_key_2) = substr(upd.target_key, 17, 5) )
where exists (select source_field from source_file
              where source_key_1 = substr(upd.target_key, 1, 16)
              and digits(source_key_2) = substr(upd.target_key, 17, 5) )

The script is pretty simple. For each record in target_file for which an associated record can be found in source_file, populate target_field with the value in source_field. Obviously, the select clauses will need to reflect the relevant keys of whatever files you happen to be using.

Inevitably, there is a gotcha: for each record in target_file that you want to update, there must be exactly one record returned by the subquery. Handling this can be split into two parts.

The first part is handled by the where exists clause which ensures that the script will only attempt to update records in target_file if there is a record in source_file with which to update it. This ensures you don’t get caught out by subqueries that return zero records.

The second part involves ensuring that the subquery returns no more than one record for each record in target_file. This, unfortunately, cannot be solved generically – you just need to be a bit careful to ensure that the subquery selection is returning unique records. If in doubt, a variation on the SQL below can be used to validate.

select source_key_1, digits(source_key_2), count(*)
from source_file
group by source_key_1, digits(source_key_2)
having count(*) > 1

If you can’t find a unique selection criterion, the distinct clause may help and, if all else fails, try arbitrarily using either max() or min().
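
For example, wrapping the source field in max() forces the subquery down to a single row per target record (a sketch using the same placeholder names as above):

update target_file upd
set target_field = (select max(source_field) from source_file
                    where source_key_1 = substr(upd.target_key, 1, 16)
                    and digits(source_key_2) = substr(upd.target_key, 17, 5) )
where exists (select source_field from source_file
              where source_key_1 = substr(upd.target_key, 1, 16)
              and digits(source_key_2) = substr(upd.target_key, 17, 5) )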


Executing SQL statements from within a CL with RUNSQL

Here’s a CL command I didn’t know about: RUNSQL, which allows an SQL statement to be run from within a CL program without needing a source file.

Inevitably, I found this because I was looking for a quick solution to a transitory problem. I won’t bore you with the details but, what I wanted to do was create a temporary file, populate it with orders for a (user specified) date and then copy this file into a library that has been made available to an external ftp application.

Here is what I ended up with (field, file and library names have been changed for clarity and to protect the innocent):

  pgm parm(&date)
  dcl &date *char 8
  dcl &statement *char 150
  dcl &nbrrcd *dec (10 0)
  cpyf fromfile(template/orders) tofile(qtemp/orders) +
       mbropt(*add) crtfile(*yes)
  clrpfm qtemp/orders
  chgvar &statement value('insert into qtemp/orders +
                           select * from archlib/orders  +
                           where msgdat = ' *cat &date)
  runsql sql(&statement) commit(*none)
  rtvmbrd qtemp/orders nbrcurrcd(&nbrrcd)
  if cond(&nbrrcd *gt 0) then(do)
     cpyf fromfile(qtemp/orders) tofile(ftplib/orders) +
          mbropt(*add) crtfile(*yes)
  enddo
  dltf qtemp/orders
  endpgm

It’s all pretty simple stuff, but being able to embed the SQL statement directly in the CL makes a conceptually simple solution very easy to implement.
