MVS 390, Char columns with complex data types

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.

Moderators: chulett, rschirm, roy

Post Reply
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

MVS 390, Char columns with complex data types

Post by roy »

Hi All,
:arrow: This is an MVS/390-related topic :!:
I'm facing a case where there are DB2 table fields of type CHAR(x) that hold an array of records (via OCCURS) or a record with REDEFINES.
The REDEFINES are less of an issue, since only one record type is relevant.
The problem is that those CHAR columns are actually holding a sub-record composed of both character and numeric data (some hold local NLS characters), and I need to decompose them using DS.

:arrow: Any help would be appreciated

P.S.
At my disposal are both 390 and server jobs.

Thanks in advance for your time :)
Roy R.
Time is money but when you don't have money time is all you can afford.

Search before posting:)

Join the DataStagers team effort at:
http://www.worldcommunitygrid.org
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Have you looked into using the Complex Flat File stage in server jobs? It was created expressly for this type of data.
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

My problem is that my source is a DB2 table :!:
And correct me if I'm wrong, but DS can only read CFF files; it can't create them :cry: Add the fact that I was told they won't build me a "normal" file representing a sane version of the data that I can read, and I'm stuck with three people to help me, named MMI (Me, Myself & I; sorry Craig :roll:, forgive the acronym, explained on first use, will you :?:)
Roy R.
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Have you tried staging the table in a flat file and then using the CFF stage to read the file?
We have a similar situation in my current shop, where the arrays are just jammed into a single column. When the time comes to process them, I drop it into a staging file, create a CFD, and let the CFF stage do the rest.
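For illustration, a minimal CFD (COBOL copybook) describing such a staged column might look like this. The record, field names, lengths, and OCCURS count here are hypothetical, not taken from the actual table:

```cobol
       01  STAGED-REC.
           05  CUST-KEY              PIC X(08).
           05  DETAIL-ITEM           OCCURS 3 TIMES.
               10  ITEM-CODE         PIC X(04).
               10  ITEM-AMT          PIC S9(7)V99 COMP-3.
```

Given a layout like this, the CFF stage can split the single staged column back into typed fields, including the packed-decimal amounts.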
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Why not use a Relational Database stage in a mainframe job?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

And since you are in Server, you can read the DB2 table as suggested above, then use a named-pipe sequential file as the CFF source, and thus you don't even need to land your data on disk.
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

The string contains COMP-3 elements that do not read well as a string via DB2 Connect.
There is no logic for determining what kind of record we're dealing with, perhaps because it seems only one kind of record is currently used.
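The COMP-3 issue is the crux: packed decimal stores two BCD digits per byte with a sign nibble in the last byte, so it gets mangled when fetched as character data. As a rough illustration of what the unpacking involves (plain Python standing in for what the CFF stage does internally; the sample bytes and scales are invented):

```python
from decimal import Decimal

def unpack_comp3(raw: bytes, scale: int = 0) -> Decimal:
    """Unpack an IBM packed-decimal (COMP-3) field.

    Each byte holds two BCD digits; the low nibble of the last
    byte is the sign (0xD = negative, 0xC/0xF = positive)."""
    digits = []
    for b in raw[:-1]:
        digits.append(b >> 4)
        digits.append(b & 0x0F)
    digits.append(raw[-1] >> 4)          # high nibble of last byte is a digit
    sign = -1 if (raw[-1] & 0x0F) == 0x0D else 1
    value = 0
    for d in digits:
        value = value * 10 + d
    return Decimal(sign * value).scaleb(-scale)

# 0x12 0x3C -> digits 1,2,3 with positive sign -> 123
unpack_comp3(b"\x12\x3C")            # Decimal('123')
unpack_comp3(b"\x01\x23\x4D", 2)     # Decimal('-12.34')
```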

I eventually used a SUBSTR function in the SQL to get the fields I need.
Then I used a stage variable with an Is Numeric check to validate my numeric data; otherwise I put zeros, and in the derivation an implicit cast to the decimal data type.
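The SUBSTR-plus-Is-Numeric pattern described above can be sketched as follows (plain Python standing in for the SQL and stage-variable logic; the column layout, offsets, and scale are hypothetical):

```python
from decimal import Decimal

def to_decimal_or_zero(field: str, scale: int = 0) -> Decimal:
    """If the extracted substring is all digits, cast it to a decimal
    with the implied scale; otherwise substitute zeros."""
    s = field.strip()
    if s.isdigit():
        return Decimal(s).scaleb(-scale)
    return Decimal(0)

row = "AB12345XY"                       # hypothetical CHAR(9) column
to_decimal_or_zero(row[2:7], scale=2)   # like SUBSTR(col, 3, 5) -> 123.45
to_decimal_or_zero("12X45")             # fails the numeric check -> 0
```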

As for the OCCURS, a REXX program was built to decompose the data into separate records (we could have done it in DS just as well, but it wasn't my call).
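What the REXX step does, flattening the OCCURS array into one row per populated item, could equally be sketched like this (illustrative Python; the key and item lengths are invented, not the real layout):

```python
def explode_occurs(record: str, key_len: int, item_len: int, occurs: int) -> list[str]:
    """Emit one output record per populated OCCURS item,
    repeating the key portion and dropping blank items."""
    key = record[:key_len]
    rows = []
    for i in range(occurs):
        start = key_len + i * item_len
        item = record[start:start + item_len]
        if item.strip():                 # skip unpopulated occurrences
            rows.append(key + item)
    return rows

explode_occurs("K1" + "AAA" + "BBB" + "   ", 2, 3, 3)   # ['K1AAA', 'K1BBB']
```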
Roy R.
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

:arrow: :idea: So I think I failed to make clear what solution we used :roll:
To complete the picture from my previous post ...

We used mainframe jobs to bring the data into Teradata work tables.
That included handling all the OCCURS/REDEFINES and other such interesting variants.
Once the work tables were populated in a normal relational format, we continued the business logic in server jobs.

IHTH,
Roy R.
Post Reply