Reading comp-3 field from AS400 with Ascential driver

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

gillu
Participant
Posts: 24
Joined: Sun Jan 29, 2006 3:40 am

Reading comp-3 field from AS400 with Ascential driver

Post by gillu »

Hi All,
I'm reading from a table on an AS400 that contains COMP-3 fields, and I'm using the Ascential DB2 Wire Protocol driver.
Is there any way to translate the COMP-3 data without the Complex Flat File stage?

Thanks,
Gil
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Hello Gil,

DB/2 does not have a COMP-3 data type. What is the defined data type of the column you are reading?
gillu
Participant
Posts: 24
Joined: Sun Jan 29, 2006 3:40 am

Post by gillu »

The field is Char(503), but the COBOL program writes COMP-3 data into it.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

That's a heck of a large number!!! How much of the field is actually used? Have you checked out the data type conversion routines in the SDK?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
gillu
Participant
Posts: 24
Joined: Sun Jan 29, 2006 3:40 am

Post by gillu »

It's such a large number because whoever built this table had the COBOL program write the whole record, compressed, into one field.
I tried the DataTypePicComp3 but it doesn't seem to work.

(I appreciate the effort)
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Could the value be written as a normal representation? If you are certain that the field contains COMP-3, then you must ensure that it is read as a binary field (so that no EBCDIC or other conversions are performed) and then use the SDK routine to convert it.
Can you read the char, write it to a sequential file and get the HEX values of the first bytes to make certain that the format is what you expect?
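The check Arnd suggests can be sketched in a few lines of Python (shown here as an illustration, not as how DataStage itself would do it): dump the first bytes of the raw field as hex and see whether they look like packed decimal, i.e. digit nibbles 0-9 with a sign nibble at the end of each field.

```python
def hex_dump(raw: bytes, n: int = 16) -> str:
    """Return the first n bytes of a raw record as an uppercase hex string."""
    return raw[:n].hex().upper()

# A record starting with a PIC S999V99 COMP-3 value of +123.45 would
# begin with the bytes X'12345C' (sign nibble C = positive).
sample = bytes.fromhex("12345C") + b"\x00" * 10
print(hex_dump(sample, 3))  # → "12345C"
```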
Sreenivasulu
Premium Member
Posts: 892
Joined: Thu Oct 16, 2003 5:18 am

Post by Sreenivasulu »

Hi,

It may help to read the large field as characters first and then manipulate it.

Regards
Sreeni
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

It would be helpful if you could be more specific in exactly what you are dealing with and trying to do with it. It might prevent people from floundering around too much while trying to help.

So... what are you saying, exactly? You have a 503 byte character field in your DB2 database. That field holds an entire COBOL generated record. The record contains COMP-3 fields that you are trying to get access to. Lastly, the record was 'compressed' somehow before it was written to this field. Does that about sum it up?

Seems to me the first task would be to 'uncompress' it after you retrieve it, which would imply that you'd need to know how it was compressed. Do you? :?

Assuming we're on track so far, then after this record is uncompressed you could use various methods to unpack the COMP-3 fields in the record, including via the use of the CFF stage or the transforms in the sdk.
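To make the "unpacking" step concrete, here is a minimal Python sketch of what a COMP-3 conversion does (this is an illustration of the format, not the SDK routine itself): each byte holds two decimal digit nibbles, and the low nibble of the last byte is the sign (C or F = positive, D = negative).

```python
def unpack_comp3(raw: bytes, scale: int = 0) -> float:
    """Decode COBOL COMP-3 (packed decimal) bytes into a number.

    scale is the number of implied decimal places (the V99 part of the PIC).
    """
    nibbles = []
    for b in raw:
        nibbles.append(b >> 4)
        nibbles.append(b & 0x0F)
    sign_nibble = nibbles.pop()  # the last nibble is the sign
    if any(d > 9 for d in nibbles):
        raise ValueError("not valid packed decimal")
    value = int("".join(str(d) for d in nibbles))
    if sign_nibble == 0xD:       # D = negative; C and F = positive
        value = -value
    return value / (10 ** scale)

# PIC S999V99 COMP-3: 5 digits + sign = 6 nibbles = 3 bytes, scale 2.
print(unpack_comp3(bytes.fromhex("12345C"), scale=2))  # → 123.45
```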

If your situation is different than this, please clarify your situation - in as much detail as you can muster.
-craig

"You can never have too many knives" -- Logan Nine Fingers
gillu
Participant
Posts: 24
Joined: Sun Jan 29, 2006 3:40 am

Post by gillu »

I'm sorry that I didn't answer quickly, and I'm sorry that I'm not being clear.
The problem is that all of this was built by an outsourcing team a long time ago, and I'm trying really hard to understand what is going on here.

so...
chulett, this is basically everything you wrote.
The COBOL program writes COMP-3 fields but defines some kind of external layout (which I can't change) that has only one field: Char(503).
Now, from what I understand, I need to split this field into the packed fields (e.g. for S999V99 I should take 3 bytes) and then perform the transform that unpacks COMP-3.
Am I right?

And another problem: I only see transforms from Pic9 to COMP-3 and not the opposite. Are you sure there is a COMP-3 to Pic9 transform?

Thanks again
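The 3-byte arithmetic in the post above can be checked with a small sketch (an illustrative helper, not anything from DataStage): a signed COMP-3 field with n digits takes n digit nibbles plus one sign nibble, rounded up to whole bytes.

```python
def comp3_bytes(digits: int) -> int:
    """Storage size in bytes of PIC S9(n) COMP-3: n digit nibbles plus
    one sign nibble, packed two nibbles per byte, i.e. n // 2 + 1."""
    return digits // 2 + 1

print(comp3_bytes(5))  # PIC S999V99 has 5 digits → 3 bytes
```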
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Don't be sorry about not answering fast, we'll get you whenever you come back. Now, you are allowed to be sorry about not being clear. :wink:

You didn't answer about the 'compressed' part. Is the compression you are talking about just the COMP-3 fields? Or was there some other form of compression done on the entire record before it went to DB2? :?

Do you have the COBOL FD that created this? At the very least, you'll need that to know how many and how long each field is so you can "split this field to the compressed fields" and then unpack them. Your example is correct if you want to do this one field at a time, or you could use the CFF stage to do them "all at once".

You'd also need to know, as Arnd asked earlier, if it is EBCDIC or ASCII for any of the character fields that may or may not be in there as well. If the record consists of nothing other than packed fields, then this point is moot. Well, unless someone 'translated' the packed fields from EBCDIC to ASCII in the process, in which case they are garbage. :shock:
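The garbling Craig warns about can be demonstrated in Python, using the cp037 codec as a stand-in EBCDIC code page (the actual code page on the AS400 may differ): running packed bytes through a character-set translation rewrites them, and the value is lost.

```python
packed = bytes.fromhex("12345C")  # +123.45 as PIC S999V99 COMP-3

# A would-be EBCDIC -> ASCII "translation" applied to packed bytes:
# in cp037 the sign byte X'5C' is the character '*', which is X'2A'
# in ASCII, so the translated bytes no longer decode as packed decimal.
translated = packed.decode("cp037").encode("latin-1")

print(packed.hex().upper(), "->", translated.hex().upper())
```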
-craig

"You can never have too many knives" -- Logan Nine Fingers
gillu
Participant
Posts: 24
Joined: Sun Jan 29, 2006 3:40 am

Post by gillu »

Hi,
Sorry again :oops:
Yes, this is COMP-3 compression and it's in EBCDIC, but as I said before, there is no routine (as far as I know) that unpacks the COMP-3.
I managed to get the COBOL FD. Most of the fields are PIC S999V99 COMP-3, and those are the fields I can't read. There are 3 PIC 9 fields which I managed to read.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Ok, so there is no other 'magic' compression other than what is native to the COMP-3 fields? Basically, all you've got is an EBCDIC record in a database field. Fair enough.

First of all, there are routines to unpack fun stuff like this, and it seems like you've already found them. In a previous post you mention 'transforms from Pic9 to Comp-3 and not the opposite' when in fact what you found is the opposite. The sdk routine DataTypePicComp3 says right in its description "Convert COBOL COMP-3 signed packed decimal into an integer". Give that one a whirl. :wink:

Or... take your FD and use it in conjunction with the CFF stage. It will do the 'splitting' and 'unpacking' for you automatically. That's exactly why it is there, after all.
-craig

"You can never have too many knives" -- Logan Nine Fingers
gillu
Participant
Posts: 24
Joined: Sun Jan 29, 2006 3:40 am

Post by gillu »

Hi,
I can't use the CFF stage because the customer wants direct access, not access through FTP or XCOM or something like that. I guess there isn't a way to do it through the CFF stage.
So, back to the unpacking: which NLS map should I use? And should I change the field (the Char(503) field) to binary?
Because I tried to unpack with the routine and it didn't work.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Direct access? :? Not sure what exactly that means.

Can you not land this information and then process it? Write a job that extracts it from the database and writes it to a flat file, then process that flat file via the CFF stage. Or is that not direct enough access?

There's no "NLS" involved in unpacking. Packed decimal is packed decimal. You would need mcuh better explaination of what "didn't work" meant to get any useful help at this stage.

ps. I *loved* XCOM. Nothing quite like saving the Earth from alien invasion. Sorry, wrong XCOM... never mind.
-craig

"You can never have too many knives" -- Logan Nine Fingers
gillu
Participant
Posts: 24
Joined: Sun Jan 29, 2006 3:40 am

Post by gillu »

Hi,
"Didn't work" means that I get invalid data like this:
????
About the CFF stage, it's a good idea, but the ODBC stage throws warnings about the invalid data. This is the warning:
ds_mapnls() - NLS input mapping error, row 16, column KLLI
data '????'
external 011A1A02903F
Do you have a way to ignore these warnings?
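A quick check (a hypothetical helper, not a DataStage routine) shows why the driver complains: the bytes in that warning cannot be valid packed decimal, because every digit nibble must be 0-9, yet X'011A1A02903F' contains X'A' digit nibbles. That would be consistent with a character-set mapping having already mangled the bytes, as warned earlier in the thread.

```python
def looks_like_comp3(raw: bytes) -> bool:
    """True if raw could be packed decimal: digit nibbles 0-9
    followed by a sign nibble in the A-F range."""
    nibbles = [n for b in raw for n in (b >> 4, b & 0x0F)]
    sign, digits = nibbles[-1], nibbles[:-1]
    return sign >= 0xA and all(d <= 9 for d in digits)

print(looks_like_comp3(bytes.fromhex("011A1A02903F")))  # → False
print(looks_like_comp3(bytes.fromhex("12345C")))        # → True
```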