Reading comp-3 field from AS400 with Ascential driver
Hi All,
I'm reading from a table in AS400 that contains COMP-3 fields and I'm using the Ascential DB2 Wire Protocol driver.
Is there any way to translate COMP-3 data without the Complex Flat File stage?
Thanks,
Gil
Could the value be written as a normal representation? If you are certain that the field contains COMP-3, then you must ensure that it is read as a binary field so that no EBCDIC or other conversions are performed, then use the SDK routine to convert.
Can you read the char, write it to a sequential file and get the HEX values of the first bytes to make certain that the format is what you expect?
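To make that hex check concrete, here is a minimal sketch (Python, with a made-up sample value) of dumping the first bytes of a record so you can see whether they look like packed decimal:

```python
# Hex-dump the first bytes of a record to check for packed decimal.
# A COMP-3 field holds two digits per byte, with the sign (C, D or F)
# in the low nibble of the last byte, e.g. +123.45 -> 12 34 5C.

def hex_dump(data: bytes, width: int = 16) -> str:
    """Return a simple offset + hex dump of the given bytes."""
    lines = []
    for offset in range(0, len(data), width):
        chunk = data[offset:offset + width]
        hex_part = " ".join(f"{b:02X}" for b in chunk)
        lines.append(f"{offset:04X}  {hex_part}")
    return "\n".join(lines)

# +123.45 as S999V99 COMP-3 occupies 3 bytes: 12 34 5C
print(hex_dump(b"\x12\x34\x5C"))
```

If the dump shows byte values where every nibble is a digit 0-9 except a trailing C, D or F, you are almost certainly looking at COMP-3.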
It would be helpful if you could be more specific in exactly what you are dealing with and trying to do with it. It might prevent people from floundering around too much while trying to help.
So... what are you saying, exactly? You have a 503 byte character field in your DB2 database. That field holds an entire COBOL generated record. The record contains COMP-3 fields that you are trying to get access to. Lastly, the record was 'compressed' somehow before it was written to this field. Does that about sum it up?
Seems to me the first task would be to 'uncompress' it after you retrieve it, which would imply that you'd need to know how it was compressed. Do you?
Assuming we're on track so far, then after this record is uncompressed you could use various methods to unpack the COMP-3 fields in the record, including via the CFF stage or the transforms in the SDK.
If your situation is different than this, please clarify your situation - in as much detail as you can muster.
-craig
"You can never have too many knives" -- Logan Nine Fingers
I'm sorry that I didn't answer fast and I'm sorry that I'm not clear.
The problem is that all of this was done by outsourcing a long time ago and I'm trying really hard to understand what is going on here.
so...
chulett, this is basically everything you wrote.
The COBOL program writes COMP-3 fields but defines some kind of external layout (which I can't change) that has only one field: Char(503).
Now, from what I understand, I need to split this field into the packed fields (i.e. for S999V99 I should take 3 characters) and then perform the transform that unpacks COMP-3.
Am I right?
And another problem: I see only transforms from Pic9 to Comp-3 and not the opposite. Are you sure there is Comp-3 to Pic9?
Thanks again
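For reference, the "take 3 characters" arithmetic above is right: a COMP-3 field stores two digits per byte plus a sign nibble, so a field with n digits occupies n//2 + 1 bytes, and S999V99 (five digits) takes 3 bytes. A small sketch of the calculation:

```python
# Byte length of a COMP-3 (packed decimal) field:
# two digits per byte, plus one nibble for the sign,
# rounded up to a whole byte.

def comp3_length(total_digits: int) -> int:
    """Storage bytes for a packed-decimal field with this many digits."""
    return total_digits // 2 + 1

# PIC S999V99 has 5 digits (3 before + 2 after the implied decimal point)
print(comp3_length(5))   # 3 bytes
print(comp3_length(7))   # 4 bytes
```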
Don't be sorry about not answering fast, we'll get you whenever you come back. Now, you are allowed to be sorry about not being clear.
You didn't answer about the 'compressed' part. Is the compression you are talking about just the COMP-3 fields? Or was there some other form of compression done on the entire record before it went to DB2?
Do you have the COBOL FD that created this? At the very least, you'll need that to know how many and how long each field is so you can "split this field to the compressed fields" and then unpack them. Your example is correct if you want to do this one field at a time, or you could use the CFF stage to do them "all at once".
You'd also need to know, as Arnd asked earlier, if it is EBCDIC or ASCII for any of the character fields that may or may not be in there as well. If the record consists of nothing other than packed fields, then this point is moot. Well, unless someone 'translated' the packed fields from EBCDIC to ASCII in the process, in which case they are garbage.
-craig
"You can never have too many knives" -- Logan Nine Fingers
Hi,
Sorry again
Yes, this is COMP-3 compression and it's in EBCDIC, but as I said before there is no routine (as far as I know) that unpacks the COMP-3.
I managed to get the COBOL FD. Most of the fields are PIC S999V99 COMP-3, and those fields I can't read. There are 3 PIC 9 fields which I managed to read.
Ok, so there is no other 'magic' compression other than what is native to the COMP-3 fields? Basically, all you've got is an EBCDIC record in a database field. Fair enough.
First of all, there are routines to unpack fun stuff like this and it seems like you've already found them. In a previous post you mention 'transforms from Pic9 to Comp-3 and not the opposite' when in fact what you found is the opposite. The SDK routine DataTypePicComp3 says right in its description "Convert COBOL COMP-3 signed packed decimal into an integer". Give that one a whirl.
Or... take your FD and use it in conjunction with the CFF stage. It will do the 'splitting' and 'unpacking' for you automatically. That's exactly why it is there, after all.
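What that SDK routine does under the hood can be sketched in a few lines. This Python version is an illustration of the packed-decimal algorithm, not the SDK code: unpack the nibbles, apply the sign from the last nibble, then scale by the digits after the implied V:

```python
# Unpack a COMP-3 (packed decimal) field: each byte holds two BCD
# digits, and the low nibble of the last byte is the sign
# (0xD = negative, 0xC or 0xF = positive/unsigned).

def unpack_comp3(data: bytes, scale: int = 0) -> float:
    """Convert packed-decimal bytes to a number; scale = digits after V."""
    nibbles = []
    for b in data:
        nibbles.append((b >> 4) & 0x0F)
        nibbles.append(b & 0x0F)
    sign_nibble = nibbles.pop()          # last nibble carries the sign
    value = 0
    for digit in nibbles:
        value = value * 10 + digit
    if sign_nibble == 0x0D:
        value = -value
    return value / (10 ** scale)

# An S999V99 value of +123.45 packs to the 3 bytes 12 34 5C
print(unpack_comp3(b"\x12\x34\x5C", scale=2))   # 123.45
```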
-craig
"You can never have too many knives" -- Logan Nine Fingers
Hi,
I can't use the CFF stage because the customer wants direct access and not through FTP or XCOM or something like that. I guess there isn't a way to do it through the CFF stage.
So, back to the unpacking, which NLS should I use? And should I change the field (the Char(503) field) to binary?
Because I tried to unpack with the routine and it didn't work.
Direct access?
Not sure what exactly that means.
Can you not land this information and then process it? Write a job that extracts it from the database and writes it to a flat file, then process that flat file via the CFF stage. Or is that not direct enough access?
There's no "NLS" involved in unpacking. Packed decimal is packed decimal. You would need a much better explanation of what "didn't work" meant to get any useful help at this stage.
ps. I *loved* XCOM. Nothing quite like saving the Earth from alien invasion. Sorry, wrong XCOM... never mind.
-craig
"You can never have too many knives" -- Logan Nine Fingers
Hi,
"Didn't work" means that I get invalid data like '????'.
About the CFF stage, it's a good idea, but the ODBC stage throws warnings about the invalid data. This is the warning:

ds_mapnls() - NLS input mapping error, row 16, column KLLI
data '????'
external 011A1A02903F

Do you have a way to ignore these warnings?
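That ds_mapnls() warning is the earlier EBCDIC-translation point showing up in practice: the driver's NLS map treats the packed bytes as EBCDIC text, and packed bytes are not valid text, so the mapping fails or mangles them. A small sketch (using Python's cp037 codec as an example EBCDIC code page; the sample value is made up) shows how a round trip through a character map corrupts packed data:

```python
# Packed-decimal bytes are not valid EBCDIC text, so pushing them
# through a character-set (NLS) map corrupts them. cp037 is used
# here as an example EBCDIC code page.

packed = b"\x12\x34\x5C"                  # +123.45 as COMP-3

as_text = packed.decode("cp037")          # driver "translates" the bytes
round_trip = as_text.encode("utf-8")      # ...and re-encodes them

print(packed.hex().upper())               # 12345C
print(round_trip.hex().upper())           # no longer the original packed bytes
```

That is why the advice above is to get the column to you as raw binary, with no character-set mapping applied, before any unpacking is attempted.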