Problems concerning implicit transformation of a string

Posted: Mon Aug 21, 2006 8:26 am
by Umbix62
Hi everybody

the problem I have today is about a transformation. We are upgrading DataStage Server from release 7.1 to 7.5.2 and are testing all of our jobs. Two of them have the same problem.

They extract data from an Oracle8i table and write it to a flat file.

ORACLE-TABLE ------> SEQUENTIAL FILE.


The file "Flag_xy" is declared as a char(1) not null in the Oracle Stage and the same in the sequential stage rappresenting the target flat file. We don't use other stage between them.

Can someone explain to me why, in the terminal hashed file, this field holds the value "Y.." (59 00 00 in hex format) for all records? Is it a bug in the new release? For every field the job uses to carry the value from the source Oracle table to the target hashed file, the null flag is set to false.

Note that in the previous release the job worked correctly, and we have not changed it during the migration.

Thank you very much in advance

Umberto

Posted: Mon Aug 21, 2006 9:01 am
by ArndW
What is the column defined as in Oracle (not in the Oracle stage, but in a DESCRIBE of the table in the database itself)? It looks like a CHAR(3) definition that is being padded with CHAR(000); in that case DS will use the actual data and not the metadata defined for the column.
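
One quick way to check whether the padding is really in the data rather than in the metadata is to hex-dump the first record of the output file. A minimal sketch in Python, assuming a hypothetical output path of /tmp/flagfile.out and newline-terminated records:

    # Hex-dump the first record of the flat file to see whether the
    # Flag_xy field really carries trailing CHAR(000) bytes.
    with open("/tmp/flagfile.out", "rb") as f:   # hypothetical path
        record = f.readline()

    # Print each byte as two hex digits; a "Y" padded with two NULs
    # shows up as 59 00 00.
    print(" ".join(f"{b:02x}" for b in record))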

Re: Problems concerning implicit transformation of a string

Posted: Mon Aug 21, 2006 9:02 am
by chulett
Umbix62 wrote:Can someone explain to me why, in the terminal hashed file, this field holds the value "Y.." (59 00 00 in hex format) for all records?
What 'terminal hashed file'? :?

Posted: Mon Aug 21, 2006 9:12 am
by Umbix62
ArndW wrote:What is the column defined as in Oracle (not in the Oracle stage, but in a DESCRIBE of the table in the database itself)? It looks like a CHAR(3) definition that is being padded with CHAR(000); in that case DS will ...
No, it is defined as a CHAR(1) NOT NULL. I suspected something similar and verified it immediately, but the type of the field in the Oracle table itself is correct.

Is it a little bug in the new DataStage release? If yes, does a fix exist?

Thank you

Umberto

Posted: Mon Aug 21, 2006 9:36 am
by ray.wurlod
You said it was a sequential file, then you said it was a hashed file. Do you have a pad character set in the Columns metadata? If not, experiment with the pad character, or run the data through a Transformer to extract the leftmost character.
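
In a server Transformer the leftmost-character derivation would be something like InLink.Flag_xy[1,1] (the link name is hypothetical). For illustration only, the same cleanup can be sketched in Python, with the padded input value assumed from the hex dump above:

    # Simulate the two suggested fixes on a value padded the way the
    # thread describes: "Y" followed by two CHAR(000) bytes (assumed).
    padded = "Y\x00\x00"

    leftmost = padded[0]              # equivalent of a [1,1] substring derivation
    trimmed = padded.rstrip("\x00")   # equivalent of trimming the pad character

    print(repr(leftmost), repr(trimmed))  # both print 'Y'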

Re: Problems concerning implicit transformation of a string

Posted: Mon Aug 21, 2006 9:36 am
by Umbix62
chulett wrote:
Umbix62 wrote:Can someone explain to me why, in the terminal hashed file, this field holds the value "Y.." (59 00 00 in hex format) for all records?
What 'terminal hashed file'? :?
I'm sorry for that. I started with an example and then talked about another job. There is no hashed file; don't worry about that.

Posted: Mon Aug 21, 2006 9:37 am
by ArndW
Add a Transformer stage with a column whose derivation is "LEN(In.OracleColumn)"; it should be 1 and not 3.
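
The idea behind the LEN() check is to see whether the padding already exists on the link or is added later in the job. A rough stand-in for the same diagnostic, in Python over the output file (the path and the single-field record layout are assumptions):

    # Report the length of the flag field for each record of the output
    # file. Lengths of 3 rather than 1 mean the NUL padding is in the data.
    with open("/tmp/flagfile.out", "rb") as f:   # hypothetical path
        for lineno, record in enumerate(f, start=1):
            field = record.rstrip(b"\r\n")   # drop the record terminator only
            print(lineno, len(field))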

Posted: Tue Aug 22, 2006 3:07 am
by Umbix62
ArndW wrote:Add a Transformer stage with a column whose derivation is "LEN(In.OracleColumn)"; it should be 1 and not 3 ...
OK, I'll try it and report back.

Posted: Tue Aug 22, 2006 5:12 am
by Umbix62
ArndW wrote:Add a Transformer stage with a column whose derivation is "LEN(In.OracleColumn)"; it should be 1 and not 3 ...
OK, I tried to follow your suggestion. The result is one: the string is one byte long. This is a simple representation of the job:

Oracle Table -----> OCI -----------> Transformer ------------> Hashed file

Char(1) ----------> Char(1) -------> No transformation -----> Char(1)
Char(1) ----------> Varchar(255) -> No transformation -----> Char(1)

In the first case, if the input value in the source Oracle table is "Y", DataStage puts the value "Y.." (59 00 00 in hexadecimal format) into the target hashed file. Remember that in the previous DataStage release the same job, reading data from the same Oracle instance, worked correctly.

In the second case the result is correct: if the input value in the source Oracle table is "Y", DataStage puts the value "Y" (59 in hexadecimal format) into the target hashed file.

At this point I think it is a bug in the new release. What do you think about that?

Umberto
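
A toy model of the two paths described above; it does not reproduce the OCI stage internals, and the padding step is only an assumption about what release 7.5.2 appears to be doing:

    # Model of the observed behaviour: the Char(1) path arrives padded to
    # three bytes, while the Varchar(255) path arrives trimmed. The 3-byte
    # width is taken from the reported hex dump (59 00 00).
    source_value = "Y"

    as_char = source_value.ljust(3, "\x00")   # assumed padding in the Char(1) path
    as_varchar = as_char.rstrip("\x00")       # the Varchar path trims trailing NULs

    print(as_char.encode().hex(" "))     # 59 00 00  (the bad case)
    print(as_varchar.encode().hex(" "))  # 59        (the good case)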

Posted: Wed Aug 23, 2006 3:44 am
by ArndW
Very strange behaviour. If you output In.Column[1,1], does it still add the two CHAR(000) bytes?

Posted: Wed Aug 23, 2006 7:46 am
by Umbix62
ArndW wrote:Very strange behaviour. If you output In.Column[1,1], does it still add the two CHAR(000) bytes? ...
No, in that case the job puts the correct value in the output file.

Posted: Wed Aug 23, 2006 8:28 am
by ArndW
OK. LEN('x':CHAR(000):CHAR(000)) is 3, but you are getting 1. Either DataStage has a bug in your case and is adding two characters to the record, or you might be looking at the wrong data. Could you check to make 100% certain? Is this a server job? (If it is a PX job, a possible cause is the row output formatting definitions, but that doesn't apply to server jobs.)
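
The arithmetic being relied on here, checked in Python for clarity (string escapes stand in for the BASIC CHAR(000) concatenation):

    # LEN('x':CHAR(000):CHAR(000)) in DataStage BASIC corresponds to:
    assert len("x\x00\x00") == 3   # a padded value measures 3
    assert len("x") == 1           # a clean value measures 1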

(If it makes you feel better, I've made similar mistakes several times and was quite adamant about being right until I was proven wrong and had to slink away.)