Search found 17 matches

by radarada
Thu Jan 14, 2010 9:42 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: DS_JOBOBJECTS
Replies: 3
Views: 2407

DS_JOBOBJECTS

Anyone know the structure of the DS_JOBOBJECTS HASH file?
by radarada
Tue Jan 05, 2010 12:03 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DS_JOBS
Replies: 5
Views: 2361

I checked the JOB_NAME off as the key and it worked...odd
by radarada
Tue Jan 05, 2010 11:43 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DS_JOBS
Replies: 5
Views: 2361

DS_JOBS

I am trying to use DS_JOBS to obtain the path listing and job name. I have a Server job that works but gives peculiar results. When I use View Data in the Hash File stage to retrieve JOB_NAME from DS_JOBS it shows me the correct name... however, when I run the job to load the info into a table or file...
by radarada
Tue May 26, 2009 2:14 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: APT_TeraSync: could not create operator sync table
Replies: 1
Views: 1956

APT_TeraSync: could not create operator sync table

Ok, I have read through all postings on this subject and none of them appear to have my answer. Quick summary: we can pull the data from a Server job with no issues. We cannot pull using a parallel job. We have checked with the Teradata DBA and our user ID has SELECT access to the table we are accessing a...
by radarada
Wed Mar 25, 2009 12:13 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: job validation not using message handler
Replies: 1
Views: 1094

job validation not using message handler

When validating a job, is it not possible to use the message handler? Currently I can only get my messages demoted when I run the job; I cannot get them demoted during validation.
by radarada
Thu Feb 05, 2009 12:49 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Conversion error converting input field
Replies: 4
Views: 2745

Length 38, scale 10 is the same as 38,10.
by radarada
Thu Feb 05, 2009 12:36 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Conversion error converting input field
Replies: 4
Views: 2745

Conversion error converting input field

I have a VALUE field of 38,10 and I am using trim(DecimalToString(mv_CRAFTOUT.VALUE, "suppress_zero")), but I am still getting this error: xfmLoadDS,1: Conversion error converting input field VALUE to output field VALUE, data may have been lost [transform/tfmop_functions.C:131] I have in...
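Not DataStage code, but the effect of the "suppress_zero" option can be sketched in Python to show what the trimmed output should look like; the function name and behavior here are an illustration, not the actual DecimalToString implementation:

```python
from decimal import Decimal

def decimal_to_string(value: Decimal, suppress_zero: bool = True) -> str:
    """Rough Python analogue of DecimalToString with "suppress_zero":
    render the decimal, then drop trailing fractional zeros."""
    s = format(value, "f")  # plain notation, no exponent
    if suppress_zero and "." in s:
        s = s.rstrip("0").rstrip(".")
    return s

print(decimal_to_string(Decimal("123.4500000000")))  # 123.45
print(decimal_to_string(Decimal("0.0000000000")))    # 0
```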
by radarada
Sun Nov 30, 2008 2:14 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Cannot read large number value
Replies: 7
Views: 2804

Not to dwell on this issue, but an Oracle NUMBER with no precision or scale defined can still allow decimals up to 10 places. By default a NUMBER is effectively NUMBER(38,10) in Oracle. Defining it as 30,0 as suggested would not allow any decimals (if a number had them). I should be able to load any number i...
by radarada
Sun Nov 30, 2008 11:37 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Cannot read large number value
Replies: 7
Views: 2804

An Oracle NUMBER corresponds to a DECIMAL(38,10) in DS. The target is also NUMBER.
by radarada
Sun Nov 30, 2008 10:56 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Cannot read large number value
Replies: 7
Views: 2804

Cannot read large number value

DS is reading from an Oracle table that contains a NUMBER data type. One of the records in this field is a large number of 30 7's: 777777777777777777777777777777. I cannot get DS to read this value in and load it as a Decimal (38,10). The only way for the value to...
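The arithmetic behind the failure can be checked directly: DECIMAL(38,10) reserves 10 of its 38 significant digits for the fractional part, leaving only 28 for the integer part, and the value above has 30 integer digits. A minimal Python sketch (not DataStage code):

```python
from decimal import Decimal

value = Decimal("7" * 30)  # the 30-sevens value from the post

# DECIMAL(38,10): 38 significant digits total, 10 after the decimal point,
# so at most 38 - 10 = 28 digits are available before the point.
max_integer_digits = 38 - 10

integer_digits = len(value.as_tuple().digits)
print(integer_digits, "integer digits vs a limit of", max_integer_digits)
# 30 integer digits vs a limit of 28, so the value cannot be represented
```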
by radarada
Fri Nov 07, 2008 12:42 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: metadata mismatch
Replies: 2
Views: 1754

Right, but the error is telling me that DS is reading the source as a string 32 and I am trying to put it into a string 1. The source has a char(1) and my job has a char(1). Where is it getting the string 32?
by radarada
Fri Nov 07, 2008 12:14 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: metadata mismatch
Replies: 2
Views: 1754

metadata mismatch

The source metadata was imported through the orchestrate schema. The field in the table is defined as CHAR(1); however, when the job runs, the Oracle stage throws the error below: Implicit conversion from source type "string[max=32]" to result type "string[1]" This field does ...
by radarada
Wed Nov 05, 2008 11:35 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: decimal import error from text file
Replies: 10
Views: 5817

The file that was being created was adding the Unix ^M onto the end of each row. I was never able to find a way for DS to see the ^M as the final delimiter, so we had to ask the file's creator to be sure and remove the ^M prior to submitting.
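The cleanup the poster requested can be sketched as a small Python helper that strips the trailing carriage return (the ^M) from each row before the file is handed to DataStage; this is a stand-in for whatever the file's creator actually used, not the poster's solution:

```python
def strip_cr(line: bytes) -> bytes:
    """Remove the trailing carriage return (^M) from one row,
    normalizing the line ending to a bare newline."""
    return line.rstrip(b"\r\n") + b"\n"

# A sample row standing in for a line read from the problem file:
print(strip_cr(b"123.45|abc\r\n"))  # b'123.45|abc\n'
```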
by radarada
Mon Nov 03, 2008 8:18 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: decimal import error from text file
Replies: 10
Views: 5817

I can get the import to work with other data types such as FLOAT, DOUBLE, Int, BigInt, etc... (however it distorts the real number). For whatever reason the Decimal cannot be used. If I change the value to a varchar and bring it in, it arrives with an odd box at the end of the last character.
by radarada
Sun Nov 02, 2008 9:39 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: decimal import error from text file
Replies: 10
Views: 5817

If I view the file in TextPad, TOAD FTP, or import into Excel, no value exists at the end of the record. I have tried all the Final Delimiter options (none, end, whitespace). At a previous place we insisted all files end with a pipe delimiter, and I think it was for the same reason. I will move forward with tha...