Search found 137 matches

by G SHIVARANJANI
Thu Jul 26, 2007 7:31 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Records Drop Due to NULL
Replies: 14
Views: 3755

Re: Records Drop Due to NULL

and these columns are nullable columns.
by G SHIVARANJANI
Thu Jul 26, 2007 7:29 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Records Drop Due to NULL
Replies: 14
Views: 3755

Records Drop Due to NULL

Hi, I am loading a table with data coming from a CSV file. It has null values in it. The column that has nulls is being used in a stage variable and in constraints; if it is not used in constraints or stage variables, the records are never dropped, but if it is used, they are. I have used IsNull to check if the colu...
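A minimal Python sketch of the behaviour described above (the column value, constraint logic, and names are illustrative, not the actual job): a nullable column used in a constraint without a null check rejects the row when the value is NULL, while an IsNull-style guard keeps it.

```python
def keep_row(col):
    """Transformer-constraint stand-in; `col` may be None (NULL)."""
    if col is None:          # the IsNull() guard the poster mentions
        return True          # decide explicitly what NULL should do
    return col != "SKIP"     # the original constraint logic (illustrative)

rows = ["A", None, "SKIP", "B"]
print([r for r in rows if keep_row(r)])  # ['A', None, 'B']
```

Without the `col is None` branch, the NULL row would be rejected by the constraint rather than passed through, which matches the drops being reported.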
by G SHIVARANJANI
Tue Jul 24, 2007 10:35 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: RowNumberColumn
Replies: 3
Views: 1348

Hi, I have tried putting the start position and default value as 1, but it didn't work out. It starts with 0... and as I am taking this column as a key column for a join, the Join stage gives an error saying: "partitioner requires a key column". Note: I have put up hash partitioning with key as r...
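A sketch of one workaround in Python (in DataStage this would be a downstream Transformer derivation such as RowNum + 1; the names here are assumptions, not the actual job): shift the 0-based generated row number so the key column starts at 1.

```python
def one_based_row_numbers(rows):
    """Attach a 1-based row number to each record,
    mimicking a 0-based generated column shifted by +1."""
    return [(i, row) for i, row in enumerate(rows, start=1)]

print(one_based_row_numbers(["a", "b", "c"]))
# [(1, 'a'), (2, 'b'), (3, 'c')]
```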
by G SHIVARANJANI
Tue Jul 24, 2007 10:12 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: RowNumberColumn
Replies: 3
Views: 1348

RowNumberColumn

Hi,


How do I initialise RowNumberColumn to 1 instead of the default value 0?

Thank you,
by G SHIVARANJANI
Mon Jul 23, 2007 9:04 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Problem with sequence
Replies: 4
Views: 1579

Problem with job and sequence parameters

The sequence parameter had an extra space in it, which I overlooked.

It's working fine now.

Thank you.
chulett wrote:Post the actual errors. ...
by G SHIVARANJANI
Mon Jul 23, 2007 9:03 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Problem with sequence
Replies: 4
Views: 1579

Problem with job and sequence parameters

The sequence parameter had an extra space in it, which I overlooked.

It's working fine now.

Thank you.
chulett wrote:Post the actual errors. ...
by G SHIVARANJANI
Mon Jul 23, 2007 6:31 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Problem with sequence
Replies: 4
Views: 1579

Problem with sequence

Hi, I have a sequence which gets aborted when run. It says this is due to unrecoverable errors, so I recompiled the jobs used in the sequence. The log of the job used in the sequence says "could not open the file", but when that job is run individually it runs fine. What could be the problem th...
by G SHIVARANJANI
Mon Jul 23, 2007 4:08 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Please suggest
Replies: 6
Views: 2478

I can see that this can be achieved by using a Transformer, but that works fine only with two columns.

It gets complex with the three-column data, as shown.

ray.wurlod wrote:Search the forum for "vertical pivot", which is what you are trying to do. There are several ways to effect this technique. ...
by G SHIVARANJANI
Sun Jul 22, 2007 10:31 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Please suggest
Replies: 6
Views: 2478

Please suggest

Hi,


I have some records in this manner,

A,AT1,GT1
A,AT1,GT2
A,AT2,GT1
A,AT2,GT2
A,AT3,GT1
A,AT3,GT2

OR

A,AT1,GT1
A,AT1,GT2
A,AT1,GT3
A,AT2,GT1
A,AT2,GT2
A,AT2,GT3




And I want the resultant records as

A,AT1,AT2,AT3,GT1,GT2 and

A,AT1,AT2,GT1,GT2,GT3


Please suggest.
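A hedged sketch of the vertical pivot being asked for, in Python rather than a DataStage Transformer (function and variable names are illustrative): group on the key column and collect the distinct values of the second and third columns in first-seen order, then emit one row per key.

```python
def vertical_pivot(rows):
    """(key, attr, grp) rows -> one row per key:
    key, distinct attr values, then distinct grp values,
    each in first-seen order."""
    # Plain dicts preserve insertion order in Python 3.7+,
    # so they double as ordered sets here.
    groups = {}
    for key, attr, grp in rows:
        attrs, grps = groups.setdefault(key, ({}, {}))
        attrs[attr] = None
        grps[grp] = None
    return [[key, *attrs, *grps] for key, (attrs, grps) in groups.items()]

rows = [
    ("A", "AT1", "GT1"), ("A", "AT1", "GT2"),
    ("A", "AT2", "GT1"), ("A", "AT2", "GT2"),
    ("A", "AT3", "GT1"), ("A", "AT3", "GT2"),
]
print(vertical_pivot(rows))  # [['A', 'AT1', 'AT2', 'AT3', 'GT1', 'GT2']]
```

The same function handles the second example (AT1/AT2 crossed with GT1/GT2/GT3) without change, since it simply de-duplicates each column per key.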
by G SHIVARANJANI
Sun Jul 22, 2007 10:26 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Records Dropped
Replies: 16
Views: 5710

Re: Records Dropped

OK,
I have done this.

Thank you.
Zhang Bo wrote:You should change both the input and output nullable to YES, and this warning must be removed
by G SHIVARANJANI
Sun Jul 22, 2007 9:14 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Records Dropped
Replies: 16
Views: 5710

Hi, actually I am moving the data from a table to a text file, so this makes it work. Thank you. I don't think that's the right way to do it, because you are passing a SPACE character (' ') when it is null. This would need you to put in logic which converts the SPACE into a null before it inse...
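A sketch of the follow-up logic the quoted reply is asking for (the function name is hypothetical, not a DataStage routine): undo the single-space placeholder before inserting into the target, so the target really stores NULL.

```python
def space_to_null(value):
    """The NVL(..., ' ') query substitutes a single space for NULL dates;
    map that placeholder (and empty strings) back to None / NULL."""
    return None if value is None or value.strip() == "" else value

print(space_to_null(" "))         # None
print(space_to_null("20070721"))  # 20070721
```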
by G SHIVARANJANI
Sat Jul 21, 2007 8:32 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Records Dropped
Replies: 16
Views: 5710

Hi, it worked fine when I gave the query as: SELECT NVL(TO_CHAR(TRUNC(LAST_UPDATE_DATE),'YYYYMMDD'),' ') LAST_UPDATE_DATE FROM <<TABLE>> Can you give us the design of the job? It would make the task of fixing up this warning easier. But my guess is that you are using this column --> 'LAST_UPDATE_DATE...
by G SHIVARANJANI
Sat Jul 21, 2007 4:47 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Records Dropped
Replies: 16
Views: 5710

Re: do not understand why this problem occurs

I have put that up as well.


vikasjawa wrote:Hi
Just try setting the nullable property of the column to "Yes". Hopefully this would work.
by G SHIVARANJANI
Sat Jul 21, 2007 1:10 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Records Dropped
Replies: 16
Views: 5710

do not understand why this problem occurs

When there's a null value in one of the columns in a table, and that column's value is used in a Transformer or any other stage,

the row which has the null value in that column is being dropped.

Please help.
by G SHIVARANJANI
Fri Jul 20, 2007 4:09 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Records Dropped
Replies: 16
Views: 5710

I am not using it anywhere.

I'll change the query.

balajisr wrote:Are you using the field 'LAST_UPDATE_DATE' in the transformer?

If so, post the derivation in which you had used the above field.