Search found 222 matches

by rsunny
Tue Jun 19, 2012 8:25 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Numeric value issue
Replies: 8
Views: 3837

Hi Craig, I did a search on "scientific notation" but wasn't able to find a solution. One solution I have in mind: for example, if I am getting the value as 7.61778e+06, I can take the number before 'e' and multiply it by 10 raised to the number after the '+', which is '6', i.e. 7.61778*10^6. Can ...
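That mantissa-times-power-of-ten idea can be sketched outside DataStage. This is a minimal Python illustration (not a DataStage transformer expression); the function name `sci_to_plain` is my own:

```python
from decimal import Decimal

def sci_to_plain(value: str) -> str:
    # Split "7.61778e+06" into mantissa "7.61778" and exponent "+06",
    # then compute mantissa * 10^exponent, as described above.
    mantissa, exponent = value.lower().split("e")
    # Decimal avoids binary floating-point rounding in the result.
    result = Decimal(mantissa) * (Decimal(10) ** int(exponent))
    return str(result.quantize(Decimal(1)))

print(sci_to_plain("7.61778e+06"))  # 7617780
```

In DataStage itself the cleaner fix is usually to keep the sequence number in an integer column (or convert it with a decimal/string conversion) so it never passes through a double and picks up scientific notation in the first place.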
by rsunny
Tue Jun 19, 2012 4:36 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Numeric value issue
Replies: 8
Views: 3837

Numeric value issue

Hi, I am trying to generate a unique sequence number and append that value to a string; the data type of that column is varchar. For example, the value should be 'ABC_123', but in my scenario the value is populating as 'abc_7.61778e+06' instead of 'abc_7617780'. Can you please help me to get the...
by rsunny
Thu Jun 07, 2012 6:46 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Consumed more than 100000 bytes looking for record delimite
Replies: 9
Views: 4054

Hi,

When I do wc -l filename I get the value 3028799.

Even if I use a reject link on the Sequential File stage, the job still gets aborted.

Is there any possible solution to reject that record instead of aborting the job?
by rsunny
Wed Jun 06, 2012 8:31 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Consumed more than 100000 bytes looking for record delimite
Replies: 9
Views: 4054

Hi Craig,

I set the record delimiter to UNIX newline and ran the job, but it still aborted.
by rsunny
Wed Jun 06, 2012 4:47 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Activity count zero for Stream Operator update or delete
Replies: 7
Views: 5354

The only way I was able to resolve it was to separate the links for Update and Insert instead of using Upsert.
by rsunny
Wed Jun 06, 2012 4:44 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Consumed more than 100000 bytes looking for record delimite
Replies: 9
Views: 4054

Consumed more than 100000 bytes looking for record delimite

Hi, when I try to run the job I am getting the error "Consumed more than 100000 bytes looking for record delimiter; aborting". The source is a Sequential File stage with Final Delimiter: end, Field Delimiter: comma, Null Field Value: " and Quote: double. I have created a user-defined varia...
by rsunny
Tue Jun 05, 2012 7:07 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Activity count zero for Stream Operator update or delete
Replies: 7
Views: 5354

Is there any possibility of avoiding that error through any settings in the Teradata Connector stage?
by rsunny
Mon Jun 04, 2012 9:15 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Activity count zero for Stream Operator update or delete
Replies: 7
Views: 5354

The job aborted due to the below error in the log:

Activity count zero for Stream Operator update or delete. (CC_TeraAdapter::selectStreamErrorTable1, file CC_TeraAdapter.cpp, line 7,599)
by rsunny
Mon Jun 04, 2012 7:39 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Activity count zero for Stream Operator update or delete
Replies: 7
Views: 5354

Can anyone please let me know the solution for the above issue?
by rsunny
Sun Jun 03, 2012 4:56 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Activity count zero for Stream Operator update or delete
Replies: 7
Views: 5354

Activity count zero for Stream Operator update or delete

Hi, my target (Teradata) is Upsert, the Access Method is Bulk, and the Load Type is Stream. My source is Teradata and the lookup is a file. When I try to truncate the target table and run the job, I am getting the error "[IIS-CONN-TERA-005003] RDBMS code 9903: Activity count zero for Stream Operat...
by rsunny
Sat Sep 10, 2011 12:09 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Record Dropped
Replies: 18
Views: 10196

I have provided the condition If (IsNull(DSLink18.sourcerow) Or DSLink18.sourcerow='') Then 1 Else DSLink18.sourcerow, but I am still getting the below error:

Transformer_40,0: Field 'sourcerow' from input dataset '0' is NULL. Record dropped.
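The condition above is a null-or-empty coalesce: substitute a default when the field is NULL or an empty string. As a minimal sketch outside DataStage (Python, with `None` standing in for NULL; the function name is my own):

```python
def coalesce_sourcerow(value, default=1):
    # Mirrors: If (IsNull(col) Or col='') Then default Else col
    if value is None or value == '':
        return default
    return value

print(coalesce_sourcerow(None))       # 1
print(coalesce_sourcerow(''))         # 1
print(coalesce_sourcerow('abc_123'))  # abc_123
```

Note that in a parallel Transformer this substitution must happen before the nullable column is referenced anywhere else in a derivation; otherwise the row can still be dropped by the operator that evaluates the raw column.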
by rsunny
Sun Sep 04, 2011 6:09 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Record Dropped
Replies: 18
Views: 10196

Ravi.K wrote: Show us how you have handled Null.

IF ISNULL(DSLink18.copysno) THEN NULLTOVALUE(DSLink18.copysno,1) ELSE DSLink18.copysno, or: IF DSLink18.copysno='' THEN '1' ELSE DSLink18.copysno
by rsunny
Sun Sep 04, 2011 12:25 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Record Dropped
Replies: 18
Views: 10196

I disabled operator combination and got the below error: Transformer_40,0: Field 'sourcerow' from input dataset '0' is NULL. Record dropped. I am getting the error in the transformer; I tried using null handling functions but am still getting the error. Any help is really appreciated. Thanks in adva...
by rsunny
Sat Sep 03, 2011 2:13 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Record Dropped
Replies: 18
Views: 10196

Re: Record Dropped

soumya5891 wrote: Is the field on which you are checking for null a varchar?

Yes. I checked with data type "Double" also.
by rsunny
Sat Sep 03, 2011 11:21 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Record Dropped
Replies: 18
Views: 10196

Record Dropped

Hi, I have a small job where Null values pass through a Transformer and then to a Sequential File. I have handled the Null values in stage variables in the Transformer using IsNull(Column), NullToValue(), NullToZero(), and Column='', but I am still getting the below error: "APT_CombinedOp...