If you are not constrained to using just a single job,
then the preferable solution would be to write the values to an intermediate dataset in JOB 1,
and in JOB 2 append the intermediate dataset's data to the required dataset.
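Outside DataStage, the same two-job pattern can be sketched in Python (file names, row shapes, and function names here are hypothetical, not part of the original post): JOB 1 writes its rows to an intermediate file, and JOB 2 appends them to the required dataset.

```python
import csv

def job1_write_intermediate(rows, path):
    """JOB 1: write the values to an intermediate dataset."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(rows)

def job2_append_to_target(intermediate_path, target_path):
    """JOB 2: append the intermediate data to the required dataset."""
    with open(intermediate_path, newline="") as src, \
         open(target_path, "a", newline="") as dst:
        csv.writer(dst).writerows(csv.reader(src))
```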
Search found 11 matches
- Thu Apr 19, 2012 10:46 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Use Virtual dataset as reference in lookup
- Replies: 2
- Views: 4014
- Thu Apr 19, 2012 10:36 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: An unidentified error has occurred in lookup stage
- Replies: 2
- Views: 3215
Re: An unidentified error has occurred in lookup stage
Hi All, I am trying to compile my job but it says "An unidentified error has occurred." When I clicked the Show Error button, it points to the Lookup stage, in which I am looking up on two fields, making a reject link on lookup failure, and writing the rejects to a sequential file. I tried to...
- Thu Apr 19, 2012 10:26 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: LastRowInGroup() function not working all of a sudden
- Replies: 6
- Views: 6124
Re: LastRowInGroup() function not working all of a sudden
Try converting your sort / partition keys to not nullable and then try again.
- Wed Apr 18, 2012 3:01 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: DB2 connector stage -911 deadlock error
- Replies: 8
- Views: 11560
Re: DB2 connector stage -911 deadlock error
- It is better to run the updates, if any, in serial mode
OR
- Perform the inserts and updates in two different jobs
OR
- Hash-partitioning the data on the key before the load might also help avoid parallel updates on the same rows
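The hash-partitioning option can be sketched in plain Python (illustrative only; the function and the choice of key column are made up): rows that share an update key always land in the same partition, so no two parallel update streams touch the same database row, which is one way to avoid the -911 deadlock described above.

```python
def hash_partition(rows, key_index, num_partitions):
    """Distribute rows into partitions by hashing the key column.

    All rows with the same key hash to the same partition, so parallel
    workers (one per partition) never update the same row concurrently.
    """
    partitions = [[] for _ in range(num_partitions)]
    for row in rows:
        partitions[hash(row[key_index]) % num_partitions].append(row)
    return partitions
```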
- Wed Apr 18, 2012 2:51 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Find the character in a string
- Replies: 11
- Views: 16859
Re: Find the character in a string
Try using the Field function with "TS" as the delimiter string.
Later you can concatenate the second part of your string:
Ex.
Link.Column[1,2] : Field(Link.Column, "TS", 2)
Let me know if it works out...
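For reference, a Python sketch of what that expression computes (the function name and sample value are made up for illustration): keep the first two characters of the column, then append the second "TS"-delimited field.

```python
def first_two_plus_second_part(value):
    """Equivalent of Link.Column[1,2] : Field(Link.Column, "TS", 2):
    the first two characters, concatenated with the text between the
    first and second "TS" delimiters."""
    parts = value.split("TS")
    second = parts[1] if len(parts) > 1 else ""
    return value[:2] + second
```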
- Wed Apr 18, 2012 2:46 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Modify Stage Issue
- Replies: 11
- Views: 6770
Re: Modify Stage Issue
Could you make sure that the input column passed to the Modify stage is not nullable?
- Wed Apr 18, 2012 2:41 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Invalid timestamp value
- Replies: 14
- Views: 11364
- Wed Apr 18, 2012 2:38 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: trim function not working
- Replies: 4
- Views: 4413
Re: trim function not working
Simply assign the value to a stage variable of type VARCHAR(2) and use it, with the Trim function, in the output column of type VARCHAR(2).
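Roughly what that suggestion does, sketched in Python (the truncation to two characters is my reading of the VARCHAR(2) width, not something the original post states):

```python
def trim_to_varchar2(value):
    """Trim surrounding whitespace, then keep at most two characters,
    mimicking assignment into a VARCHAR(2) stage variable (assumed)."""
    return value.strip()[:2]
```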
- Wed Apr 18, 2012 2:36 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: decimal issue
- Replies: 7
- Views: 3087
Re: decimal issue
In fact, try assigning your string value to a decimal-type stage variable...
DataStage would handle the conversion implicitly.
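A Python sketch of that implicit string-to-decimal conversion, using the standard decimal module in place of a decimal stage variable (the scale of 2 is an assumption for illustration):

```python
from decimal import Decimal, ROUND_HALF_UP

def to_decimal(value, scale=2):
    """Convert a string value to a fixed-scale decimal, the way a
    decimal stage variable would coerce an assigned string."""
    return Decimal(value).quantize(Decimal(10) ** -scale,
                                   rounding=ROUND_HALF_UP)
```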
- Wed Apr 18, 2012 2:31 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: decimal issue
- Replies: 7
- Views: 3087
Re: decimal issue
You may try DecimalToDecimal() in the stage variables and assign the variable's value to the column.
- Wed Apr 18, 2012 2:25 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Unable to insert records into oracle table
- Replies: 4
- Views: 2895
Re: Unable to insert records into oracle table
By default it should insert both records.
- Make sure you are not setting a unique constraint / key constraint on the columns, which might reject the duplicates.
- Have you used Sort + Aggregate in the job before the load operation?
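Why that last question matters: a Sort + Aggregate step groups rows on the key, so only one of two duplicate records would survive to reach the Oracle insert. A rough Python sketch (the function name and use of the first column as the key are hypothetical):

```python
def sort_and_dedup(rows, key_index):
    """Sort on the key and keep one row per key, the way a
    Sort + Aggregate (group-by-key) step would collapse duplicates."""
    seen = set()
    result = []
    for row in sorted(rows, key=lambda r: r[key_index]):
        if row[key_index] not in seen:
            seen.add(row[key_index])
            result.append(row)
    return result
```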