Search found 452 matches

by kaps
Fri Mar 18, 2011 9:53 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading DB2 sequencers during Bulk Load....
Replies: 13
Views: 5386

Reading DB2 sequencers during Bulk Load....

We have a job which does inserts into a table, reading from a file. The job design is this... SeqFile----LookupStage----Xformer----DB2APIStage. This job is failing because of this error... Lukup,0: Could not map table file "/datastage/prd/Datasets/lookuptable.20110318.feldurb (size 1134819352 bytes)&quo...
by kaps
Mon Mar 14, 2011 1:47 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Issue in loading data using DB2UDB Enterprise Stage...
Replies: 4
Views: 2700

There are only two columns coming through... One is BigInt and the other is Varchar(1000), which is the same as the target. As I said before, there are no null values in either column.

The only thing is that no key is defined in the target table. Is that an issue?
by kaps
Mon Mar 14, 2011 1:16 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Issue in loading data using DB2UDB Enterprise Stage...
Replies: 4
Views: 2700

Issue in loading data using DB2UDB Enterprise Stage...

We are having issues while trying to load data using the DB2UDB Enterprise stage. The job design is as follows: DB2APIstage---Xfmer---Remove Duplicate----Db2UDBEnterprise. We get the following error: main_program: DB2 Get Table Partitioning Information. main_program: SQLCODE = -303; SQLSTATE=42806 main_program:...
by kaps
Tue Mar 01, 2011 5:15 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Table with varying number of columns...
Replies: 4
Views: 1785

I am going to go with one big column to hold the values. Thanks for the reply.
by kaps
Tue Mar 01, 2011 5:13 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to achieve the below logic in DataStage
Replies: 6
Views: 5446

Use a Transformer and a Remove Duplicates stage. In the Transformer, have stage variables to store the previous code and key values, and sort the input.
Whenever the keys are the same, keep appending to the stage variable. In the Remove Duplicates stage, get the last record from the group.
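Outside DataStage, the logic those stage variables implement can be sketched roughly like this (a minimal Python sketch with hypothetical data, not Transformer code): sort on the key, keep appending values while the key repeats, and keep only the last row per group, which is what the Remove Duplicates stage does when set to retain the last record.

# Minimal sketch (not DataStage code) of the stage-variable append logic
from itertools import groupby
from operator import itemgetter

rows = [("K1", "A"), ("K2", "X"), ("K1", "B"), ("K1", "C")]  # hypothetical (key, value) input

rows.sort(key=itemgetter(0))                  # sort the input on the key
result = []
for key, group in groupby(rows, key=itemgetter(0)):
    appended = ""
    for _, value in group:                    # stage variable keeps appending while the key repeats
        appended += value
    result.append((key, appended))            # equivalent to keeping the last record of the group

print(result)  # [('K1', 'ABC'), ('K2', 'X')]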
by kaps
Mon Feb 28, 2011 11:33 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Table with varying number of columns...
Replies: 4
Views: 1785

The table structure is: Key, Val1, Val2, Val3, Val4, ... As of now, per the data, I can see a maximum of 78 columns. I can achieve this by creating a table with 100 columns and then populating it, but it will be an issue if the number of columns exceeds 100. So I am looking for a way where it has to take the max o...
by kaps
Mon Feb 28, 2011 4:50 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Table with varying number of columns...
Replies: 4
Views: 1785

Table with varying number of columns...

Hi, I have a requirement that says I have to first pivot a table and then insert the rows into another table. Let us say I have two columns in the input table: ID, Value. One ID can have up to 80 records in the table. I have used the vertical pivot technique to flatten the values out. Now in my outp...
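For what it's worth, the vertical pivot described above can be sketched outside DataStage roughly as follows (a Python sketch with hypothetical data, not the actual Pivot stage): collapse (ID, Value) rows into one row per ID, spreading the values across a fixed number of columns and padding the rest, which is why the pivoted table needs a column count at least as large as the biggest group.

# Rough sketch of a vertical pivot: (ID, Value) rows -> Key, Val1..ValN
from collections import defaultdict

MAX_COLS = 100                              # fixed width the target table must allow for
rows = [(1, "a"), (1, "b"), (2, "x")]       # hypothetical (ID, Value) input

groups = defaultdict(list)
for id_, value in rows:
    groups[id_].append(value)

for id_, values in groups.items():
    padded = values + [None] * (MAX_COLS - len(values))   # pad the unused Val columns
    print(id_, padded[:4])                  # 1 ['a', 'b', None, None] / 2 ['x', None, None, None]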
by kaps
Tue Feb 15, 2011 9:44 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: What's happening when we use join stage instead of lookup ?
Replies: 6
Views: 8824

The Scratch and Resource disks point to /datastage on the server, which has over 200 GB left. /tmp has almost 1 GB left. The server has 8 GB of memory.

The table has 23 million records, and I am just selecting two fields from it.
by kaps
Mon Feb 14, 2011 5:57 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: What's happening when we use join stage instead of lookup ?
Replies: 6
Views: 8824

First, thanks for the replies. Andy/Ray - I chose AUTO so that it does the sorting and key partitioning by itself. So can I say that "when we select AUTO for the partition method, the lookup records get loaded in memory" before moving to downstream stages? If that's the case, then how does it dif...
by kaps
Mon Feb 14, 2011 12:15 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: What's happening when we use join stage instead of lookup ?
Replies: 6
Views: 8824

What's happening when we use join stage instead of lookup ?

I am just wondering what's happening in the background when we use the Join stage instead of the Lookup stage. We had a failure saying insufficient disk space when we used the Lookup stage to do a lookup against a DB2 table. Though it says it's a space issue, it's not, as we have a lot of space left and also a lot of me...
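As a conceptual contrast only (a hedged Python sketch with made-up data, not how the parallel engine is actually implemented): a lookup builds the entire reference set in memory and probes it row by row, while a sort-merge style join streams two inputs sorted on the key, so only a small window of each input is needed at a time.

# Conceptual sketch only -- not the DataStage implementation.
def lookup(stream, reference):
    ref = {key: val for key, val in reference}          # whole reference set held in memory
    return [(key, val, ref.get(key)) for key, val in stream]

def merge_join(left, right):
    # Simplified sort-merge join: both inputs sorted on the key, one match per key.
    left, right = sorted(left), sorted(right)
    out, j = [], 0
    for lkey, lval in left:
        while j < len(right) and right[j][0] < lkey:
            j += 1
        if j < len(right) and right[j][0] == lkey:
            out.append((lkey, lval, right[j][1]))
    return out

print(lookup([("a", 1)], [("a", "x")]))      # [('a', 1, 'x')]
print(merge_join([("a", 1)], [("a", "x")]))  # [('a', 1, 'x')]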
by kaps
Tue Dec 14, 2010 9:58 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Could not load "V10S0_JobName"
Replies: 14
Views: 8263

Does anyone have a clue about this?
by kaps
Wed Dec 08, 2010 6:34 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Could not load "V10S0_JobName"
Replies: 14
Views: 8263

My db2nodes.cfg file looks like this... 1 pmudb05t 1 2 pmudb05t 2 My Apt_Config file looks like this... { node "node0" { fastname "pmetl05t" pools "" "node1" "pmetl05t" "mnode" resource disk "/datastage/tst5/dmttst5/Datasets" {pools...
by kaps
Wed Dec 08, 2010 11:06 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Could not load "V10S0_JobName"
Replies: 14
Views: 8263

Ray, I did as you suggested. I added an environment variable called LD_LIBRARY_PATH in Administrator, added the same in the job, and set its value to $UNSET. Now I don't see the variable in my job log, but I still get the same error. Anything else I need to check? Any input is appreciated. Th...