Node Problem in the job

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

ashik_punar
Premium Member
Posts: 71
Joined: Mon Nov 13, 2006 12:40 am

Node Problem in the job

Post by ashik_punar »

Hi Everyone,

My job has the following design. I read data from a sequential file and pass it through a Transformer stage. In the Transformer, based on an input column, I route the records to three outputs: one for inserting new records, one for deleting records, and one for updating existing records. Before inserting, I sort the records to remove duplicates. The job design looks something like this:

                               |----> ODBC Stage (Deletion of Records)
Sequential File -> Transformer |----> Sort Stage -> ODBC Stage (For Insertion)
                               |----> ODBC Stage (For Updation of Records)

I have 4 nodes. The problem is that when I use all 4 nodes, every 4th record does not get updated; when I use 3 of them, every 13th record does not get updated. The job runs fine when I use only one node.
Can anyone please tell me what the possible reason for this could be, and what steps I can take in order to run the job on all the available nodes?
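[Editor's note: a minimal sketch, not DataStage code, of why a multi-node problem shows up at a fixed stride. Assuming round-robin partitioning (the usual default for this kind of flow), record i lands on node i mod N, so a failure confined to one node surfaces as "every Nth record" in the data. The record IDs and node count below are hypothetical.]

```python
def round_robin_partition(records, n_nodes):
    """Assign each record to a node the way round-robin partitioning does:
    record at position i goes to node i mod n_nodes."""
    partitions = {node: [] for node in range(n_nodes)}
    for i, rec in enumerate(records):
        partitions[i % n_nodes].append(rec)
    return partitions

# Hypothetical record ids 1..20 spread over a 4-node configuration.
parts = round_robin_partition(list(range(1, 21)), 4)
# If one node's updates fail, the missing records form a fixed stride,
# e.g. node 3 holds records 4, 8, 12, 16, 20.
```

This matches the symptom: the stride changes with the node count (every 4th record on 4 nodes, a different stride on 3), and disappears entirely on a single node.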

Thanks in advance,
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

You mentioned a commit frequency of 500. If you change that to a commit frequency of 1 and run the job with a 4-way configuration, does the error disappear, remain the same, or change?
ashik_punar
Premium Member
Posts: 71
Joined: Mon Nov 13, 2006 12:40 am

Post by ashik_punar »

When I set the commit size to 100 with a 4-way run, it works fine. But with a commit size of 500 it does not. The total number of records that need to be updated is 1407.
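[Editor's note: a sketch of the arithmetic behind this result, under the assumption that each of the 4 partitions keeps its own commit counter. With 1407 rows round-robined over 4 nodes, each partition sees roughly 352 rows; a commit size of 500 is never reached mid-stream, so each partition's rows sit uncommitted until the end, while a commit size of 100 yields intermediate commits that make earlier inserts visible to later updates.]

```python
def uncommitted_tail(rows_per_partition, commit_size):
    """Rows still uncommitted on one partition when it runs out of input,
    i.e. before the final end-of-data commit."""
    full_batches = rows_per_partition // commit_size
    return rows_per_partition - full_batches * commit_size

rows_per_node = 1407 // 4  # ~352 rows per partition on a 4-node run
# commit size 500: no partition ever reaches the threshold mid-stream
# commit size 100: three intermediate commits per partition, tail of ~52 rows
```

This would explain why shrinking the commit size (or ArndW's suggested test with a commit frequency of 1) changes the behaviour: it is not the node count itself, but whether rows inserted on one partition are committed before another partition tries to update them.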