Job with Merge Stage Abort

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

thurmy34
Premium Member
Posts: 198
Joined: Fri Mar 31, 2006 8:27 am
Location: Paris

Job with Merge Stage Abort

Post by thurmy34 »

Hi All
I am trying to merge two files (100 Mo / 600 Go); the join type is COMPLETE SET.
After processing roughly 500,000 rows, the job aborts with these messages:

Code:

Function 'DSCSaveRowToBuf'  failed
Function 'get_next_output_row'  failed Arithmetic overflow
The job works fine with smaller files.
Is it a size problem, or could it be a data problem?

Thank you
Hope This Helps
Regards
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

I don't know what could be causing that offhand. Does it abort at exactly the same row each time (even if the data sort order is different)?
"600 Go" does that mean Gigabyte?
What is your output stage after the merge stage?
thurmy34
Premium Member
Posts: 198
Joined: Fri Mar 31, 2006 8:27 am
Location: Paris

Post by thurmy34 »

One of the files was 1 Go and is now 600 Mo (sorry about that).
The job aborts at the same row, but I haven't tried another sort order yet (I'll try and keep you posted).
The first stage after the merge is a Transformer that selects the rows to send to a hashed file.
I tried with a Sequential File stage after the merge and the job still aborts.
Hope This Helps
Regards
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Stick with a sequential file output for the moment to simplify the problem. Do "Go" and "Mo" mean "gigabyte" and "megabyte" or something else?
How big is the output file when the job aborts?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

The Server Merge stage uses hashed files under the covers to do the actual work, so size could very well be the issue. How 'wide' are your records?
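That point can be made concrete with a back-of-the-envelope check (a Python sketch, not DataStage code): a 32-bit hashed file tops out at roughly 2 GB, so wide records can overflow the Merge stage's intermediate storage even when the source files look modest. The per-row size and overhead factor below are assumptions for illustration.

```python
# Rough estimate of intermediate hashed-file size for a Server Merge.
# The ~30% overhead factor and the 4 KB row width are assumptions.

def hashed_file_estimate_bytes(row_count: int, avg_row_bytes: int,
                               overhead: float = 1.3) -> int:
    """Estimate hashed-file size from row count and average row width."""
    return int(row_count * avg_row_bytes * overhead)

LIMIT_32BIT = 2 * 1024 ** 3  # ~2 GB ceiling for a 32-bit hashed file

# Example: 500,000 rows processed before the abort, guessing ~4 KB per row
estimate = hashed_file_estimate_bytes(500_000, 4096)
print(estimate, estimate > LIMIT_32BIT)
```

With those guessed numbers the estimate already exceeds the 32-bit ceiling, which would be consistent with an abort partway through the merge.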
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

(Yes, Mo = megabyte and Go = gigabyte. These abbreviations are fairly common, particularly in southern Europe.)
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
thurmy34
Premium Member
Posts: 198
Joined: Fri Mar 31, 2006 8:27 am
Location: Paris

Post by thurmy34 »

Hi All
First of all, I'm sorry for not answering earlier.
My job works fine now; here are the modifications:
I double-checked the format of the data: two columns were defined incorrectly in the Merge stage.
I reduced the size of the rows (and of the files) by selecting only the useful columns when building the files.
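The second fix, keeping only the useful columns when building the input files, can be sketched outside DataStage as a simple column-pruning pass. The column names below are hypothetical.

```python
import csv

# A sketch of the column-pruning fix described above, written in Python
# rather than DataStage; the column names are made up for illustration.
KEEP = ["customer_id", "amount"]  # assumed "useful" columns

def trim_columns(src_path: str, dst_path: str, keep=KEEP) -> None:
    """Copy src_path to dst_path, keeping only the columns in `keep`."""
    with open(src_path, newline="") as src, \
         open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=keep)
        writer.writeheader()
        for row in reader:
            writer.writerow({col: row[col] for col in keep})
```

Dropping unused columns at build time shrinks every row, which in turn keeps the Merge stage's intermediate hashed file under its size limit.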

Thank you for your help