Jobs getting stuck

Post questions here related to DataStage Server Edition, covering such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

So there is no other way: you need to check what comes after the 14M records. Try to split the file by some means for testing purposes, to find out what went wrong. Run the second half and see whether your machine reacts the same way.
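
For example, a minimal sketch in Python of splitting the file at the 14M-row mark for this kind of bisection test (the file names and the split point are assumptions based on this thread, not anything DataStage-specific):

    # split_file.py - split a flat file in two so each half can be run
    # separately; file names and SPLIT_AT are hypothetical.
    SPLIT_AT = 14_000_000  # rows in the first half (assumed from the thread)

    with open("input_30m.txt", "r", errors="replace") as src, \
         open("first_half.txt", "w") as first, \
         open("second_half.txt", "w") as second:
        for row_no, line in enumerate(src, start=1):
            # everything up to and including SPLIT_AT goes to the first file
            (first if row_no <= SPLIT_AT else second).write(line)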
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
asitagrawal
Premium Member
Posts: 273
Joined: Wed Oct 18, 2006 12:20 pm
Location: Porto

Post by asitagrawal »

I agree with you...
Since the problem appears only with this particular input set,
the initial suspicion is that the data is corrupt.

Whenever I attempt to split the file, or to read the data past the 14M mark, using Ascential,
the problem appears, so I am now splitting the file with MS-DOS commands instead, which has also been done. Today I will run the test again and get back with the results!
asitagrawal
Premium Member
Posts: 273
Joined: Wed Oct 18, 2006 12:20 pm
Location: Porto

Post by asitagrawal »

The data was found to be corrupt... but DataStage was not logging any metadata problem while reading through the 30M file.
After splitting the data, a warning was logged.
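
As a rough illustration, here is a hedged sketch of scanning a delimited flat file for rows whose field count does not match the expected metadata (the delimiter, expected field count, and file name are assumptions for the example, not taken from the actual job):

    # scan_corrupt.py - report rows whose field count differs from the
    # table definition; DELIMITER and EXPECTED_FIELDS are hypothetical.
    DELIMITER = "|"
    EXPECTED_FIELDS = 12

    with open("second_half.txt", "r", errors="replace") as src:
        for row_no, line in enumerate(src, start=1):
            fields = line.rstrip("\n").split(DELIMITER)
            if len(fields) != EXPECTED_FIELDS:
                print(f"row {row_no}: {len(fields)} fields, expected {EXPECTED_FIELDS}")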
Share to Learn, and Learn to Share.
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

Maybe the processed rows were so huge in number that the process consumed the full paging space and couldn't even log the error. :wink:
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'