
Errors only when running job using dsjob

Posted: Mon Aug 31, 2015 8:22 pm
by babbu9
Hi
I have designed a DataStage job that runs fine without any warnings, but when I run the same job through the unix command line (dsjob) it produces multiple warnings and aborts.

The warnings say that a certain field value is null and the record is dropped.

Why does this happen only when the job is run through dsjob?

Also, how do I make sure that a job is in a runnable state before I start it through dsjob? I want the jobs to run regardless of whether they ran successfully the last time.

Please inform.

Posted: Mon Aug 31, 2015 10:29 pm
by chulett
Don't have any thoughts on the first question right now, but on the second one about making sure a job is in a runnable state - how are your scripting skills? Typically one would build a 'wrapper' script to run any DataStage job, and part of that would be to check the current status of the job using -jobinfo and then do a RESET before the RUN if needed.

I believe there are a number of example scripts people have posted here over the years you could search for...
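A minimal sketch of such a wrapper, for illustration only: the function and variable names are made up, and the exact status strings that `dsjob -jobinfo` prints can vary by version, so treat the patterns below as assumptions to adjust against your own `-jobinfo` output.

```shell
#!/bin/sh
# Hypothetical wrapper sketch. Assumes `dsjob -jobinfo` prints a line such as
# "Job Status : RUN FAILED (3)" for an aborted job - verify on your system.

# Return 0 (true) if the -jobinfo output suggests the job needs a RESET first.
needs_reset() {
    case "$1" in
        *"RUN FAILED"*|*CRASHED*|*STOPPED*|*"NOT RUNNABLE"*) return 0 ;;
        *) return 1 ;;
    esac
}

# Reset the job if needed, then run it and wait for completion.
run_job() {
    project="$1"; job="$2"
    status=$(dsjob -jobinfo "$project" "$job" 2>/dev/null)
    if needs_reset "$status"; then
        dsjob -run -mode RESET -wait "$project" "$job"
    fi
    dsjob -run -wait "$project" "$job"
}
```

You would then call `run_job MYPROJECT MyJob` from your scheduler instead of invoking `dsjob -run` directly.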

Posted: Mon Aug 31, 2015 11:48 pm
by ray.wurlod
Presumably the file name, or part of it, is parameterised. Examine the job log to ensure that the parameter value is being passed correctly from the dsjob command.

Make sure, too, that you're comparing apples with apples - that both runs are processing the same file. Also verify how null handling is configured in the Sequential File stage.
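For reference, parameters are passed on the dsjob command line with `-param name=value`; the project, job, and parameter names below are made up for illustration (the sketch only builds and prints the command rather than executing it):

```shell
#!/bin/sh
# Hypothetical example of passing a file-name parameter to a job via dsjob.
# SOURCE_FILE must match the parameter name defined in the job's properties.
project="MYPROJ"
job="LoadCustomers"
file="/data/in/customers.dat"

cmd="dsjob -run -param SOURCE_FILE=$file -wait $project $job"
echo "$cmd"
```

After the run, the job log (Director, or `dsjob -logsum`) should show the resolved parameter value; if it is blank there, the nulls are coming from the job never finding the file rather than from the data itself.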