Job status is successful, even with fatal error in it

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

kishorenvkb
Participant
Posts: 54
Joined: Mon Dec 24, 2007 9:27 am

Job status is successful, even with fatal error in it

Post by kishorenvkb »

Hello,

We have a parallel job that had a fatal error in it (visible when we looked into the log), but the log still ended with a "Successfully Finished" statement. Why would it do that? Has anyone seen this before? Are there any configuration settings that I am missing?
Not all of our jobs ignore fatal errors; normally, if there is a fatal error, the job status shows Aborted.
The fatal error that I got, while the job still finished successfully, was:

"TDEnt_Store_Inventory,0: TeraGenericQuery Error: DB Call Failure(success check) Info = 0, Code = 2616, Message = Numeric overflow occurred during computation. Session return code = 2,616, DBC return code = 0
DBC message: Completed successfully.
TeraGenericQuery Error: DB Call Failure(success check) Info = 0, Code = 2616, Message = Numeric overflow occurred during computation. Session return code = 2,616, DBC return code = 0
DBC message: Completed successfully."
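For context, Teradata raises error 2616 whenever an arithmetic result exceeds the precision of the expression's data type. A minimal SQL illustration of how it can arise (the table and column names here are hypothetical, not from our job):

    -- SUM over an INTEGER column is computed as INTEGER in Teradata,
    -- so a sufficiently large total fails with 2616 (numeric overflow):
    SELECT SUM(qty) FROM store_inventory;

    -- Widening the operand before aggregating avoids the overflow:
    SELECT SUM(CAST(qty AS DECIMAL(38,0))) FROM store_inventory;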

Any help regarding this is greatly appreciated.
bcarlson
Premium Member
Posts: 772
Joined: Fri Oct 01, 2004 3:06 pm
Location: Minnesota

Post by bcarlson »

We ran into the same issue. Are you by any chance working with large decimals (>18 digits) in Teradata V2R6.2?

We are on DS v8.0.1. There was a patch that we had to install to support BIGINT and large decimals in the Teradata connector. That patch didn't resolve our large decimal errors, but it did make the job fail correctly when it encountered the 'numeric overflow' error.

Brad.
It is not that I am addicted to coffee, it's just that I need it to survive.
kishorenvkb
Participant
Posts: 54
Joined: Mon Dec 24, 2007 9:27 am

Post by kishorenvkb »

Yup, that is exactly the situation (>18 digits). We are on 7.5.1a. Do you happen to know the patch number for this?

Thanks
bcarlson
Premium Member
Posts: 772
Joined: Fri Oct 01, 2004 3:06 pm
Location: Minnesota

Post by bcarlson »

Unfortunately, we have found no patch to make 7.5.1a work with large decimals in Teradata - and we have searched hard. We are now in the process of upgrading to v8.0.1 and using the Teradata Connector, which uses Teradata's Parallel Transporter. And we are still facing issues.

Related threads from our ongoing 'tour of duty' to get large decimals to work:

DataStage PX and Teradata V2r6.2 large decimals
DS v8 and Teradata V2R6.2 large decimals

We had a workaround that we used for testing, but since we are converting our whole warehouse from DB2 to Teradata, we didn't want to have thousands of jobs with workarounds. We opted to upgrade and see if we could get it to work right.... I'll let you know if we do :?

The workaround is to read the data from Teradata as a char field (e.g. a decimal(20) becomes a char(22) - add a byte each for the sign and the decimal point), and write it back to Teradata as a char field as well. If you need the field as a decimal within DataStage, convert it there as needed; a sketch of the SQL side follows below. This works fine in 7.5.1a, and continues to work down the road in v8. Like I said, the reason we chose not to go this route was primarily the number of jobs that would have to do this.
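Here is a minimal sketch of the SQL side of that workaround. The table and column names (store_inventory, inventory_amt, and the staging/target tables) are hypothetical stand-ins; the statements would live in the user-defined SQL of the source and target stages:

    -- Read side: pull the large decimal out as a character string so
    -- DataStage never has to handle a DECIMAL(20) directly.
    -- CHAR(22) = 20 digits + 1 for the sign + 1 for the decimal point.
    SELECT CAST(inventory_amt AS CHAR(22)) AS inventory_amt_str
    FROM   store_inventory;

    -- Write side: land the char value in a staging table, then let
    -- Teradata cast it back to the full-width decimal.
    INSERT INTO store_inventory_tgt (inventory_amt)
    SELECT CAST(inventory_amt_str AS DECIMAL(20,0))
    FROM   stg_store_inventory;

Within the job itself, the parallel Transformer's StringToDecimal() and DecimalToString() functions can convert between the char and decimal representations where needed.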

Good luck, and if you DO find a way to do the large decimals (in ANY version), please post it!

Brad.
Last edited by bcarlson on Thu Oct 09, 2008 7:58 am, edited 1 time in total.
It is not that I am addicted to coffee, it's just that I need it to survive.
kishorenvkb
Participant
Posts: 54
Joined: Mon Dec 24, 2007 9:27 am

Post by kishorenvkb »

Thanks, Brad. Will try the workaround.