Design:
- The Oracle Enterprise stage retrieves this decimal(38) number: 18446744073709551616. It is deliberately greater than the largest unsigned BigInt.
- This number is sent to a Modify stage, where an explicit conversion to BigInt is performed (uint64_from_decimal).
- Finally, the result is sent to a Sequential File stage.
I do get a warning about a possible range limitation, but the job doesn't abort.
It creates the file with all the other fields looking good, and the bad one empty. Actually, it is not empty: DataStage thinks it's null. How do I know that? If I set nullable=No for that column, I get the fatal error "Null in field CHARGE_SEQUENCE: Result is non-nullable and there is no handle_null to specify a default value". But the value is not null; it is out of range.
I want the job to abort when a number is out of range; otherwise I get wrong data.
So, is there some setting to make it abort?
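For context, here is a quick sketch in Python (not DataStage syntax) of why this particular value is a good boundary test: it is exactly 2**64, one past the largest unsigned 64-bit integer, so uint64_from_decimal has no valid result to produce.

```python
# Sketch (plain Python, not DataStage): why this value cannot fit in an
# unsigned BigInt (uint64).
UINT64_MAX = 2**64 - 1          # largest unsigned 64-bit integer: 18446744073709551615

value = 18446744073709551616    # the decimal(38) value retrieved from Oracle

print(value == 2**64)           # True: exactly one past the uint64 range
print(value > UINT64_MAX)       # True: uint64_from_decimal must overflow
```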
Out of range number not giving an error
If it is giving a warning for this problem, you can set the job to abort after a number of warnings; that will make it abort. Also try providing a null field value and see what output you get.
a) That warning appears regardless of what the value is. So if I were to abort the job after a warning, it would always abort, not just when an out-of-range value is encountered.
b) I do have a null field value. It is written in both cases:
1) the value really was null
2) the value was out of range
So one cannot be sure that a null in the output file really comes from a null.
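One workaround I can think of (a sketch only, written in Python rather than DataStage syntax; in the actual job this logic would live in a Transformer constraint or reject condition upstream of the Modify stage) is to test the decimal against the uint64 range explicitly before converting, so out-of-range rows can be routed to a reject link or used to force an abort, while genuine nulls pass through untouched:

```python
# Sketch (not DataStage syntax): distinguish a real null from an out-of-range
# value *before* the uint64 conversion, so the two cases no longer look
# identical in the output file.
UINT64_MAX = 2**64 - 1

def classify(value):
    """Return how the downstream conversion should treat this decimal value."""
    if value is None:
        return "null"            # genuine null: handle_null / default applies
    if 0 <= value <= UINT64_MAX:
        return "ok"              # safe to convert with uint64_from_decimal
    return "out_of_range"        # route to a reject link, or abort the job

print(classify(None))                    # null
print(classify(42))                      # ok
print(classify(18446744073709551616))    # out_of_range
```

With a split like this, only rows classified as out of range would trip the abort condition, so the "abort after warnings" setting would no longer fire on every run.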