Records Rejected, Reason Unknown

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

anupam
Participant
Posts: 172
Joined: Fri Apr 04, 2003 10:51 pm
Location: India

Records Rejected, Reason Unknown

Post by anupam »

Hi All,

I have a server job which does the transformation and creates a DAT file for loading using the Orabulk stage. Some records are being rejected randomly in some jobs. The transformation job finishes with a warning:

"DataStage Job 126 Phantom 736
Program "ORABULK.RUN": Line 361, Nonnumeric data when numeric required. Zero used.
Program "ORABULK.RUN": Line 361, Nonnumeric data when numeric required. Zero used.
DataStage Phantom Finished"

I have checked that all the records in the DAT file created by the transformation job are loaded into the table, with none being rejected.

Please suggest how to resolve this problem.
----------------
Rgds,
Anupam
----------------
The future is not something we enter. The future is something we create.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

That message usually means that a non-numeric value has been found in the data being supplied, where a numeric value was expected. Because the ORABULK stage is written in DataStage BASIC, it replaces that non-numeric value with zero, and issues a warning.
This may not be what you require! :!:
Check the input data, perhaps in the preceding Transformer stage, to ensure that only numeric data are passed on numeric columns. You can use the Num() function as an output constraint, which means that you will trap the non-numerics on a rejects link.
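As a sketch of that suggestion, assuming a hypothetical input link named inlink with numeric columns QUANTITY and PRICE, the output constraint on the Transformer link feeding ORABULK might read:

```
* Constraint on the main output link: pass only rows whose
* numeric columns really contain numbers. Num() returns 1
* for a numeric value and 0 otherwise.
Num(inlink.QUANTITY) And Num(inlink.PRICE)
```

Rows failing the constraint can then be captured on a separate rejects link and inspected.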
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
anupam
Participant
Posts: 172
Joined: Fri Apr 04, 2003 10:51 pm
Location: India

Post by anupam »

Hi Ray,

That is OK, but even if DataStage substitutes null or zero where it expects numeric values, the number of records in the input file and the number of records in the output file should still match.

If I am wrong, please correct me.
----------------
Rgds,
Anupam
----------------
The future is not something we enter. The future is something we create.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

It wasn't clear from your original post whether all rows being delivered to the ORABULK stage were being written to the DAT file.
By how many are they different? Two? (My guess is based on two logged "non-numeric where numeric required" messages.)
Would it be difficult to determine which rows were not written to the DAT file, and to inspect these?
You could do it with the Debugger: set the breakpoint expression to something like Not(IsNull(inlink.columnname)), though you need some insight into the likely culprit column.
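As a sketch (link and column names are hypothetical), a breakpoint expression that fires on exactly the rows ORABULK would zero out could be:

```
* Break when the suspect column is null or non-numeric -
* i.e. a row the ORABULK stage would warn about.
IsNull(inlink.QUANTITY) Or Not(Num(inlink.QUANTITY))
```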
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
anupam
Participant
Posts: 172
Joined: Fri Apr 04, 2003 10:51 pm
Location: India

Post by anupam »

They differ by 129 rows. It's not possible to inspect the input data because we have millions of records being processed at any point in time. The difference is not constant either: in some jobs there is no difference at all, while in others it is somewhere around 1000 records.
----------------
Rgds,
Anupam
----------------
The future is not something we enter. The future is something we create.
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

Hi,
I'm just describing a situation I had; I don't know if yours is similar.
I had an input file with an opening row and a trailing row whose lengths differed from the rest of the data.
I used some logic in a Transformer to strip them out.
Everything was covered to avoid error messages, but I still got the same phantom message you did.
The only solution I stumbled across was to output the Transformer to another Transformer instead of writing directly to the file.
Surprised as I was, this weird phantom went away for good.

If your case is somewhat similar (I don't have your exact details), and you find no other cause for the phantom and no other solution, try this.

And let us know when you solve it.
IHTH
Roy R.
Time is money but when you don't have money time is all you can afford.

Search before posting:)

Join the DataStagers team effort at:
http://www.worldcommunitygrid.org
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

anupam wrote:They differ by 129 Rows, It's not possible to see the input data as we are having million of records to be processed at any point of time. this Difference also is not constant, In some Job, there is not any difference at all while in other Jobs it somewhere around 1000 records.
Do these differences occur with the same set of input data - that is, are you trying to reproduce exactly the same problem? Does the problem occur if the output is a Sequential File stage (which is just as good, if not better, for preparing data for sqlldr)?
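For reference, a minimal SQL*Loader control file for loading a comma-delimited DAT file produced by a Sequential File stage might look like this (file, table, and column names are hypothetical):

```
-- load.ctl: hypothetical control file for sqlldr
LOAD DATA
INFILE 'output.dat'
APPEND
INTO TABLE TARGET_TABLE
FIELDS TERMINATED BY ','
(ORDER_ID, QUANTITY, PRICE)
```

Invoked as `sqlldr userid=user/password control=load.ctl log=load.log`, sqlldr writes any rejected rows to a .bad file, which makes the rejects easy to inspect.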
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
anupam
Participant
Posts: 172
Joined: Fri Apr 04, 2003 10:51 pm
Location: India

Post by anupam »

Hello Ray,

No, the data is different, as this is a live job on the production server.
I will take the same set of records in development and do the analysis today. I will let you know about the progress.

If you want me to test something, let me know.
----------------
Rgds,
Anupam
----------------
The future is not something we enter. The future is something we create.