Inserting record - two behaviours

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

pxraja
Participant
Posts: 142
Joined: Fri Mar 14, 2008 10:22 am

Inserting record - two behaviours

Post by pxraja »

Hi all,

I am wondering how these warnings are thrown while inserting records. There are two behaviours. One: while running the job for a custom query with update and insert logic on the target table, some records are rejected with the warning "ORA-00932: inconsistent datatypes: expected NUMBER got DATE" and SQL execute failed.

Behaviour two: when I execute the job for a rejected record alone, it inserts the record into the target table without any warnings???

But the job has to insert and update the records as a whole.

The source query fetches all the records without any errors or warnings, yet 10% of the 5 million records are rejected. I tried inserting one rejected record on its own, and the job inserted it successfully without any warnings.


Please let me know of any workaround, or what the solution is to overcome this behaviour.

Thanks in advance.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

I would suspect it's an issue with your 'custom query', that and loading multiple records with the same keys in a run. Why custom rather than generated? Care to post it?
-craig

"You can never have too many knives" -- Logan Nine Fingers
pxraja
Participant
Posts: 142
Joined: Fri Mar 14, 2008 10:22 am

Post by pxraja »

chulett wrote:I would suspect it's an issue with your 'custom query', that and loading multiple records with the same keys in a run. Why custom rather than generated? Care to post it? ...
Code: Select all

SELECT TBL1.COL1, TBL1.COL2, TBL1.COL3, TBL2.COL4, TBL2.COL5,
       TBL2.COL6, TBL2.COL7, TBL2.COL8
FROM TBL1, TBL2
WHERE TBL1.COL1 = TBL2.COL1
  AND TBL1.COL2 = TBL2.COL2
  AND TBL1.COL2 = *******

The query is not that complex and there is no transformation in the Transformer stage. It is a production job that has been running successfully; nothing has been changed so far...

DataStage Job 1430 Phantom 28816
WARNING: Attempt to extend file "RT_LOG1430/DATA.30" beyond the 2 gigabyte file limit.
32-bit static hashed and dynamic files cannot exceed a file size
of 2GB. Use a Distributed or 64-bit file.
Attempting to Cleanup after ABORT raised in stage Jobname..Trf_2
DataStage Phantom Aborting with @ABORT.CODE = 3

The above error was thrown when I tried to rerun the job after compiling it. After that I copied the job, saved it under a different name and ran it for the rejected row alone; this time the job was successful and the rejected row was inserted into the target without any warnings.

But the question is why the following error occurs when running the full load:

: DSD.BCIPut call to SQLExecute failed.
SQL statement:INSERT INTO JOBname(....)SQLSTATE=S1000, DBMS.CODE=932
[DataStage][SQL Client][ODBC][DataDirect][ODBC Oracle Wire Protocol driver][Oracle]ORA-00932: inconsistent datatypes: expected NUMBER got DATE


Despite what the error says, there appears to be no datatype mismatch, because the insert succeeds when the record is loaded on its own.

How do I rectify this error?
Thanks in advance
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

I meant the "update and insert" DML, not the source query; I thought that was your "custom" one. And your phantom is strictly from blowing the logs over 2GB in size, just as it states, probably by running with unlimited warnings allowed.
-craig

"You can never have too many knives" -- Logan Nine Fingers
pxraja
Participant
Posts: 142
Joined: Fri Mar 14, 2008 10:22 am

Post by pxraja »

chulett wrote:I meant the "update and insert" DML, not the source query; I thought that was your "custom" one. And your phantom is strictly from blowing the logs over 2GB in size, just as it states, probably by runni ...
There is no custom query for update and insert on the target ODBC stage.

Yes, that is due to the logs exceeding the 2GB limit, but how do I clear those logs? While trying to clear them, the following error occurs:

Error selecting from log file RT_LOG1430
Command was: SSELECT RT_LOG1430 WITH @ID LIKE '1N0N' COUNT.SUP
Error was: Internal data error. File '/Projects/../RT_LOG1430/DATA.30': Computed blink of 0x830 does not match expected blink of 0x0! Detected within group starting at address 0x3298800!


There are two issues to be solved.

One is to insert the records without warnings like 'ORA-00932: inconsistent datatypes: expected NUMBER got DATE'.

The other is to clear the logs for the above job, which throws the error mentioned above.

Please give your suggestions on how to overcome these.

Thanks in advance
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Your log file is corrupted and the quickest way to get back up and running is to use the Director and clear all log file entries; this triggers a TCL command of "CLEAR.FILE" which will reset the log file and get rid of your blink errors.
If you are 100% certain that this log file should grow past ~2GB then you can, from TCL, do a "RESIZE RT_LOGnnn * * * 64BIT". Note that performance might suffer. Once again, I have severe reservations that the log file needs to be this big; it would be much more efficient to put in an auto-purge setting to limit the size automatically.
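For example, using the job number from the errors you posted (a sketch; substitute your own RT_LOG number):

Code: Select all

RESIZE RT_LOG1430 * * * 64BIT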
pxraja
Participant
Posts: 142
Joined: Fri Mar 14, 2008 10:22 am

Post by pxraja »

ArndW wrote:Your log file is corrupted and the quickest way to get back up and running is to use the Director and clear all log file entries; this triggers a TCL command of "CLEAR.FILE" which will reset the log f ...
I had used CLEAR.FILE &PH& but in vain. It's OK; I copied the job into a new one, deleted the problematic job and renamed the copy back to the original name.

Now I need to solve one thing: how to avoid those warnings so that there are no rejected rows.

Please help me solve this if you have any ideas.

Thanks in advance
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Doing a "CLEAR.FILE &PH&" has nothing to do with your issue. The "&PH&" hashed file (actually stored as a project subdirectory) contains only Tee'd output from background processes. I would caution you against using the CLEAR.FILE command directly. You should stick with the director front end method unless you know what you are doing.
pxraja
Participant
Posts: 142
Joined: Fri Mar 14, 2008 10:22 am

Post by pxraja »

ArndW wrote:Doing a "CLEAR.FILE &PH&" has nothing to do with your issue. The "&PH&" hashed file (actually stored as a project subdirectory) contains only Tee'd output from background processes. I ...
Thanks for the information, ArndW.

My concern is to avoid the warnings and to populate the records without any rejected rows.

Please advise on the error mentioned above.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Re: Inserting record - two behaviours

Post by chulett »

pxraja wrote:while running the job for a custom query with update and insert logic on the target table
This confusing statement was why I thought you were using custom SQL on the target. :?

Best way to 'avoid' these kind of issues is by pre-qualifying your actions using a reference hashed lookup and two dedicated target links - one for insert and one for update. Those 'combo' actions are a crutch IMHO and allow 'shortcuts' to be taken in job designs.
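As a rough sketch of that design (the link name LkpTarget below is hypothetical): build a hashed file keyed on the target's key columns, reference it from the Transformer, and constrain the two output links on the lookup result:

Code: Select all

* Insert link constraint - key not found in the reference hashed file
LkpTarget.NOTFOUND

* Update link constraint - key already exists on the target
Not(LkpTarget.NOTFOUND)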

BTW - you need to figure out how you managed to get a DATE into a NUMBER field; we can't. As noted earlier, check to see how many times the key combination of the problematic record(s) occurs in your input data set.
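For example, assuming COL1/COL2 from your posted source query are the target keys, something like this run against the source would show duplicated key combinations:

Code: Select all

SELECT TBL1.COL1, TBL1.COL2, COUNT(*)
FROM TBL1, TBL2
WHERE TBL1.COL1 = TBL2.COL1
  AND TBL1.COL2 = TBL2.COL2
GROUP BY TBL1.COL1, TBL1.COL2
HAVING COUNT(*) > 1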
-craig

"You can never have too many knives" -- Logan Nine Fingers
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

ps. You should have done a CLEAR.FILE on your log file name as noted in the error, not the completely unrelated "phantom" directory.
-craig

"You can never have too many knives" -- Logan Nine Fingers
pxraja
Participant
Posts: 142
Joined: Fri Mar 14, 2008 10:22 am

Post by pxraja »

chulett wrote:ps. You should have done a CLEAR.FILE on your log file name as noted in the error, not the completely unrelated "phantom" directory. ...
How do I find the log file name for a particular job?

Please give the full syntax for using CLEAR.FILE.
pxraja
Participant
Posts: 142
Joined: Fri Mar 14, 2008 10:22 am

Re: Inserting record - two behaviours

Post by pxraja »

chulett wrote:BTW - you need to figure out how you managed to get a DATE into a NUMBER field; we can't. As noted earlier, check to see how many times the key combination of the problematic record(s) occurs in your input data set.

I have used a Hashed File stage to capture the rejected records and viewed the data, but the records appear to be valid, without any data mismatch. I am confused as to why the records are rejected.

How do I avoid the errors and pass all the valid records?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You already know the "full syntax"; you've used it correctly, just on the wrong "file". Your log file's name was explicitly mentioned in the error message; however, to look it up simply query DS_JOBS:

Code: Select all

SELECT JOBNO FROM DS_JOBS WHERE NAME = '<YourJobName>'
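For example, if that returned 1430 (the job number in the errors you posted), the log file would be RT_LOG1430 and you could clear it from TCL with:

Code: Select all

CLEAR.FILE RT_LOG1430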
-craig

"You can never have too many knives" -- Logan Nine Fingers
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Use a Sequential File stage to capture rejects, not a Hashed File; you could be losing records due to the 'destructive overwrite' nature of hashed files, depending on the keys you've defined.
-craig

"You can never have too many knives" -- Logan Nine Fingers