Hi All,
I have stumbled across some very strange behaviour using an ODBC stage in a Server job that I hope someone might be able to help with.
Basically, the job I have takes an input sequential file and loads it to two different ODBC stages. One table contains all the fields in the input; the second contains only the first field.
Now, the load of the full file is fine, but attempting to load just the first field (Code) in the other table throws up loads of warnings.
If I try update-then-insert using the DataStage-generated SQL, it looks like this:
UPDATE TQA_Ratios_Change SET WHERE (Code = ?);
INSERT INTO TQA_Ratios_Change(Code) VALUES (?);
This fails with the following:
PRICE_Stg10_ODBC_Load..Transf.LnkDataChanges: DSD.BCIPut call to SQLExecute failed.
SQL statement:INSERT INTO TQA_Ratios_Change(Code) VALUES (?)
SQLSTATE=23000, DBMS.CODE=2627
[DataStage][SQL Client][ODBC][DataDirect][ODBC SQL Server Driver][SQL Server]Violation of PRIMARY KEY constraint 'PK__TQA_Ratios_Chang__1A54DAB7'. Cannot insert duplicate key in object 'TQA_Ratios_Change'.
SQLSTATE=, DBMS.CODE=3621
[DataStage][SQL Client][ODBC][DataDirect][ODBC SQL Server Driver][SQL Server]The statement has been terminated.
I thought the problem was bad SQL - i.e. nothing between 'SET' and 'WHERE' - so I added the following as User-Defined SQL:
UPDATE TQA_Ratios_Change SET Code = ? WHERE (Code = ?);
INSERT INTO TQA_Ratios_Change(Code) VALUES (?);
And reran...and got:
PRICE_Stg10_ODBC_Load..Transf.LnkDataChanges: DSD.BCIPut call to SQLExecute failed.
SQL statement:UPDATE TQA_Ratios_Change SET Code = ? WHERE (Code = ?);
INSERT INTO TQA_Ratios_Change(Code) VALUES (?)
SQLSTATE=S1009, DBMS.CODE=0
[DataStage][SQL Client][ODBC][DataDirect][ODBC lib] Invalid argument value
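That S1009 'Invalid argument value' looks consistent with a parameter-count mismatch: the link only carries one column (Code), but the user-defined UPDATE declares two markers, and ODBC-style drivers bind supplied columns to markers positionally. A small sketch of the same failure mode (sqlite3 standing in for the DataDirect driver - an assumption for illustration, not the actual DataStage binding code):

```python
import sqlite3

# One value supplied for a statement with two parameter markers:
# the driver refuses to execute, much like the S1009 above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TQA_Ratios_Change (Code TEXT PRIMARY KEY)")

err = ""
try:
    conn.execute(
        "UPDATE TQA_Ratios_Change SET Code = ? WHERE Code = ?",
        ("ABC",),  # only one binding for two markers
    )
except sqlite3.ProgrammingError as e:
    err = str(e)

print(err)
```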
So, to sum up: I want to store the first field (Code) in the table TQA_Ratios_Change. If the code already exists, I want to ignore it (though an update would be fine); if it doesn't exist, insert it.
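One way to express "insert only if absent" directly in user-defined SQL is the standard WHERE NOT EXISTS pattern, which skips the insert when the key is already there instead of raising a PK violation. A minimal sketch, using in-memory SQLite as a stand-in for SQL Server (you'd want to confirm the DataDirect driver accepts it as user-defined SQL):

```python
import sqlite3

# In-memory stand-in for the SQL Server table TQA_Ratios_Change.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TQA_Ratios_Change (Code TEXT PRIMARY KEY)")

def store_code(code):
    # INSERT ... SELECT ... WHERE NOT EXISTS only inserts when the key
    # is not already present, so duplicates are silently skipped.
    conn.execute(
        "INSERT INTO TQA_Ratios_Change(Code) "
        "SELECT ? WHERE NOT EXISTS "
        "(SELECT 1 FROM TQA_Ratios_Change WHERE Code = ?)",
        (code, code),
    )

store_code("ABC")
store_code("ABC")  # duplicate: ignored, no PK violation
rows = conn.execute("SELECT Code FROM TQA_Ratios_Change").fetchall()
print(rows)  # [('ABC',)]
```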
I had a good search through, but couldn't find anything similar that helped. Any ideas gratefully appreciated as I am pulling my hair out here! :evil:
Many thanks in advance.
ODBC Stage Warnings
Thanks Brian.
I don't really want to update the key - I really just want to say:
Is the code already in the table? If yes, ignore it and continue; if no, store the new code in the table.
Basic overview: the larger table is updated every day so that it always contains data for all codes. The single-field table is cleared every day and only stores the codes for which data has come in today. It is populated by three jobs - the first clears it down and populates it, and the next two append codes that weren't added by the earlier jobs.
Thanks
As well as writing to the table, write the code values to a hashed file with Code as the key. In the transformer before your ODBC stage, perform a lookup against the hashed file to check for the code value: if it exists, don't write to the table; if it doesn't, write to the table.
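The hashed-file lookup described above amounts to keeping a key-per-code reference that each row is checked against before it reaches the ODBC stage. A rough sketch of that logic, with a plain Python dict standing in for the hashed file and a list for the target table (names are illustrative only):

```python
# Lookup-before-write: the hashed file keyed on Code acts as the
# reference; rows whose code is already present never reach the table.
hashed_file = {}   # stands in for the hashed file keyed on Code
table_rows = []    # stands in for TQA_Ratios_Change

def process_row(code):
    if code in hashed_file:   # lookup found the key: skip the write
        return
    hashed_file[code] = True  # record the code in the hashed file
    table_rows.append(code)   # write the row to the table

for code in ["A", "B", "A", "C", "B"]:
    process_row(code)

print(table_rows)  # ['A', 'B', 'C']
```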
With multiple jobs writing to the same hashed file and reading from it at the same time, be careful with the read/write cache. Either disable it entirely, or, if you enable 'Allow stage write cache', set the read side to 'Pre-load file to memory - Enabled, lock for updates' or 'Disabled, lock for updates'.
Regards,
Nick.
Work Around
Hi All,
I have managed to work around this by building the job in EE instead. Same SQL, same job set-up, but in EE it works! :shock: