How to prevent truncating before data extraction?
Moderators: chulett, rschirm, roy
Hi,
I have an issue. The SQL query from the DB2 Connector source stage runs for a long time, and the target table is truncated before data extraction begins. During this time the target table is empty. How can I prevent the target table from being truncated before data is extracted from the source table, using only one parallel job?
The job:
Source_DB2_Connector----->Target_DB2_Connector
Thanks!
So... you are looking for a way to truncate the target table only once the data is ready to be loaded, so it doesn't spend so long empty? There really isn't a way to do that 'using only one parallel job', as you'd need to extract your source data elsewhere, say to a dataset, and then once complete, kick off the truncate and load. But that means two jobs.
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Once the query has executed - how long is it taking for data transfer?
If all you are after is not having the table empty for so long, is it silly to ask if your query is optimised?
Something else I have wanted to try: if you have an output reject link from a DB stage which feeds as a reference link into a Lookup with Entire partitioning, will a parallel job 'wait' for the data before outputting anything? If that were to work, you could write the data to a temp table, then maybe write a routine that calls the table truncation from the OS only once the data has got to that point. Not convinced it would work, but you could try.
EDIT:
The more I think about it the question arises: why can you not have 2 jobs?
Can you have a temporary table in your target DB? If so, run your job into the temporary table, and then once the job is finished, execute a stored procedure, as either after-SQL or an after-job routine, to copy the data from the temp table to your target table. That would get it done in one job.
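The staging-table idea above can be sketched as follows: the job writes into a staging table while the real target stays populated, and an after-SQL style step copies the data across in one short transaction. A minimal sketch using sqlite3 in place of DB2, with hypothetical table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, val TEXT)")
conn.execute("INSERT INTO target VALUES (0, 'old')")  # existing data stays visible

# The long-running job loads into a staging table; 'target' is untouched.
conn.execute("CREATE TABLE target_stage (id INTEGER, val TEXT)")
conn.executemany("INSERT INTO target_stage VALUES (?, ?)", [(1, "a"), (2, "b")])

# After-SQL / after-job step: swap the data across atomically, so the target
# is empty only inside this short transaction.
with conn:
    conn.execute("DELETE FROM target")
    conn.execute("INSERT INTO target SELECT * FROM target_stage")
    conn.execute("DROP TABLE target_stage")
```

In DB2 itself the copy step would more naturally live in a stored procedure invoked from the connector's after-SQL, but the ordering is the same.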
ShaneMuir wrote: Once the query has executed - how long is it taking for data transfer?
If all you are after is not having the table empty for so long, is it silly to ask if your query is optimised?

The original SQL executed in a minimum of 20 minutes; after I optimised it, it now executes in a minimum of 10 minutes. But that is the minimum, and it varies according to workload.

ShaneMuir wrote: The more I think about it the question arises: why can you not have 2 jobs?
Can you have a temporary table in your target DB? If so, run your job into the temporary table, and then once the job is finished, execute a stored procedure, as either after-SQL or an after-job routine, to copy the data from the temp table to your target table.

Yes, of course I know it is easy to solve the issue by using 2 jobs or a temporary staging table, but I can change only one parallel job. I've implemented the job using a Change Capture stage and it seems that the job works fine.
Thanks to everyone for the tips.
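For readers unfamiliar with the Change Capture approach mentioned above: instead of truncating, you compare the source rows against the current target and apply only the differences, so the target never goes empty. A minimal sketch of that diff logic (illustrative data, keyed on the first column; not the actual DataStage stage):

```python
def change_capture(before, after):
    """Return (inserts, updates, deletes) by comparing row sets on their key
    (the first column). Mirrors the idea of the Change Capture stage."""
    b = {row[0]: row for row in before}
    a = {row[0]: row for row in after}
    inserts = [a[k] for k in a.keys() - b.keys()]          # new keys
    deletes = [b[k] for k in b.keys() - a.keys()]          # vanished keys
    updates = [a[k] for k in a.keys() & b.keys() if a[k] != b[k]]  # changed rows
    return inserts, updates, deletes

target = [(1, "a"), (2, "b"), (3, "c")]   # current target contents
source = [(2, "b"), (3, "C"), (4, "d")]   # freshly extracted source rows
ins, upd, dele = change_capture(target, source)
# ins == [(4, 'd')], upd == [(3, 'C')], dele == [(1, 'a')]
```

Applying only `ins`/`upd`/`dele` as INSERT/UPDATE/DELETE statements keeps the unchanged rows in place throughout the load, which is why the table is never observed empty.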
Palermo wrote: I can change only one parallel job

That seems like a very... odd... restriction. And the mechanism you were 'forced' to use (Change Capture with INSERT/UPDATE/DELETE) is pretty much the polar opposite of a simple truncate and load. But if that is working for you without 'impacts on performance' then yay.
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact: