Issue regarding the insert

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

rsunny
Participant
Posts: 223
Joined: Sat Jul 03, 2010 10:22 pm

Issue regarding the insert

Post by rsunny »

Hi everyone,

Can anyone tell me how to handle inserts for multiple tables in one run? Say we have 10 tables: I need to run all of them, one after the other, and if an error is found in any table the run should not stop; it should keep going until all 10 tables have been processed, and at the end I should get back only the tables that failed. Here "failed" means the records in that table do not match: each table exists in source1 and source2, and I need to compare the two for every table (table1 of source1 against table1 of source2) and flag any mismatch. I need the result only after all 10 tables have been executed. Can anyone tell me how to solve this?

thanks in advance
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

A sequence containing ten job activities with explicit error handling from each, collecting through a sequencer into final error handling.
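For what it's worth, the same idea (run every comparison, never stop early, collect the failures at the end) can also be sketched in DS BASIC job control. This is purely an illustration under assumed names: the compare job "CompareTables" and its "TableName" parameter are placeholders, not anything from this thread.

* Illustrative job-control loop: run one compare job per table,
* never stop on a failure, report the failing tables at the end.
TableList = "TABLE1,TABLE2,TABLE3"    ;* ten entries in the real case
FailedTables = ""

For I = 1 To Dcount(TableList, ",")
   ThisTable = Field(TableList, ",", I)
   * Attach a separate invocation per table (multi-instance job assumed);
   * an aborted invocation would need a reset before the next run.
   hJob = DSAttachJob("CompareTables." : ThisTable, DSJ.ERRNONE)
   ErrCode = DSSetParam(hJob, "TableName", ThisTable)
   ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
   ErrCode = DSWaitForJob(hJob)
   Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
   If Status <> DSJS.RUNOK And Status <> DSJS.RUNWARN Then
      FailedTables = FailedTables : ThisTable : " "
   End
   ErrCode = DSDetachJob(hJob)
Next I

If FailedTables <> "" Then
   Call DSLogWarn("Comparison failed for: " : FailedTables, "CompareLoop")
End

The graphical Sequence with ten Job Activities achieves the same thing and is easier to maintain; the loop only shows that "keep going and report at the end" is nothing exotic.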
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
rsunny
Participant
Posts: 223
Joined: Sat Jul 03, 2010 10:22 pm

Post by rsunny »

ray.wurlod wrote:A sequence containing ten job activities with explicit error handling from each, collecting through a sequencer into final error handling. ...

In this case we know that it is 10 tables, but if there are more tables in another scenario we can't create 100 jobs for 100 tables, so I want a solution that is applicable to any number of tables.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You certainly can create 100 jobs for 100 tables, what would be stopping you? :?
-craig

"You can never have too many knives" -- Logan Nine Fingers
rsunny
Participant
Posts: 223
Joined: Sat Jul 03, 2010 10:22 pm

Post by rsunny »

I need a generalized solution. Say today they are executing 10 tables; tomorrow they might execute 20. I can't create a new sequence every time the number of tables changes, so I need a generalized one: if I want to execute 20 tables I should be able to do so without creating another 20 jobs. Is this possible? I was wondering whether we can create this kind of job. If so, I need the process.


Thanks in advance
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Explain to us what controls the 'number of tables' you'll be targeting or processing? Perhaps that might help people towards a solution. Also, do these tables have identical metadata?
-craig

"You can never have too many knives" -- Logan Nine Fingers
rsunny
Participant
Posts: 223
Joined: Sat Jul 03, 2010 10:22 pm

Post by rsunny »

chulett wrote:Explain to us what controls the 'number of tables' you'll be targeting or processing? Perhaps that might help people towards a solution. Also, do these tables have identical metadata?
I have created a job with two sources, s1 and s2, and I need to check whether each record in s1 matches s2. Now if I extract 10 tables for s1 and s2, I need the output after executing all 10 tables, and the run should not stop whenever a mismatch occurs; I need one output containing all the mismatches from all the tables. Yes, all the tables have identical metadata. Now my boss is asking: what if some day we want to execute 20 tables from the database? We should not have to create another 20 jobs; it has to be handled by the same job. So I was wondering, is it possible to build a job like that? If it is run today for 10 tables and tomorrow for 20, it has to work with the same job rather than creating an extra 10 jobs to cover 20 tables. Can we get a solution for this? I am not sure how to approach it. As I am new to DataStage, I was wondering whether such a job can be created at all, or whether the information I have given is not valid. Please let me know if it is not valid or if I have not explained it clearly.

thanks in advance
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Yikes, that's practically one long run-on sentence. Let me see if a little editing might help...

Somewhat. There's still a lot of "so"ing going on. Not sure what to propose at this time...
-craig

"You can never have too many knives" -- Logan Nine Fingers
rsunny
Participant
Posts: 223
Joined: Sat Jul 03, 2010 10:22 pm

Post by rsunny »

I have one more thing to clarify. In production, new data keeps arriving. Whatever data comes from the source has to be loaded into the target, and then I need to check, after some transformations, whether all the records in the source have actually been loaded into the target.

My question: once the new data has been loaded into the target and I am comparing source against target, is it possible that more data arrives in the source while the comparison is still running? If that new data gets clubbed with the old source data while I am still comparing against the target, there will obviously be a mismatch and I will never get a match. Is there a solution for this? For example, data arrives at 10 a.m., I load it into the target and start checking whether every target record matches the source; if new data arrives at 11 a.m. while I am comparing, it might get mixed in with the old data. I think we could load the new data into a temp file, load that into the target and then compare, but can we do the comparison without landing it in a temp file first? Please correct me if I have written anything wrong, and please suggest whether this can be done or not, as my lead has asked me to design the job that way.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

For that, you need to "land" or "stage" your source data, meaning a flat file or staging table, then load from there. That way it becomes static and you can take as long as you like to check whatever needs checking without worrying about what may (or may not) be going on in the true source.
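A rough job-control sketch of that idea in DS BASIC, purely for illustration (the job names "ExtractToStage" and "CompareStageToTarget" and the "StageFile" parameter are assumed, not real jobs from this thread): land the 10 a.m. data first, then compare against the landed file, so anything arriving at 11 a.m. can no longer mix in.

* Step 1: land the source into a static flat file.
StageFile = "/data/stage/source_snapshot.txt"   ;* a real setup might add a timestamp
hExtract = DSAttachJob("ExtractToStage", DSJ.ERRNONE)
ErrCode  = DSSetParam(hExtract, "StageFile", StageFile)
ErrCode  = DSRunJob(hExtract, DSJ.RUNNORMAL)
ErrCode  = DSWaitForJob(hExtract)

* Step 2: compare the landed file against the target. Rows arriving in
* the true source after step 1 finished cannot affect this comparison.
If DSGetJobInfo(hExtract, DSJ.JOBSTATUS) = DSJS.RUNOK Then
   hCompare = DSAttachJob("CompareStageToTarget", DSJ.ERRNONE)
   ErrCode  = DSSetParam(hCompare, "StageFile", StageFile)
   ErrCode  = DSRunJob(hCompare, DSJ.RUNNORMAL)
   ErrCode  = DSWaitForJob(hCompare)
End

Two Job Activities wired one after the other in a Sequence accomplish the same thing without any code.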
-craig

"You can never have too many knives" -- Logan Nine Fingers
rsunny
Participant
Posts: 223
Joined: Sat Jul 03, 2010 10:22 pm

Post by rsunny »

Thanks for the reply craig.