
changes in source

Posted: Wed Nov 26, 2003 9:40 am
by nag0143
We are doing a data migration from legacy systems to SAP.

We are using Ascential DataStage and QualityStage to migrate.

Whatever changes (inserts or deletes) take place in the source (legacy systems) will take effect on the DataStage server...
For address cleansing I am copying the file from the DataStage server once and doing the cleansing on that file... and I know this is not dynamic.

My question is: is there any way for whatever changes take place in the legacy systems (the original source) to be reflected in my QualityStage (Integrity 6.0.1) job? Can anyone suggest a way to do this?

Posted: Wed Nov 26, 2003 9:53 am
by kcbland
We need to know what kind of legacy system...

Is it a mainframe, a relational database, or something else? There are techniques and solutions for doing this, but you need to give more information.

Posted: Wed Nov 26, 2003 10:13 am
by nag0143
We are using mainframe systems. I connected to that server using DataStage and brought the data onto the DataStage server; after that I copied the file onto the Integrity server and am performing the cleansing. As I need only the address information for cleansing, I copied only those fields onto the Integrity server.

What will happen if there are any changes in the mainframe system?
I know these changes take effect on the DataStage server, but how can they affect the file on the Integrity server?

Posted: Wed Nov 26, 2003 10:29 am
by kcbland
Are you using DS390, or by connecting do you mean it is DB2 and you're using the DB2 plugin or ODBC? If you are able to do relational queries, do you have any columns of data that indicate when the row was inserted or last updated?

What is the volume of the source table? Is it extremely large? Could you possibly extract the entire table, sorted, to a file, compare that to the previous run using "diff", and pop out the delta rows? What about a transaction log?

I asked for specific information; please provide it.

Posted: Wed Nov 26, 2003 10:44 am
by nag0143
No, no... all we are using are flat files. We get the data from the mainframe by FTP onto the DataStage server, connect to those files using the CFF stage, and turn them into fixed-format files.

After that I copied the fixed-format file onto the Integrity server and am performing investigation, conditioning, standardization, and matching.

Posted: Wed Nov 26, 2003 10:51 am
by nag0143
Hey,
Whatever changes take place in a file on the DataStage server don't affect the same file on the Integrity server, because I am just copying the file from the DataStage server onto my desktop and then copying that file onto the Integrity server. Is this the only way to do it? If it is, I think I have to copy the data from the DS server daily and restructure my job in Integrity.

Posted: Wed Nov 26, 2003 4:51 pm
by vmcburney
If you have your QualityStage job as a plugin in a DataStage job then you do not need to move the file to your QualityStage server; DataStage will pass the data to the plugin row by row, Integrity will process the data and pass it back to DataStage. The fact that they are on two separate servers becomes irrelevant.

You can write an FTP script that takes files in the legacy directory and copies them to DataStage. You can create a DataStage sequence job that calls this FTP script, and you can create a routine in this job that sleeps and loops, continually looking for new files to FTP and process, or schedule it to FTP and process once a day.
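A minimal sketch of the "sleep and loop" watcher described above, assuming a Unix shell. The directory names and the commented-out dsjob invocation are assumptions for illustration, not part of the original poster's setup:

```shell
#!/bin/sh
# Watch a landing directory for files dropped by the mainframe FTP,
# hand each one off for processing, then move it aside so it is
# handled only once. Paths and the job call are hypothetical.

LANDING=./landing        # where the FTP script drops new files
PROCESSED=./processed    # files are moved here after handling

mkdir -p "$LANDING" "$PROCESSED"

process_new_files() {
    for f in "$LANDING"/*.dat; do
        [ -f "$f" ] || continue
        # Here you would kick off the DataStage job, e.g.:
        #   dsjob -run -param SourceFile="$f" MyProject MyJob
        mv "$f" "$PROCESSED"/
    done
}

# In production this would loop forever:
#   while true; do process_new_files; sleep 60; done
# For illustration, run a single pass:
process_new_files
```

Scheduling the same single-pass script from cron once a day gives you the "once a day" variant with no loop at all.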

Posted: Wed Nov 26, 2003 5:48 pm
by kcbland
I don't know what release of DS or what platform you are on, so I'll assume Unix.

Well, how is the file being generated from the mainframe? Is it a complete table dump or a delta file itself? If you're getting a whole table dump each day, then all you have to do is sort it and then use the diff command to compare today's sorted file with yesterday's sorted file. diff will pop out the new or different rows. Voila! A delta file.
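The sort-and-diff approach above can be sketched in a few lines of shell. The filenames and sample data here are made up for illustration; the technique is exactly as described: sort both dumps, then keep the lines that appear only in today's file.

```shell
#!/bin/sh
# Build a delta file by diffing today's full dump against yesterday's.
# File names and record contents are hypothetical examples.

YESTERDAY=dump_yesterday.txt
TODAY=dump_today.txt

# Sample "dumps" so the script is self-contained:
printf 'A|1\nB|2\nC|3\n' > "$YESTERDAY"
printf 'A|1\nB|9\nD|4\n' > "$TODAY"

sort "$YESTERDAY" > yesterday.sorted
sort "$TODAY"     > today.sorted

# Lines marked '>' by diff are present only in today's file,
# i.e. new or changed rows -- the delta.
diff yesterday.sorted today.sorted | grep '^>' | sed 's/^> //' > delta.txt

cat delta.txt
```

With the sample data this leaves the changed row `B|9` and the new row `D|4` in delta.txt; unchanged rows like `A|1` drop out. Deleted rows would show up as the `<` lines from diff instead.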