Hi,
How do I perform data validation on fields that change weekly?
I have created delta files (sequential files) using the CDC stage, but they come out as large as the full source volume (5 GB), which is enough to crash the server.
As a DataStage developer, what is the scope here? Data analysis?
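Not from the thread itself, but to illustrate the kind of week-over-week field validation being asked about, here is a minimal Python sketch. The key column `customer_id` and file layout are hypothetical assumptions; the real job would use the CDC stage instead.

```python
import csv

# Hypothetical key column identifying a record across weekly snapshots.
KEY = "customer_id"

def load_snapshot(path):
    """Read a delimited snapshot file into a dict keyed by KEY."""
    with open(path, newline="") as f:
        return {row[KEY]: row for row in csv.DictReader(f)}

def validate_changes(prev_path, curr_path):
    """Return (changed, inserted, deleted) keys between two weekly snapshots."""
    prev = load_snapshot(prev_path)
    curr = load_snapshot(curr_path)
    # A record is "changed" if the key exists in both snapshots but any field differs.
    changed = [k for k in curr.keys() & prev.keys() if curr[k] != prev[k]]
    inserted = list(curr.keys() - prev.keys())
    deleted = list(prev.keys() - curr.keys())
    return changed, inserted, deleted
```

This mirrors what the CDC stage does internally: match records on a key, then compare the remaining columns to classify each row as changed, inserted, or deleted.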
@ray
The limitation is that the server lacks large-file support.
Environment: DataStage 7.5.2 Parallel Edition
Code: Select all
compare two datasets (today's records and the previous day's records) >>> CDC >>> target delta (sequential file)
We are looking to fix this issue by modifying the job so that it splits the target weekly delta file into two:
Category 1: a file containing the fields that don't change often (expected to be the smaller file).
Category 2: a file containing the fields that change often (weekly); this file is larger than the Category 1 file.
How do I modify the job design/code to generate two separate files for these fields?
Is there any approach to minimize the file size?
Thanks for the input
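In a parallel job this split would typically be a Copy or Transformer stage with two output links, each carrying the key plus a different column subset. As a rough Python sketch of the idea (all column names here are hypothetical assumptions, not from the thread):

```python
import csv

# Hypothetical column grouping. The key columns go into BOTH files so the
# two category files can be joined back together later if needed.
KEY_COLS = ["customer_id"]
STABLE_COLS = ["name", "date_of_birth"]    # Category 1: rarely change
VOLATILE_COLS = ["balance", "last_login"]  # Category 2: change weekly

def split_delta(delta_path, stable_path, volatile_path):
    """Split one wide delta file into two narrower files by column volatility."""
    with open(delta_path, newline="") as src, \
         open(stable_path, "w", newline="") as st, \
         open(volatile_path, "w", newline="") as vo:
        reader = csv.DictReader(src)
        w1 = csv.DictWriter(st, fieldnames=KEY_COLS + STABLE_COLS)
        w2 = csv.DictWriter(vo, fieldnames=KEY_COLS + VOLATILE_COLS)
        w1.writeheader()
        w2.writeheader()
        for row in reader:
            # Each output row keeps only the key plus its column subset.
            w1.writerow({c: row[c] for c in KEY_COLS + STABLE_COLS})
            w2.writerow({c: row[c] for c in KEY_COLS + VOLATILE_COLS})
```

Note that this alone only redistributes the columns; the Category 1 file shrinks further only if the job also drops rows whose stable columns did not actually change, which is where the CDC change codes come in.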
Last edited by Developer9 on Thu Dec 15, 2011 2:20 pm, edited 1 time in total.