Data Analysis ???

Posted: Wed Oct 12, 2011 8:50 pm
by Developer9
Hi,

How do I perform data validation on fields that change weekly?

I have created delta files (sequential files) using the CDC stage, but they come out nearly as large as the full-volume files (5GB), which is enough to crash the server.

As a DataStage developer, what are my options?

Posted: Wed Oct 12, 2011 8:56 pm
by ray.wurlod
5GiB isn't a lot, unless your server lacks "large file support". Contact your system administrator.

Posted: Mon Dec 12, 2011 11:58 am
by Developer9
@ray

You are right: the server does lack "large file support", so that limitation applies here.

Environment: DataStage 7.5.2 Parallel Edition

Code:

compare two datasets (today's records vs. the previous day's records) >>> CDC >>> target delta (sequential file)
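The compare >>> CDC >>> delta step above can be sketched outside DataStage as well. This is only an illustrative Python sketch over CSV extracts, not DataStage code; the file names and the key column "id" are assumptions, and the change codes mimic what a Change Capture stage would emit.

```python
import csv

def load(path, key="id"):
    """Read a CSV extract into a {key: row} dict for comparison."""
    with open(path, newline="") as f:
        return {row[key]: row for row in csv.DictReader(f)}

def cdc(today_path, yesterday_path, delta_path, key="id"):
    """Write only changed rows (insert/update/delete) to the delta file.

    Assumes both extracts share the same columns and that today's
    extract is non-empty (the header is derived from it).
    """
    today = load(today_path, key)
    yesterday = load(yesterday_path, key)
    fields = ["change_code"] + list(next(iter(today.values())).keys())
    with open(delta_path, "w", newline="") as f:
        w = csv.DictWriter(f, fieldnames=fields)
        w.writeheader()
        for k, row in today.items():
            if k not in yesterday:
                w.writerow({"change_code": "insert", **row})
            elif row != yesterday[k]:
                w.writerow({"change_code": "update", **row})
        for k, row in yesterday.items():
            if k not in today:
                w.writerow({"change_code": "delete", **row})
```

Because unchanged rows are dropped, the delta only grows to full-volume size when most rows actually change between runs.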
We are looking to fix this issue by modifying the job so that it splits the target delta (weekly) file into two:

Category 1: a file containing the fields that don't change often (expected to be the smaller file)

Category 2: a file containing the fields that change often (weekly), which will be larger than the Category 1 file

How do we modify the job design in order to generate two separate files for these fields?

Is there any approach to minimize the file size?
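In a DataStage job this split would typically be two output links from a Transformer or Copy stage, each carrying a subset of columns. As a hedged illustration only (assuming the delta is a CSV and you can name the volatile columns up front; the column names `last_login` and `balance` are hypothetical), the vertical split looks like:

```python
import csv

def split_delta(delta_path, stable_path, volatile_path,
                key="id", volatile_cols=("last_login", "balance")):
    """Split one wide delta file into two narrower ones:
    key + stable columns, and key + volatile columns, so the
    frequently-changing file carries fewer bytes per row."""
    with open(delta_path, newline="") as f:
        reader = csv.DictReader(f)
        stable_cols = [c for c in reader.fieldnames
                       if c != key and c not in volatile_cols]
        with open(stable_path, "w", newline="") as sf, \
             open(volatile_path, "w", newline="") as vf:
            sw = csv.DictWriter(sf, fieldnames=[key] + stable_cols)
            vw = csv.DictWriter(vf, fieldnames=[key] + list(volatile_cols))
            sw.writeheader()
            vw.writeheader()
            for row in reader:
                # Each output keeps the key so the files can be re-joined.
                sw.writerow({c: row[c] for c in [key] + stable_cols})
                vw.writerow({c: row[c] for c in [key] + list(volatile_cols)})
```

Note that this alone only reduces the size of each individual file; the combined volume is unchanged, so it helps most when the stable-column file can then be regenerated less often than weekly.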


Thanks for the input