Data Analysis ???

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.

Moderators: chulett, rschirm, roy

Developer9
Premium Member
Posts: 187
Joined: Thu Apr 14, 2011 5:10 pm

Data Analysis ???

Post by Developer9 »

Hi,

How do I perform data validation on fields that change weekly?

I have created delta files (Sequential Files) using the Change Capture (CDC) stage, but they come out nearly full-volume size (about 5 GB), which is large enough to crash the server.

As a DataStage developer, what is within my scope to do about this?
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

5GiB isn't a lot, unless your server lacks "large file support". Contact your system administrator.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Developer9
Premium Member
Posts: 187
Joined: Thu Apr 14, 2011 5:10 pm

Post by Developer9 »

@ray

The server does have that limitation: it lacks large file support.

Environment: DataStage 7.5.2, Parallel Edition

Code:

Compare two data sets (today's records and the previous day's records) >>> Change Capture (CDC) >>> target delta (Sequential File)
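To show what that comparison amounts to outside DataStage, here is a minimal Python sketch (the file names and the "customer_id" key below are made-up placeholders, not the real job's columns):

Code:

import csv

# Sketch of the comparison the Change Capture step performs, outside DataStage.
# File names and the "customer_id" key are placeholders for illustration only.

def load_by_key(path, key="customer_id"):
    """Read a CSV extract into a dict keyed by the business key."""
    with open(path, newline="") as f:
        return {row[key]: row for row in csv.DictReader(f)}

today = load_by_key("today.csv")
yesterday = load_by_key("yesterday.csv")

with open("delta.csv", "w", newline="") as out:
    fieldnames = list(next(iter(today.values())).keys())
    writer = csv.DictWriter(out, fieldnames=fieldnames)
    writer.writeheader()
    for key, row in today.items():
        # keep a row only if it is new or any field differs from yesterday's
        if key not in yesterday or row != yesterday[key]:
            writer.writerow(row)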
We are looking to fix this by modifying the job so that it splits the target (weekly) delta file into two:

Category 1: a file containing the fields that don't change often (expected to be the smaller one)

Category 2: a file containing the fields that change often (weekly), which is larger than the Category 1 file

How can I modify the job design/code to generate two separate files for these fields?

Is there any approach to minimizing the file size?
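
Here is one way the split could work, sketched in Python rather than in the job itself (the key and the two column groups are made-up placeholders; the real groups would come from the data analysis). Each output file keeps the business key plus only the rows whose own group of fields actually changed, which is why the Category 1 file stays small:

Code:

import csv

# Sketch only: produce two weekly delta files, one per column group, so each
# file carries only the rows whose own group of fields actually changed.
# The key and column names are made-up placeholders.

KEY = "customer_id"
STABLE_COLS = ["name", "date_of_birth"]      # rarely change -> smaller file
VOLATILE_COLS = ["balance", "last_login"]    # change weekly -> larger file

def load_by_key(path):
    with open(path, newline="") as f:
        return {row[KEY]: row for row in csv.DictReader(f)}

today = load_by_key("today.csv")
yesterday = load_by_key("yesterday.csv")

def write_group_delta(path, cols):
    with open(path, "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=[KEY] + cols)
        writer.writeheader()
        for key, row in today.items():
            old = yesterday.get(key)
            # keep the row only if it is new or this group of fields changed
            if old is None or any(row[c] != old[c] for c in cols):
                writer.writerow({c: row[c] for c in [KEY] + cols})

write_group_delta("delta_stable.csv", STABLE_COLS)
write_group_delta("delta_volatile.csv", VOLATILE_COLS)

In the job itself, the same idea could presumably be done with two Change Capture comparisons, one per column group, each writing its own Sequential File.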


Thanks for the input
Last edited by Developer9 on Thu Dec 15, 2011 2:20 pm, edited 1 time in total.