Datastage Phantom - JOB.1132352373.DT.1471260026.TRANS1

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

suresh.narasimha
Premium Member
Posts: 81
Joined: Mon Nov 21, 2005 4:17 am
Location: Sydney, Australia

Datastage Phantom - JOB.1132352373.DT.1471260026.TRANS1

Post by suresh.narasimha »

Hi,

I have a folder stage which reads a set of csv files.

The folder stage is connected to a row splitter.

The row splitter splits the data into two rows: one is the header and the other is the data.

The second column's length is set to the maximum of 9999999, beyond which DataStage cannot accept a longer value.

When processing the data I got a phantom:

DataStage Job 222 Phantom 6467
Program "JOB.1132352373.DT.1471260026.TRANS1": Line 68,
Available memory exceeded. Unable to continue processing record.
DataStage Phantom Finished

Please suggest some ideas on where I went wrong and how I can debug the problem.

Thanks,
Suresh N
SURESH NARASIMHA
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

I believe that "Available memory exceeded." is a very clear message. Think supply and demand.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
suresh.narasimha
Premium Member
Posts: 81
Joined: Mon Nov 21, 2005 4:17 am
Location: Sydney, Australia

Post by suresh.narasimha »

Thanks Ray.

The issue was with the memory.
SURESH NARASIMHA
chulett
Charter Member
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Rather than the Folder, use a Sequential file stage - either with the Filter option or a before-job concatenation - to read all of the files.
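The before-job concatenation can be sketched in shell. This is a minimal example only; the directory and file names are hypothetical stand-ins (simulated here with a temp directory), and it assumes all the CSV files share the same layout:

```shell
# Sketch of the "before-job concatenation" approach (paths are examples only).
set -e
SRC_DIR=$(mktemp -d)          # stand-in for the directory the Folder stage was reading

# Fake landing files, just to make the sketch runnable.
printf 'id,name\n1,alpha\n' > "$SRC_DIR/file1.csv"
printf 'id,name\n2,beta\n'  > "$SRC_DIR/file2.csv"

# One concatenated file that a Sequential File stage can read directly.
OUT_FILE="$SRC_DIR/combined.csv"
cat "$SRC_DIR"/file*.csv > "$OUT_FILE"

wc -l < "$OUT_FILE"
```

Alternatively, the same `cat` command could go in the Sequential File stage's Filter option, so the concatenation happens as the stage reads rather than in a before-job subroutine.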
-craig

"You can never have too many knives" -- Logan Nine Fingers