DS Temp Dir Space issue

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

rprasanna28
Participant
Posts: 10
Joined: Fri May 12, 2006 12:31 am


Post by rprasanna28 »

I have designed a lookup job. When I try to execute it, I get the following error message:

"LKP_AL_YES_COEP,0: Could not map table file "/dsetlsoft/datastage/Ascential/DataStage/Datasets/lookuptable.20070613.ujo03nc (size 1733722784 bytes)": Not enough space

Error finalizing / saving table /tmp/dynLUT6390006fc4e533"


We tried running the job again after increasing the space in the temp directory, but we still get the same error.

Could anyone suggest how to solve this?

Regards
Prasanna Lakshmi R
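[Editorial note: the "Not enough space" error above means the engine could not memory-map the ~1.7 GB lookup table file it built for the reference data. Before rerunning, it is worth checking that the temp filesystem actually has room for a file that size. A minimal Python sketch, not DataStage code; the `/tmp` path and the 20% headroom factor are assumptions for illustration:]

```python
import shutil

TABLE_FILE_SIZE = 1_733_722_784  # bytes, taken from the job log above

def has_room(free_bytes: int, table_bytes: int, headroom: float = 1.2) -> bool:
    """True if the filesystem can hold the mapped lookup table with some headroom."""
    return free_bytes >= table_bytes * headroom

# Hypothetical temp dir: the dynLUT* files in the error are written under /tmp here.
free = shutil.disk_usage("/tmp").free
print(f"/tmp free: {free / 1024**3:.1f} GiB, "
      f"table needs ~{TABLE_FILE_SIZE * 1.2 / 1024**3:.1f} GiB with headroom")
```

If the check fails, the practical fix is to point the engine's temp/scratch space at a filesystem large enough for the table, rather than just growing `/tmp` slightly.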
balajisr
Charter Member
Posts: 785
Joined: Thu Jul 28, 2005 8:58 am

Post by balajisr »

What is the size of your reference data?

Use a Join stage instead of a Lookup if your reference data is huge: a normal Lookup materialises the entire reference set as a lookup table file and maps it into memory, which is exactly what is failing here, whereas a Join only streams sorted inputs.
rprasanna28
Participant
Posts: 10
Joined: Fri May 12, 2006 12:31 am

Post by rprasanna28 »

I have around 33 lakh records in my reference table.
Hemant_Kulkarni
Premium Member
Posts: 50
Joined: Tue Jan 02, 2007 1:40 am

Post by Hemant_Kulkarni »

If your input data is relatively small, use a sparse lookup instead.
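[Editorial note: a minimal Python sketch of the sparse-lookup idea. Instead of preloading the reference data into a lookup table file (the step failing in this job), each input row triggers one query against the reference source. The `reference_db` dict here is a hypothetical stand-in for a per-row database SELECT:]

```python
def sparse_lookup(stream, query):
    """Sparse lookup analogue: one reference query per input row.
    Nothing is preloaded, so temp/disk usage is tiny, but every row
    pays a round-trip cost -- only sensible when the input stream is small."""
    return [(k, v, query(k)) for k, v in stream]

# Hypothetical reference source (in DataStage this would be the database itself).
reference_db = {10: "A", 20: "B"}
rows = sparse_lookup([(10, "x"), (30, "y")], reference_db.get)
print(rows)  # unmatched keys come back as None, like a failed lookup
```

With 33 lakh (3.3 million) reference rows but a small input stream, this avoids building the 1.7 GB table entirely; with a large input stream the per-row queries would be the bottleneck instead.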
rameshrr3
Premium Member
Posts: 609
Joined: Mon May 10, 2004 3:32 am
Location: BRENTWOOD, TN

Post by rameshrr3 »

For those not in the 'know'

33 LAKH = 3.3 Million
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

If your source is equally huge, go for a join stage to join the records. If your source is small, go for a sparse lookup.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.