Fatal Error

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

mmanes
Participant
Posts: 91
Joined: Tue Mar 16, 2004 10:20 am
Location: Rome

Fatal Error

Post by mmanes »

Hi everyone,
I'm getting the following error:

Lookup_188,7: Could not map table file "/dstageeetl3/Dataset/lookuptable.20050510.knizooc (size 88760064 bytes)": Not enough space

Additional Info:
- the filesystem and RAM are not busy at run-time
- on a different machine the job runs successfully with 12-way parallelism
- I'm using memory windows in both cases
- on the machine where the error occurs, the problem is the same with both 12-way and 16-way parallelism.
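For context, the "Not enough space" text is most likely the strerror() string for ENOMEM returned by the mmap() call the lookup uses, so it points at process address-space limits rather than free disk or free RAM. A minimal bash sketch of the relevant check (fits_in_data_segment is a hypothetical helper, and on HP-UX the real ceiling also depends on kernel tunables such as maxdsiz):

```shell
# Sketch: would a mapping of SIZE_KB kilobytes fit under the current
# data-segment limit?  ulimit -d reports the "data(kbytes)" value.
fits_in_data_segment() {
    size_kb=$1
    limit_kb=$(ulimit -d)
    if [ "$limit_kb" = "unlimited" ]; then
        echo yes
    elif [ "$size_kb" -lt "$limit_kb" ]; then
        echo yes
    else
        echo no
    fi
}

# The failing table file is 88760064 bytes, i.e. roughly 86679 KB.
fits_in_data_segment $(( 88760064 / 1024 ))
```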

Can you help me?

Thank you in advance,
Matteo.
Eric
Participant
Posts: 254
Joined: Mon Sep 29, 2003 4:35 am

Post by Eric »

Are the HP-UX kernel parameters configured correctly?
mmanes
Participant
Posts: 91
Joined: Tue Mar 16, 2004 10:20 am
Location: Rome

Post by mmanes »

Yes Eric,
all the settings in uvconfig, DSParams and the kernel parameters are the same on both machines.
blewip
Participant
Posts: 81
Joined: Wed Nov 10, 2004 10:55 am
Location: London

Post by blewip »

Have you run out of space on a filesystem?
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Re: Fatal Error

Post by ArndW »

mmanes wrote:Hi everyone,
I'm getting the following error:

Lookup_188,7: Could not map table file "/dstageeetl3/Dataset/lookuptable.20050510.knizooc (size 88760064 bytes)": Not enough space

Additional Info:
- the filesystem and RAM are not busy at run-time
- on a different machine the job runs successfully with 12-way parallelism
- I'm using memory windows in both cases
- on the machine where the error occurs, the problem is the same with both 12-way and 16-way parallelism.

Can you help me?

Thank you in advance,
Matteo.
Matteo, run a "df -k /dstageeetl3/Dataset/"; it would seem that your filesystem might be full.
mmanes
Participant
Posts: 91
Joined: Tue Mar 16, 2004 10:20 am
Location: Rome

Post by mmanes »

blewip,
the filesystem is not full.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

Matteo,

does that filesystem have more than 88 MB of free space? If not, you have found your cause.
mmanes
Participant
Posts: 91
Joined: Tue Mar 16, 2004 10:20 am
Location: Rome

Post by mmanes »

I monitored the filesystem and RAM during execution of the job, and both had plenty of room free.
mmanes
Participant
Posts: 91
Joined: Tue Mar 16, 2004 10:20 am
Location: Rome

Post by mmanes »

The free space on that filesystem is at least 300 GB at run-time.
T42
Participant
Posts: 499
Joined: Thu Nov 11, 2004 6:45 pm

Post by T42 »

Run ulimit -a. Verify that you (and dsadm) have the ability to create files that large.
mmanes
Participant
Posts: 91
Joined: Tue Mar 16, 2004 10:20 am
Location: Rome

Post by mmanes »

> ulimit -a
time(seconds) unlimited
file(blocks) unlimited
data(kbytes) 1991680
stack(kbytes) 80896
memory(kbytes) unlimited
coredump(blocks) 4194303
T42
Participant
Posts: 499
Joined: Thu Nov 11, 2004 6:45 pm

Post by T42 »

Try it within DataStage. Do an ExecSH with that command.
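To make that concrete: the idea is to run the command as the command string of a before-job ExecSH call, so the limits are captured from inside the job's own environment. A minimal sketch (the output path is just an example; pick any writable location):

```shell
# Capture the limits as seen by the job's environment, e.g. as the
# command of a before-job ExecSH subroutine; inspect the file afterwards.
ulimit -a > /tmp/ds_limits.out 2>&1
```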
mmanes
Participant
Posts: 91
Joined: Tue Mar 16, 2004 10:20 am
Location: Rome

Post by mmanes »

That command was executed by the same user that runs the jobs.
T42
Participant
Posts: 499
Joined: Thu Nov 11, 2004 6:45 pm

Post by T42 »

Please provide the result from within the DataStage job, as I asked. The results CAN be different, despite using the same username to run jobs.
mmanes
Participant
Posts: 91
Joined: Tue Mar 16, 2004 10:20 am
Location: Rome

Post by mmanes »

The output of ulimit -a from within DataStage is:

time(seconds) unlimited
file(blocks) unlimited
data(kbytes) 1991680
stack(kbytes) 80896
memory(kbytes) unlimited
coredump(blocks) 4194303
nofiles (descriptors) 1024

Do you have any ideas?

Thank you in advance.
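For what it's worth, the numbers reported in this thread can be sanity-checked with a little bash arithmetic (the conclusion that the squeeze comes from the rest of the address space or from HP-UX kernel settings is an inference, not something confirmed in the thread):

```shell
# The failing map is 88760064 bytes; the data segment limit is 1991680 KB.
# The map alone fits numerically, so the shortfall must come from whatever
# else already occupies the process address space, or from kernel tunables.
map_kb=$(( 88760064 / 1024 ))
limit_kb=1991680
echo "map needs ${map_kb} KB of a ${limit_kb} KB data segment"
```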