Fatal Error

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

mmanes - if it's any consolation, I am getting the same problem now. My disk isn't full, and the error seems to happen once the file grows just over 1 GB. I put a display of "ulimit -a" into the job and it shows unlimited, and I wrote another test PX job that created a 2 GB sequential file with no problems.
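
The kind of sanity check described above can be sketched in the shell; a rough sketch, and the test file path below is a placeholder (point it at the same filesystem your job writes to). Seeking to a 2 GB offset creates a sparse file almost instantly, which confirms the filesystem and ulimit accept offsets past the 2^31 boundary without actually writing 2 GB of data:

```shell
#!/bin/sh
# Show current per-process limits; "file size" should read "unlimited"
ulimit -a

# Create a sparse file whose size is exactly 2 GB (2147483648 bytes)
# by writing one byte at offset 2^31 - 1. Path is a placeholder.
dd if=/dev/zero of=/tmp/px_2gb_test bs=1 count=1 seek=2147483647 2>/dev/null

# Confirm the reported size, then clean up
ls -l /tmp/px_2gb_test
rm -f /tmp/px_2gb_test
```

If this succeeds but the job still fails, the limit being hit is inside the engine (e.g. an in-memory lookup table), not the filesystem.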

I think it might be something to do with doing a DataSet -> Lookup stage directly, but I can't get rid of the error!
mmanes
Participant
Posts: 91
Joined: Tue Mar 16, 2004 10:20 am
Location: Rome

Post by mmanes »

I've solved the problem...

Are you using memory windows?
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

No, I am having this problem on an AIX box. What was the cause? I haven't solved my issue yet.
elavenil
Premium Member
Posts: 467
Joined: Thu Jan 31, 2002 10:20 pm
Location: Singapore

Post by elavenil »

Can you explain when you are getting this error and what the job design is?

If you use a Dataset as a lookup, the PX engine builds a lookup table while the job runs, so this error is encountered when the lookup table you are trying to create exceeds 2 GB in size (even though the file size limit is unlimited for the user).

Check the size of the lookup dataset, and use a Join instead of a Lookup if you use a dataset as the lookup source.
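
As a quick check before switching stages, you can total the on-disk size of the dataset's data segment files; a rough sketch, where both the directory and the segment-file naming pattern are placeholders — the real data files live under the resource disk directories named in your configuration file:

```shell
#!/bin/sh
# Sum the sizes (in bytes) of a dataset's data segment files.
# DATA_DIR and the "mylookup.ds.*" pattern are assumptions; check
# your PX configuration file for the real resource disk locations.
DATA_DIR=/tmp/datasets
TOTAL=0
for f in "$DATA_DIR"/mylookup.ds.*; do
  [ -f "$f" ] || continue
  SIZE=$(wc -c < "$f")
  TOTAL=$((TOTAL + SIZE))
done
echo "total bytes: $TOTAL"
# Anything approaching 2147483648 (2 GB) is a candidate for the
# Join-instead-of-Lookup workaround.
```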

Hope this would help.

Regards
Saravanan
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

It's my understanding that a persistent DataSet can exist as multiple files, each of which isn't bigger than 2GB. Are you saying that this is not the case for virtual Data Sets?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
elavenil
Premium Member
Posts: 467
Joined: Thu Jan 31, 2002 10:20 pm
Location: Singapore

Post by elavenil »

Hi Ray,

When a Dataset is created, it is written across multiple disks (or a single disk) based on the nodes that are mentioned in the configuration file.
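
For illustration, a minimal two-node PX configuration file looks like this (the host name and disk paths are placeholders; with two nodes, the dataset's data files are spread across both resource disks):

```
{
  node "node1" {
    fastname "myhost"
    pools ""
    resource disk "/data/ds1" {pools ""}
    resource scratchdisk "/scratch1" {pools ""}
  }
  node "node2" {
    fastname "myhost"
    pools ""
    resource disk "/data/ds2" {pools ""}
    resource scratchdisk "/scratch2" {pools ""}
  }
}
```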

When the dataset is used as a lookup (instead of a lookup file set), the PX engine builds a lookup table; if that table exceeds the size limit set for the user, this problem occurs. I had the same problem in my previous project and resolved it by using a Join (instead of a Lookup) to join the input and lookup datasets.

Hope this clarifies the question.

Thanks & Regards
saravanan
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

elavenil,

I don't know if this is my problem, although the description does make sense. In my case the lookup file set is just over 1 GB, but my ulimit is large enough... I've written some test jobs to ensure that my files and datasets can be larger, and they work. I will start analyzing the error later today and hope to find the cause (and the solution) by then.
mmanes
Participant
Posts: 91
Joined: Tue Mar 16, 2004 10:20 am
Location: Rome

Post by mmanes »

This solution is for HP.

For more info see my post at viewtopic.php?t=92775&highlight=

In that configuration you may add the virtual nodes <hostname>, <host1>,... to /etc/hosts
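
As an illustration, the /etc/hosts mapping might look like this (the address and host names below are placeholders; the virtual node names must match the ones used in your PX configuration file):

```
# /etc/hosts -- placeholder address and names
127.0.0.1    localhost
10.0.0.5     myhost  host1  host2  host3
```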

Bye,
Matteo.