Hi All,
I am getting this warning when processing an XML file. My XML file is around 300 MB.
XML input document value is empty or NULL. Column Name = "Record"
DataStage Job 30 Phantom 12792
Program "DSP.ActiveRun": Line 51,
Available memory exceeded. Unable to continue processing record.
DataStage Phantom Finished.
Is this warning due to processing large XML files? Please advise.
Thanks,
With Regards,
Kumar66
Hi Kumar66,
Could you provide some additional details about the server configuration?
What is the CPU, RAM, Page File limits et al?
Based on feedback from IBM Support on an issue we had in an EE job, the XML Input stage in DataStage uses the DOM object model to load the entire XML document into memory before passing it to the next stage. Because of this design, the XML Input stage can fail, depending on the system configuration, on files of around 250 MB or larger. One of their suggestions was to split the input into smaller files and process them from a folder.
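The splitting suggestion above can be sketched outside DataStage: a streaming parser avoids the DOM's load-everything-into-memory behaviour. A minimal Python sketch, assuming the repeating element is named "Record" (as in the warning above); the chunk size and file-name prefix are made-up examples:

```python
# Sketch: split a large XML file of repeated <Record> elements into
# smaller files, streaming with iterparse instead of building a full DOM.
# The tag name "Record" and the chunk size are assumptions, not DataStage settings.
import xml.etree.ElementTree as ET

def write_chunk(records, part, prefix):
    # Wrap a batch of serialized <Record> elements in a new root element.
    with open(f"{prefix}_{part:04d}.xml", "w") as f:
        f.write("<Records>\n")
        f.writelines(records)
        f.write("</Records>\n")

def split_xml(path, record_tag="Record", records_per_file=10000, prefix="chunk"):
    chunk, count, part = [], 0, 0
    # iterparse yields elements as they are completed, so memory use stays
    # bounded by one record (plus whatever we have not yet flushed).
    for event, elem in ET.iterparse(path, events=("end",)):
        if elem.tag == record_tag:
            chunk.append(ET.tostring(elem, encoding="unicode"))
            elem.clear()  # release the parsed element's children
            count += 1
            if count == records_per_file:
                part += 1
                write_chunk(chunk, part, prefix)
                chunk, count = [], 0
    if chunk:  # flush the final, partially filled chunk
        part += 1
        write_chunk(chunk, part, prefix)
    return part
```

The resulting smaller files could then be fed to the job one at a time from a folder, as the IBM suggestion describes.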
In a parallel job, we saw a "Heap Allocation Failure" Error message.
In fact, a search on "Heap Allocation Failure" will give you more links on this topic.
The following link has some interesting tips (I just searched and found this):
viewtopic.php?t=100959
Thanks
-V
Hi Craig, Group,
I should have mentioned this earlier - apologies.
We recently got a new Linux Grid environment and I tried this job over there about 2 weeks ago. The only difference I saw was that it ran for about 4 hours before giving up on memory. On the Windows server, it would fail within an hour. However, I did not monitor the other resources on the Linux box during/after the failure.
Not being a Linux/UNIX expert, I had found from various posts that "ulimit -a" should give some idea on the resource allocation for a user on the box. Here is what I found on the Linux server.
Not sure this adds value, but thought I would put it here.
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
file size (blocks, -f) unlimited
pending signals (-i) 1024
max locked memory (kbytes, -l) 32
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
stack size (kbytes, -s) 10240
cpu time (seconds, -t) unlimited
max user processes (-u) 114688
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
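As an aside, the same limits can also be read from inside a process; a minimal sketch using Python's standard-library resource module (UNIX only), which mirrors part of the "ulimit -a" output above:

```python
# Sketch: read the current process's resource limits, similar to "ulimit -a".
# Uses only the standard-library resource module (available on UNIX/Linux).
import resource

def show_limits():
    limits = {
        "stack size": resource.RLIMIT_STACK,
        "virtual memory": resource.RLIMIT_AS,
        "data seg size": resource.RLIMIT_DATA,
        "open files": resource.RLIMIT_NOFILE,
    }
    for name, which in limits.items():
        soft, hard = resource.getrlimit(which)
        fmt = lambda v: "unlimited" if v == resource.RLIM_INFINITY else str(v)
        print(f"{name}: soft={fmt(soft)} hard={fmt(hard)}")

show_limits()
```

A small soft stack limit like the 10240 KB shown above is the sort of value worth comparing against the hard limit for the DataStage user, since the soft limit is what the process actually runs under.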
Thanks
-V