DMEMOFF, PMEMOFF, LDR_CNTRL
Posted: Mon Jan 14, 2008 7:59 pm
I have a job that keeps aborting with the following error messages:
href_StgExtIdSiMap,0: Could not map table file "/home/dsadm/Ascential/DataStage/Datasets/lookuptable.20080109.ymetadc (size 1133689184 bytes)": Not enough space
Error finalizing / saving table /entis/IMS/hash/lkpfsPStgExtIdSiMap_bkp
This job basically consists of DB2 -> Transformer -> Lookup File Set.
It loads 10+ million rows into the lookup file set.
I contacted IBM and was told that, because the lookup file set's partitioning was set to 'Entire', the server ran out of 'contiguous memory blocks'.
Their suggested fix was to:
1) Change DMEMOFF to 0x90000000 and PMEMOFF to 0xa0000000 (in the uvconfig file)
2) Set LDR_CNTRL=MAXDATA=0x80000000 (in the dsenv file) -- a rough sketch of what both edits might look like is below.
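For reference, this is roughly how I understand the two changes would be applied. The paths assume a default engine install under $DSHOME on an AIX server, so adjust to your environment; as far as I know, a uvconfig change also requires regenerating the engine configuration (uvregen) and restarting DataStage before it takes effect:

    # 1) In $DSHOME/uvconfig (DataStage engine configuration file)
    DMEMOFF 0x90000000
    PMEMOFF 0xa0000000

    # With the engine stopped, regenerate the config and restart DataStage:
    #   cd $DSHOME
    #   bin/uvregen

    # 2) In $DSHOME/dsenv (shell environment sourced by DataStage processes, AIX only)
    LDR_CNTRL=MAXDATA=0x80000000
    export LDR_CNTRL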
So far my DataStage admin has made change 1 only, and the job no longer seems to abort once I switched the lookup file set to hash partitioning.
I was wondering what role each of these settings plays in resolving the abort?