Job Import Issue

Post questions here related to DataStage Server Edition, covering areas such as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

If you have hit the (d) limit, it would also prevent you from doing a "mkdir {dir}" in the project directory; can you try that? The limits I have seen are around 32766 subdirectories in a directory, and I think that was on AIX.

Also, attach to the directory and do a "df -F ufs -o i" to see if there are any inodes left on the device.
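
If the mkdir fails, you can also gauge how close the directory is to its limit: every subdirectory adds one hard link to its parent, so the link count on the project directory itself is a good indicator. A minimal sketch, assuming an AIX JFS filesystem and with /path/to/project as a placeholder for your actual project path:

ls -ld /path/to/project                  # second field is the link count; on AIX JFS it tops out near 32767
ls -l /path/to/project | grep -c '^d'    # counts the subdirectories directly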
ashik_punar
Premium Member
Posts: 71
Joined: Mon Nov 13, 2006 12:40 am

Post by ashik_punar »

Hi ArndW,
Thank you for the response. I tried to create a dummy directory in the project directory and got the following message:

mkdir: 0653-358 Cannot create dummy.
dummy: There are too many links to a file.

We are using AIX version 5. Unfortunately I was not able to run your command, as I think it is a Solaris command:

df -F ufs -o i

When I tried to run it I got this error:

df: Not a recognized flag: F
Usage: df [-P] | [-IMitv] [-gkm] [-s] [filesystem ...] [file ...]

My DataStage project is on the filesystem '/dev/datastage', so I checked the inodes for that filesystem; the command gave the following output:

Filesystem 512-blocks Free %Used Iused %Iused Mounted on
/dev/datastage 23068672 8913352 62% 328747 12% /usr/datastage

Please guide me in resolving this issue, as I am still not able to import the jobs. Also, is there some way we can increase the inode upper limit?

Thank you for all the help.

Punar Deep Singh
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

On AIX the command "df -k ." will show inodes used and available.

But Ray was correct: you have probably run into an OS limitation on the number of subdirectories in a project, not an inode shortage. Your df output shows only 12% of inodes in use, while the mkdir error "too many links to a file" points at the per-directory link limit. The quick way to get around this is to delete jobs that you don't need.

If you need all the jobs in the project, then a temporary workaround is to change a number of hashed files that are currently DYNAMIC into static hashed files. The difference between the two is that dynamic files are stored as subdirectories while static files are saved as plain UNIX files, so each conversion frees one subdirectory slot.
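
A sketch of one such conversion, assuming the hashed file was created in the project (account) and that you can reach the engine's TCL prompt via dssh; the file name, type, modulo and separation below are illustrative values, not a recommendation:

cd /path/to/project           # placeholder for your project directory
$DSHOME/bin/dssh              # start the DataStage engine shell (TCL prompt)
>RESIZE MyHashedFile 18 11 4  # convert the dynamic file to static type 18, modulo 11, separation 4
>QUIT

Make sure no job has the file open while you resize it, and pick the type, modulo and separation to suit the data in each file.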

The best solution is to reduce the number of jobs, either through deletion or by creating a new project and copying some of your jobs there via a .dsx export and import, as sketched below.
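
For the export/import route, a sketch using the command-line tools that ship with the Windows DataStage client; the host, credentials, project names and file path are placeholders, and the exact switches should be verified against your client version:

dscmdexport /H=dsserver /U=dsadm /P=secret OLDPROJ C:\temp\jobs.dsx
dsimport /H=dsserver /U=dsadm /P=secret NEWPROJ C:\temp\jobs.dsx

The same can be done interactively from the DataStage Manager's Export and Import menus.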
ashik_punar
Premium Member
Posts: 71
Joined: Mon Nov 13, 2006 12:40 am

Post by ashik_punar »

Hi ArndW,
Thank you for the help. I agree that we have hit the OS limitation.
Our UNIX admin is trying to resolve the issue through some other steps. I will update this thread once he is done and we know whether the issue is resolved.
Thank you once again.
Punar Deep Singh
ashik_punar
Premium Member
Posts: 71
Joined: Mon Nov 13, 2006 12:40 am

Post by ashik_punar »

Hi ArndW,
Our UNIX admin was not able to provide any other workaround for this issue, so we will be creating new projects, as we cannot delete any of the existing jobs. Thank you for the help you have extended.

With Regards,
Punar Deep Singh