
All objects in the Job become PLUG objects.

Posted: Sat Oct 28, 2006 8:39 pm
by pankajg
Hi,

I have come across a particular issue and would like to know if anyone has faced something similar:

The issue is that all the objects in the job are converted to Plug objects. I would like to know the reason behind this conversion.

Also, I cannot save a job using the "Save as" option, and parallel jobs get converted to server jobs.

I have contacted the DataStage support people, and their initial investigation points towards maximum disk space, inodes, and the directories inside the project.

Current maximum disk space: ~26 GB

The point of concern is that support suggested the maximum number of jobs in the project should be around 1000-1200.

Is there a limitation on the maximum number of jobs in a project?
What is the reason behind the objects getting converted to Plug objects?

Thanks a lot for your time. :)

regards
Pankaj

Posted: Sat Oct 28, 2006 9:59 pm
by ray.wurlod
That normally happens when the client machine is running out of memory. Close some other windows and/or re-boot. In particular, it is NOT a problem on the server. You have sent your DataStage support people on a wild goose chase.

Posted: Sun Oct 29, 2006 11:03 pm
by pankajg
Hi Ray,
Thanks for the reply, though I am not sure it is a client problem, because when I tried opening the same job on another system, the same Plug objects appeared again. If it were a client issue, I should still have been able to open the job on a different system. Can you please help?
Well, IBM Support came up with limiting the number of jobs in the project to 1000-1200, which I believe is not a good thing. I wonder if this could be argued.

Please help.

Thanks in advance
Regards
Pankaj

Posted: Mon Oct 30, 2006 7:37 am
by ray.wurlod
Might be worth asking them why.

Certainly the client memory gets hit hard when uploading all of the jobs (which happens, for example, for the drop down list in a Job activity in a job sequence or, more generally, when refreshing the Repository view).

Posted: Wed Nov 01, 2006 7:36 am
by ambasta
Hi Pankaj,
I also faced the same problem. At that time someone suggested that I uninstall the client and reinstall it. I think that might work for you as well; give it a try. And if you get any input from the DataStage support people, please let us know.
Thanks.

Posted: Wed Nov 01, 2006 9:28 pm
by pankajg
Dear all,

When I took the problem of the Plug objects to IBM Support, this is what they came up with:

"My investigation reveals that there was some sort of corruption of the job during a 'rename' or 'save as'. The most likely reason is that /tmp space ran low or one of the important processes died unexpectedly."

And they also advised that we should look at limiting the number of jobs to 1000-1200 per project, which is the main cause for concern.

Now, from one of the posts I had authored earlier to find the maximum number of jobs in a project, I understand that DS_JOBOBJECTS is the file that stores all the design information about the jobs, and as long as that file stays within 2.2 GB on a 32-bit system, I can have as many jobs as I like. (My current DS_JOBOBJECTS size is just 0.2 GB.) I am now counting the number of jobs that I have.

viewtopic.php?p=200980#200980

I am trying to correlate the number of jobs with the file size. Would that be the right thing to do? Any other perspectives?
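As a rough way to check that headroom from the operating system, the sizing below can be read directly off the disk. This is a sketch only: the project path is an assumed default 7.x UNIX install location, and `jobobjects_size` is a helper name introduced here, so substitute your own path.

```shell
#!/bin/sh
# DS_JOBOBJECTS is a type 30 (dynamic) hashed file: on disk it is a
# directory containing a DATA.30 and an OVER.30 file. On a 32-bit
# installation each of these is capped at roughly 2.2 GB, so checking
# their sizes shows how much headroom the project has.
jobobjects_size() {
  f="$1/DS_JOBOBJECTS"
  if [ ! -d "$f" ]; then
    echo "not found: $f"
    return 1
  fi
  ls -l "$f/DATA.30" "$f/OVER.30" 2>/dev/null
  du -sk "$f"                       # total size in KB
}

# Assumed default install path -- substitute your own project directory.
jobobjects_size "${PROJECT:-/opt/Ascential/DataStage/Projects/MyProject}" || true
```

If OVER.30 is large relative to DATA.30, the dynamic file is heavily overflowed and would be worth resizing well before the 2.2 GB ceiling.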

Let me know

Thanks all for your replies.

Posted: Thu Nov 16, 2006 11:29 pm
by pankajg
The stages were converting to Plug objects because there is a limit on the maximum number of folders that the server can support, and we were exceeding that limit.

Posted: Fri Nov 17, 2006 12:20 am
by chulett
AIX?

Posted: Fri Nov 17, 2006 7:46 am
by ray.wurlod
On Solaris that limit is 32K subdirectories in a directory. That would allow in excess of 5000 jobs. Do you really have that many?

Code:

SELECT COUNT(*) FROM DS_JOBS WHERE NAME NOT LIKE '\\%';
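For what it's worth, the number of entries counted against that subdirectory limit can also be estimated from the shell. A sketch, assuming a default install path (substitute your own); `ls -l | grep '^d'` is used rather than `find -maxdepth` for portability to older Solaris systems:

```shell
#!/bin/sh
# Each type 30 (dynamic) hashed file in a DataStage project
# (RT_LOGnnn, RT_STATUSnnn, RT_CONFIGnnn, ...) is a subdirectory of
# the project directory, and each subdirectory consumes one directory
# link. On file systems with a 32K link limit per directory, counting
# them shows how close the project is to the ceiling.
count_subdirs() {
  ls -l "$1" | grep -c '^d'
}

# Assumed default install path -- substitute your own project directory.
PROJECT="${PROJECT:-/opt/Ascential/DataStage/Projects/MyProject}"
if [ -d "$PROJECT" ]; then
  echo "subdirectories: $(count_subdirs "$PROJECT")"
fi
```

Comparing this count with the job count from the query above shows how many extra hashed files each job contributes on average.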

Posted: Fri Nov 17, 2006 8:29 am
by ArndW
I've seen that limit hit here with just over 3,000 jobs, because of all the other type 30 (dynamic) files that get created. The temporary solution was to change all LOG and STATUS type hashed files to type 2, then go about making the project smaller. Too many CopyOfCopyOfCopy... jobs out there.
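That workaround can be scripted as a starting point. This is a sketch only: the project path and the modulo/separation values (11 2) are placeholder assumptions that you must size for your own log volumes, and the emitted RESIZE commands are meant to be reviewed and then run from the UniVerse shell inside the project, not executed blindly.

```shell
#!/bin/sh
# Emit a RESIZE command for each RT_LOG/RT_STATUS hashed file in the
# project, converting it from type 30 (dynamic, a subdirectory on
# disk) to type 2 (static, a single file), which frees one directory
# link per file. Review the output before running it in uvsh.
emit_resize() {
  for f in "$1"/RT_LOG* "$1"/RT_STATUS*; do
    [ -d "$f" ] || continue        # only type 30 files are directories
    # Placeholder modulo 11 / separation 2 -- tune for your data.
    echo "RESIZE $(basename "$f") 2 11 2"
  done
}

# Assumed default install path -- substitute your own project directory.
PROJECT="${PROJECT:-/opt/Ascential/DataStage/Projects/MyProject}"
if [ -d "$PROJECT" ]; then
  emit_resize "$PROJECT"
fi
```

Converting a log file to a static type does mean it can no longer grow automatically, so this trades the directory-link pressure for a need to purge and re-size logs periodically.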