All objects in the Job become PLUG objects.

pankajg
Participant
Posts: 39
Joined: Mon Jun 05, 2006 5:24 am
Location: India

All objects in the Job become PLUG objects.

Post by pankajg »

Hi,

I have come across a particular issue and would like to know if anyone has faced something similar:

The issue is that all the objects in the job are converted to Plug objects. I would like to know the reason behind this conversion.

Also, I cannot save a job using the "Save as" option, and the parallel job gets converted to a server job.

I have contacted the DataStage support people, and their initial investigation points towards maximum disk space, inodes, and the directories inside the project.

Currently the maximum disk space is ~26 GB.

The point of concern is that support suggested the maximum number of jobs in the project should be around 1000-1200.

Is there some limitation on the maximum number of jobs in a project?
What is the reason behind the objects getting converted to Plug objects?
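
For reference, the figures support asked about can be checked roughly like this from the server shell. This is only a sketch: /ds/projects is just an assumed mount point, and the inode option for df differs between UNIX flavours.

Code: Select all

df -k /ds/projects     # free space on the filesystem holding the projects (path is an assumption)
df -i /ds/projects     # inode usage; -i is the Linux-style option, e.g. Solaris uses df -o i instead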

Thanks a lot for your time. :)

regards
Pankaj
Failures push you towards Success.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

That normally happens when the client machine is running out of memory. Close some other windows and/or reboot. In particular, it is NOT a problem on the server. You have sent your DataStage support people on a wild goose chase.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
pankajg
Participant
Posts: 39
Joined: Mon Jun 05, 2006 5:24 am
Location: India

Post by pankajg »

Hi Ray,
Thanks for the reply, though I am not sure it is a client problem, because when I tried opening the same job on another system, the same Plug objects appeared again. If it were a client issue, I should still have been able to open the job on a different system. Can you please help?
IBM Support came back with limiting the number of jobs in the project to 1000-1200, which I believe is not a good thing... I wonder whether this could be argued.

Please help.

Thanks in advance
Regards
Pankaj
Failures push you towards Success.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Might be worth asking them why.

Certainly the client memory gets hit hard when uploading all of the jobs (which happens, for example, for the drop down list in a Job activity in a job sequence or, more generally, when refreshing the Repository view).
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
ambasta
Participant
Posts: 93
Joined: Thu Jan 19, 2006 10:29 pm
Location: Bangalore

Post by ambasta »

Hi Pankaj,
I also faced the same problem. At the time, someone suggested that I uninstall the client and reinstall it. I think this might work for you as well; give it a try. And if you get any input from the DataStage support people, please let us know.
Thanks.
ambasta
pankajg
Participant
Posts: 39
Joined: Mon Jun 05, 2006 5:24 am
Location: India

Post by pankajg »

Dear all,

When I took the Plug object problem to IBM Support, this is what they came up with:

"My investigation reveals that there was some sort of corruption of the job during a 'rename' or 'save as'. The most likely reason is that /tmp space fill low or one of the important processes died unexpectedly."

They also advised that we should look at limiting the number of jobs to 1000-1200 per project, which is the main cause of concern.

Now, from one of the posts I authored earlier about the maximum number of jobs in a project, I understand that DS_JOBOBJECTS is the file that stores all the design information about the jobs, and that as long as that file stays within 2.2 GB on a 32-bit system, I can have as many jobs as I like (my current DS_JOBOBJECTS size is just 0.2 GB). I am now counting the number of jobs that I have.
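
As a rough check of that sizing, the size of DS_JOBOBJECTS can be read directly from the project directory on the server. A sketch only: the project path is an assumption, and it presumes DS_JOBOBJECTS is a dynamic (type 30) hashed file, i.e. a directory holding DATA.30 and OVER.30.

Code: Select all

cd /ds/projects/MYPROJECT          # project path is an assumption
du -sk DS_JOBOBJECTS               # total size of the hashed file in KB
ls -l DS_JOBOBJECTS/DATA.30 DS_JOBOBJECTS/OVER.30   # the individual files that must stay under the ~2.2 GB 32-bit limit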

viewtopic.php?p=200980#200980

I am trying to link the number of jobs to the file size; would that be the right thing to do? Any other perspectives?

Let me know

Thanks, all, for your replies.
Failures push you towards Success.
pankajg
Participant
Posts: 39
Joined: Mon Jun 05, 2006 5:24 am
Location: India

Post by pankajg »

The stages were being converted to Plug objects because there is a limit on the maximum number of folders that the server can support, and we were exceeding that limit. That is why we faced this Plug object issue.
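
To put a number against that limit, the subdirectories the project currently uses can be counted from the server shell and compared with the operating system's per-directory limit. A rough sketch, with an assumed project path:

Code: Select all

cd /ds/projects/MYPROJECT     # project path is an assumption
ls -d */ | wc -l              # subdirectories in the project directory (each dynamic hashed file is one)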
Failures push you towards Success.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

AIX?
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

On Solaris that limit is 32K subdirectories in a directory. That would allow in excess of 5000 jobs. Do you really have that many?

Code: Select all

SELECT COUNT(*) FROM DS_JOBS WHERE NAME NOT LIKE '\\%';
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

I've seen that limit hit here with just over 3,000 jobs. This is because of all the other type 30 (Dynamic) files created. The temporary solution was to change all LOG and STATUS type hashed files to type 2 - then go about making the project smaller. Too many CopyOfCopyOfCopy... jobs out there.
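
For anyone checking the same thing, a rough way to see how much of the per-directory limit the per-job log and status files consume is to count the ones that are still dynamic (type 30), since each of those is a subdirectory in the project. A sketch only, with an assumed project path:

Code: Select all

cd /ds/projects/MYPROJECT                         # project path is an assumption
ls -d RT_LOG*/ RT_STATUS*/ 2>/dev/null | wc -l    # per-job log/status hashed files that are still dynamic (one subdirectory each)

As I understand it, once those are converted to type 2 as ArndW suggests, each becomes a single operating-system file rather than a directory, so the subdirectory count should drop accordingly.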