
Number of DS projects

Posted: Mon Mar 06, 2006 3:52 pm
by rickrambo
Hi,

I would like to know the advantages and disadvantages of limiting the number of projects one can have in DS. How many projects do you have on your systems?

Is it inefficient to have too many (say, more than 15-20) projects? Will this cause unnecessary memory usage, since the system has to manage more projects?

To give you an idea of the hardware configuration... we have 2 AIX servers with 10 CPUs in each box, and each CPU has 2 GB of memory.

If I have a project for each application area (or one for each data mart), it will be easy to control access and allow each group into only its respective project; but if this causes overhead in memory and other resources, I don't want to do that.

Thanks

Posted: Mon Mar 06, 2006 4:14 pm
by kumar_s
There is no documented maximum limit on the number of projects. Still, it may not be very efficient to have a large number of projects on one machine.
If you have a set of code that is likely to be repeated across all the projects, it is better not to create a new project; try to manage it within one. A project is simply a logical partition of jobs. It can be as simple as:
1. Development
2. Testing
2.a. Unit Test (mostly done in Development)
2.b. SIT
2.c. UAT
3. Production
4. (A protected project to maintain version control)
At the same time, it is also advisable to split into separate projects if the number of jobs grows beyond a manageable range. This is for good maintainability.

Posted: Mon Mar 06, 2006 4:25 pm
by kcbland
The number of projects has NO bearing on CPU or memory usage. It is simply a "foldering" convention, like a database schema. In fact, a project is actually called an "account" in the DS Engine language, and an account is perfectly synonymous with a schema.

Posted: Mon Mar 06, 2006 4:42 pm
by kumar_s
Though a project doesn't take up memory directly, the project folder where all the objects reside (RT_STATUSnn, RT_SCnn, RT_LOGnn, RT_CONFIGnn, RT_BPnn, RT_BPnn.O, DS_TEMPnn) may still require memory. As I mentioned, if most of the jobs, logic and categories are going to be replicated, all of these objects get replicated in each project folder as well. Still, altogether it may not be a considerable amount compared to the volume consumed by the data itself. :wink:

Posted: Mon Mar 06, 2006 4:49 pm
by kcbland
Projects, logs, status files - none of that matters. There's NO RAM or CPU difference between 1 project of 500 jobs and 10 projects of 50 jobs. Disk consumption, yes, because of the overhead of the base files that make up a project.
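
If you want to see how much that per-project overhead actually amounts to, a quick check on the engine host is to size each project directory. A minimal sketch, assuming your projects sit under the usual Projects directory beneath the DataStage install (the path here is an assumption - substitute wherever your projects were created):

    # disk usage per project, in KB, largest first
    cd /u1/dsadm/Ascential/DataStage/Projects   # assumed location - adjust for your site
    du -sk * | sort -rn

Even a freshly created, empty project will show some baseline size from the base hashed files, which is the per-project overhead being described here.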

Posted: Mon Mar 06, 2006 5:07 pm
by kwwilliams
We went the route of one big project which, as stated above, has zero bearing on job performance. Your server(s) are capable of a certain workload, and the number of projects you have does not have an impact on that. What I don't like about it is the amount of time that Director takes to refresh. I may be wrong, but I believe Director refreshes all of the objects in the entire project when it refreshes the data. In a large project with 500-600 jobs this takes a while. I have a smaller project that did not fit into the scope of the already established and very large project, and its refresh feels almost instantaneous in comparison.

Posted: Mon Mar 06, 2006 5:18 pm
by rasi
Williams

The refresh interval can be changed in Director. The maximum value it can hold is 600 seconds, or you can turn it off and refresh manually whenever you want.

Earlier versions of Director did have this problem when a project had too many folders and jobs. But on the latest version I am using a project with more than 1000 jobs, and it doesn't take long to refresh.

Posted: Tue Mar 07, 2006 2:36 am
by jasper
The most important reason for us to limit the number of projects is commonly used routines. If you write a routine that is used in multiple projects, you have to copy it into every project, which can become problematic if you have 50 projects.
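
One common workaround is to export the shared routine to a .dsx file once (from the Manager client, selecting just that routine) and then script the import into every project that needs it. A rough sketch from the Windows client command line is below; the host, user, password and project names are placeholders, and the dscmdimport switches shown are from memory, so confirm them against the command-line documentation for your client release:

    rem SharedRoutines.dsx holds just the common routine, exported from Manager
    rem (placeholder names; verify the switch syntax for your DataStage client version)
    dscmdimport /H=etlserver /U=dsadm /P=secret MartA C:\dsx\SharedRoutines.dsx
    dscmdimport /H=etlserver /U=dsadm /P=secret MartB C:\dsx\SharedRoutines.dsx
    dscmdimport /H=etlserver /U=dsadm /P=secret MartC C:\dsx\SharedRoutines.dsx

It still has to be re-run whenever the routine changes, so the maintenance burden only shrinks rather than disappears.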

Posted: Tue Mar 07, 2006 8:12 am
by kumar_s
kwwilliams wrote: What I don't like about it is the amount of time that Director takes to refresh... In a large project with 500-600 jobs this takes a while.
Yes, it is true that the refresh time differs with the number of jobs in the project (even if the refresh interval has been increased).

Posted: Tue Mar 07, 2006 8:19 am
by chulett
I beg to differ - in my experience it only 'refreshes' the jobs in your current category.

Posted: Tue Mar 07, 2006 8:26 am
by ray.wurlod
Never switched off "View Category" in Director?

Posted: Tue Mar 07, 2006 8:40 am
by chulett
Sure... so? It's not like you'd run like that over the course of a normal day. And I only do that when I've got a specific filter enabled, otherwise you're asking for a world of hurt... especially pre-7.5 - at least now you can bump the refresh interval up to a reasonable level or disable it completely.