
Use of Server routines in Parallel Jobs

Posted: Mon Jun 18, 2007 8:11 am
by vnspn
Hi,

We are currently using some Server Routines in our Server Jobs. We will shortly be upgrading our DataStage server to Enterprise Edition. We would like to know whether we can use our current Server Routines in the new Parallel Jobs that we will be developing.

We are currently using some BASIC functions like "DSGetLinkMetaData" in our Server Routines. Is the use of these kinds of functions valid in a Parallel Job / Enterprise Edition?

Thanks.

Posted: Mon Jun 18, 2007 8:22 am
by DSguru2B
You can use these BASIC routines in a BASIC Transformer in a parallel job.
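For illustration, here is a minimal sketch of a server routine of type Transform Function that a BASIC Transformer in a parallel job could call from a derivation, just as a server Transformer would. The argument name Arg1 and the routine's purpose are assumptions made up for this example, not taken from the thread.

* Minimal sketch: Transform Function routine body with one argument, Arg1,
* as declared in the routine definition. Returns a trimmed, uppercased value.
If IsNull(Arg1) Or Trim(Arg1) = "" Then
   Ans = "UNKNOWN"
End Else
   Ans = UpCase(Trim(Arg1))
End

In the Transformer derivation you would call it by whatever name you gave the routine in the repository, for example MyCleanup(InputLink.SomeColumn) (MyCleanup being a placeholder name).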

Posted: Mon Jun 18, 2007 9:05 am
by vnspn
DSguru2B, thanks for your reply.

But the engine in Enterprise Edition is Orchestrate, unlike the UniVerse-based engine in Server Edition. So does that mean the Orchestrate engine also supports BASIC functions and routines?

Posted: Mon Jun 18, 2007 9:16 am
by DSguru2B
Enterprise Edition also lets you develop server jobs, so the server engine is still there. That is why it allows you to use a BASIC Transformer in PX jobs, provided you are in an SMP environment.

Posted: Mon Jun 18, 2007 9:45 am
by vnspn
So, if we are in an MPP environment, would we not be able to use Server routines at all in Parallel jobs?

Posted: Mon Jun 18, 2007 9:54 am
by DSguru2B
No, because you cannot use a BASIC Transformer there.

Posted: Mon Jun 18, 2007 10:19 am
by vnspn
Ok, thanks a lot DSguru2B.

So, if we are going to be in an MPP environment, we would need to rewrite the Server routine code as an equivalent Parallel routine (in C / C++), right?

Also, we are currently using BASIC functions like "DSGetLinkMetaData" in our Server routines. Does that mean this kind of functionality can never be achieved in a Parallel routine (or in an MPP environment)?

Posted: Mon Jun 18, 2007 10:28 am
by DSguru2B
Yes. The best way is to convert your code to C++.
These functions exist in C++ versions as well; look at the Parallel Job Advanced Developer's Guide.

Re: Use of Server routines in Parallel Jobs

Posted: Mon Jun 18, 2007 10:37 am
by Ed Purcell
If you are gathering information about a parallel job after it has run, you can invoke a BASIC routine in 7.x as an After Job Routine in a parallel job, because once the job is finished it is no longer running in parallel. That After Job Routine still has access to the job's handle, so it can proceed to gather the information you want.
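As a rough sketch of that approach (the stage and link names xfmLoad and lnkTarget below are made up for the example), an after-job subroutine is written with the standard InputArg and ErrorCode arguments and can query the finished job through DSJ.ME:

* Minimal sketch of an after-job subroutine body; InputArg and ErrorCode
* are the standard before/after routine arguments.
ErrorCode = 0                          ;* zero tells DataStage the routine succeeded

* DSJ.ME is the handle of the job this routine is attached to.
RowCount = DSGetLinkInfo(DSJ.ME, "xfmLoad", "lnkTarget", DSJ.LINKROWCOUNT)

If RowCount < 0 Then
   Call DSLogWarn("Could not read the row count for lnkTarget", "AfterJobCounts")
End Else
   Call DSLogInfo("lnkTarget processed " : RowCount : " rows", "AfterJobCounts")
End

The DSGetLinkInfo / DSGetLinkMetaData family of calls is available here because, as noted above, the routine runs in the server engine after the parallel run has completed.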

Posted: Mon Jun 18, 2007 2:18 pm
by vnspn
Thank you DSguru2B and Ed!

Posted: Mon Jun 18, 2007 5:19 pm
by sanjay
Hi All

Query: why can't we use a BASIC Transformer in an MPP setup?

Thanks
Sanjay

Posted: Mon Jun 18, 2007 7:20 pm
by ray.wurlod
You can use server routines in an MPP setup, provided you license DataStage server on every machine in the cluster or grid. Your account rep will be able to retire on the commission!

Server routines require the DataStage Engine to execute. This only exists on the machine where the DataStage server is installed.

Posted: Tue Jun 19, 2007 6:57 am
by vnspn
Ray,

So, as per what you say, in a cluster or grid environment the DataStage server is installed on only one machine. Then what about the rest of the machines in the cluster? Are they configured just to make use of their hardware?

Posted: Tue Jun 19, 2007 3:08 pm
by ray.wurlod
Pretty much the case - you do have to make sure that a small number of components are deployed to the other machines, perhaps by mounting disk, perhaps by other means. Details are in the manuals. But only the parallel framework supports this capability; server jobs expect to run on a server, hence the name.