Unable to start Teradata MultiLoad

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

v2kmadhav
Premium Member
Posts: 78
Joined: Fri May 26, 2006 7:31 am
Location: London

Unable to start Teradata MultiLoad

Post by v2kmadhav »

Hello guys,

I have a set of environment variables that I have set up for Teradata.

When I pass them on as parameters into an EE Teradata stage they are picked up perfectly, but when I create another job and try to load data from a dataset into a MultiLoad stage, it doesn't seem to take the parameter value; instead it takes the literal name DSCAPIOP_TdServer1.

The same job works when I call the same environment parameters in a server job and load the same process from a sequential file into MultiLoad. The problem only appears when I run a parallel job using MultiLoad.

I even tried reading the environment variables in a sequencer and passing them into the parallel job as normal parameters, but still no luck. The whole point of setting up the environment variables is that I want to pass them as $PROJDEF and change them in one place whenever I need to make changes.

I think this is a common problem that comes up in several postings, but I couldn't find a solution for it anywhere.
This is what the MultiLoad log says:

************************************
0001 .logtable LOGAC_TEST2;
0002 .logon #DSCAPIOP__TdServer1#/#DSCAPIOP__TdBatchUser1#,;
**** 11:34:33 UTY1006 CLI error: 303, CLI2: BADLOGON(303): Invalid logon string.
**** 11:34:33 UTY2410 Total processor time used = '0.02 Seconds'
. Start : 11:34:33 - FRI SEP 21, 2007
. End : 11:34:33 - FRI SEP 21, 2007
. Highest return code encountered = '12'.
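For reference, here is a rough sketch of what those first two lines should look like once the parameters actually resolve (the server, user and password below are placeholders, not my real values):

0001 .logtable LOGAC_TEST2;
0002 .logon tdprod_server/batch_user,batch_password;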


Any help would be greatly appreciated.
Thanks
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Welcome aboard.

Can you please describe where you are using the $PROJDEF token?

It should be used only in the job parameters grid, and only as the default value of a job parameter added as an environment variable. Specifically, you do not set the value of the environment variable itself to $PROJDEF; only the job parameter's default value uses the token. Can you please verify that this is how your design looks?
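As a rough sketch (using the parameter names from your first post; the layout below is only illustrative), the relevant rows of the job parameters grid should look something like this:

Parameter name             Type                     Default Value
$DSCAPIOP_TdServer1        (environment variable)   $PROJDEF
$DSCAPIOP_TdBatchUser1     (environment variable)   $PROJDEF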
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
v2kmadhav
Premium Member
Premium Member
Posts: 78
Joined: Fri May 26, 2006 7:31 am
Location: London

Teradata issue

Post by v2kmadhav »

Yes, my main intention is to pass $PROJDEF as the parameter default that brings in my project value, but the initial issue is that the MultiLoad stage doesn't seem to understand what the $ stands for.
When I pass it on in the same way to a MultiLoad in a server job, it understands it; maybe there are internal coding discrepancies.

NOW: I am calling the same job from a sequencer and passing those values into differently named parameters of the job, and the parallel job using MultiLoad works perfectly.

I would just like to know whether this is a bug and whether this is the only possible workaround, or if someone can suggest something else where I need to establish a relationship between server and parallel coding.

I had a similar issue a while ago.
For example, I had a job control routine with a default SQL statement that it passes on to a server job as a parameter. When I made the same job control pass the same code as a parameter to a parallel job, it never picked up 'yyyy-mm-dd' for the date format strings; the job had problems with the SQL because it was reading only yyyy-mm-dd, without the quotes. Then I put an escape character in front of the single quotes, \'yyyy-mm-dd\', and it started to work.
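To illustrate (the column name here is made up, not from my real SQL), the value I pass as the parameter now reads roughly like this:

fails in the parallel job:   WHERE load_date >= 'yyyy-mm-dd'
works once escaped:          WHERE load_date >= \'yyyy-mm-dd\'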

So I guess I am missing something basic, but I am not sure how to reach a conclusion when using server and parallel logic together.

Your help would be greatly appreciated.
Thanks in advance
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

You are missing the point. You cannot pass $PROJDEF as a job parameter. It is a special token (as are $ENV and $UNSET) that can be used only as the default value of an otherwise-named environment variable job parameter.

For example, you can add the environment variable $APT_CONFIG_FILE as a job parameter and set its default value to $PROJDEF, which means that, unless overridden at run time, the environment variable will use the value set for it in the Administrator client for the project in question.
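As a sketch of what that gives you at run time (the project and job names here are placeholders), you leave the default alone and only override the value when you need to, for example from the dsjob command line:

dsjob -run -jobstatus MyProject MyParallelJob
(runs with $APT_CONFIG_FILE taken from the project default, i.e. $PROJDEF)

dsjob -run -param '$APT_CONFIG_FILE=/opt/configs/4node.apt' -jobstatus MyProject MyParallelJob
(overrides the value for this run only)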
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
v2kmadhav
Premium Member
Posts: 78
Joined: Fri May 26, 2006 7:31 am
Location: London

Post by v2kmadhav »

Hey Ray,

But when I pass the same parameter to the same job, replacing MultiLoad with the EE stage, and specify $PROJDEF at runtime, it works fine.

When I create a server job that writes an empty file into a table using MultiLoad and pass $PROJDEF at runtime, it works fine.

The problem is only when I pass the value to a MultiLoad stage in a parallel job.

Now I am passing the same $PROJDEF at runtime from a sequencer to the job within, and that works too.

My client is not happy with my workaround and wants me to find a solution so those parameters can be passed directly.

Apart from all this, the only reason we are doing this is that we want one point of control for the parameters across all jobs. Please suggest any other idea if it is easier to implement.

Please help me.
Thanks.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

v2kmadhav wrote: The problem is only when I pass the value to a MultiLoad stage in a parallel job.
If you believe you've found a bug, the only way to A) truly verify that and B) get a proper fix is to C) report it to your official Support provider.
-craig

"You can never have too many knives" -- Logan Nine Fingers
JoshGeorge
Participant
Posts: 612
Joined: Thu May 03, 2007 4:59 am
Location: Melbourne

Post by JoshGeorge »

It is a potential bug, and I believe there is a fix available from the support provider for it. I had a similar issue in one of my previous projects and IBM responded with a fix. But I found the workaround you describe (passing the parameter from a sequence) elegant and efficient for the requirement you have stated. Passing all the parameters from a main sequence gives you one point of control, one level below the global environment control in the DataStage Administrator. From a maintenance/redesign point of view, if tomorrow the requirement changes so that some or all parameters are no longer passed as global parameters from DSParams, your rework is drastically reduced; you just have to make the changes in the main sequence.
v2kmadhav wrote: Now I am passing the same $PROJDEF at runtime from a sequencer to the job within, and that works too.

Apart from all this, the only reason we are doing this is that we want one point of control for the parameters across all jobs.


You had a quoted entry inside an outer quote wrap, right? Then an escape character is the way to go.
I had a job control routine with a default SQL statement that it passes on to a server job as a parameter. When I made the same job control pass the same code as a parameter to a parallel job, it never picked up 'yyyy-mm-dd' for the date format strings; the job had problems with the SQL because it was reading only yyyy-mm-dd, without the quotes. Then I put an escape character in front of the single quotes, \'yyyy-mm-dd\', and it started to work.
Joshy George
http://www.linkedin.com/in/joshygeorge1
v2kmadhav
Premium Member
Posts: 78
Joined: Fri May 26, 2006 7:31 am
Location: London

Post by v2kmadhav »

Guys,

Thanks for your support. I am going to raise a call with IBM on this, and hopefully it is a bug we can find a patch for. For now I am using the sequencer. I will post IBM's response once they suggest an answer, so that it is helpful for others.

Until then, thanks again.
Cheers