Pass multiple lines or rows as parameter

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.

satisht
Participant
Posts: 15
Joined: Tue Nov 07, 2006 4:41 pm

Pass multiple lines or rows as parameter

Post by satisht »

Hi,

We have a requirement to pass a value as a parameter for the Before/After SQL in the Oracle Connector. This value can be a DML statement or a procedure/package call, and these statements are not limited to one table or procedure. The statement generally spans multiple lines, so I have to collapse it into a single line before it can be passed as a parameter.

Is there a way to pass multiple lines or a block as a parameter?

Thanks in advance.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Any form of delimited string.

If you think about it, that's all a file is, with line terminator as the delimiter.

Of course, the Before/After subroutine will need to unpack the delimited string, even if you persevere with the line terminator as the delimiter.
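
For instance, a minimal ksh sketch of packing a multi-line statement into a single-line parameter value and unpacking it again (the file names and the choice of '|' as the delimiter are assumptions; pick any character that never appears in your SQL, since '|' would clash with || concatenation):

# Pack: turn newlines into '|' so the whole statement fits on one line
paramValue=$(tr '\n' '|' < before_sql.txt)

# Unpack (in the before/after routine or a wrapper script): restore the newlines
echo "$paramValue" | tr '|' '\n' > /tmp/before_sql_unpacked.sql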
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
satisht
Participant
Posts: 15
Joined: Tue Nov 07, 2006 4:41 pm

Post by satisht »

Thanks for the reply, Ray. I appreciate the idea of reading from a file with \n as the delimiter.

First of all, let me be clearer about the purpose of this job. Sorry for not being so earlier.

This job is run on ad hoc requests in the production environment. Most of the time I have not had a problem; I simply collapse the parameter value into a single line. But yesterday I had to run 4,000 insert statements that had been prepared manually. The data is not in any table, and when I put all of those insert statements onto one line, it came to about 2 million characters in length.

The first problem was:

When I put the entire set of insert statements as a single string in the parameter field, a warning popped up saying that the maximum allowed is 65,519 bytes.

Is there an environment variable or other setting by which this limit can be increased?

The second thing is:

To read from a file with \n as the delimiter, the file itself would have to be migrated from DEV to PROD. We already had more than a million records to process that day, and the migration involves the admin team, request forms and approvals.

So I was looking for a way to pass multiple lines as a parameter in one go. I am still wondering why IBM has not provided this.

Once again, thanks for the reply.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

You can, but only up to a limit of about 64KiB, as you have discovered.
There is no environment variable for changing this limit, at least not one of which I'm aware. You might find one from your official support provider.

You may need to contemplate a different approach - perhaps an External Source or Row Generator stage to generate rows which a preliminary job can insert for you.
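
To illustrate the External Source route, here is a minimal sketch of the program such a stage could call (the file name and the pipe-delimited layout are assumptions about how the 4,000 records might be prepared; the stage simply turns whatever the program writes to standard output into rows for a preliminary load job):

#!/bin/ksh
# Source program for an External Source stage: one delimited record per output line.
# /tmp/adhoc_rows.txt is a hypothetical pipe-delimited file of the manually prepared data.
grep -v '^[[:space:]]*$' /tmp/adhoc_rows.txt    # drop blank lines, emit the rest unchanged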
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
kduke
Charter Member
Posts: 5227
Joined: Thu May 29, 2003 9:47 am
Location: Dallas, TX
Contact:

Post by kduke »

There is a job in EtlStats which loads the SQL into a hashed file and then feeds the parameters in from job control. It works really well. It is a server job, but it can feed parameters with linefeeds into PX jobs.
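
The same idea can be tried from the command line with dsjob. A minimal sketch, assuming a project MyProject, a parallel job LoadJob with a parameter named BeforeSQL, and a file /tmp/before_sql.txt holding the multi-line statement (all of those names are assumptions):

#!/bin/ksh
# Read the multi-line SQL into a variable and hand it to the job as a parameter.
sqlText=$(cat /tmp/before_sql.txt)
dsjob -run -param "BeforeSQL=$sqlText" -jobstatus MyProject LoadJob

Whether the embedded linefeeds survive the shell quoting on your platform is worth a quick test.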
Mamu Kim
ssnegi
Participant
Posts: 138
Joined: Thu Nov 15, 2007 4:17 am
Location: Sydney, Australia

Post by ssnegi »

Write the parameter line to a file param_file.txt and, in the before/after step, invoke a UNIX script InvokeSQL.ksh:

# Read the whole parameter file into one variable
paramVar=$(cat param_file.txt)
# Quote the value so it reaches SQL*Plus as a single argument (SQL*Plus exposes it to sql_file.sql as &1)
sqlplus -s userid/password@db @sql_file.sql "$paramVar"
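
If the intent is to launch this from the job rather than from the connector's Before/After SQL property (which only accepts SQL), the built-in ExecSH before/after job subroutine can run the script, with the path to InvokeSQL.ksh supplied as its input value.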