
Number of lines restriction

Posted: Mon Feb 19, 2007 9:28 am
by Mun
Hi, is there a limit on the number of lines for Before/After SQL or User-Defined SQL? I have a 600-line SQL script that I need to put in either the Before, After, or User-Defined SQL section. The reason I need to put it in an ETL job is that we cannot promote a SQL script (political issues).

When I put the 600 lines of SQL in the Before/After section, only half would fit and the other half got chopped off.

What other approaches can I use?

Posted: Mon Feb 19, 2007 11:49 am
by Krazykoolrohit
600 lines of SQL.
I wonder why they can't be implemented using a job.

Posted: Mon Feb 19, 2007 11:56 am
by Mun
Krazykoolrohit wrote:600 lines of SQL.
I wonder why they can't be implemented using a job.
---

Hi, what do you mean by implemented using a job?

Posted: Mon Feb 19, 2007 12:00 pm
by Krazykoolrohit
What is your SQL trying to do?

DataStage jobs are made to break SQL out into separate data flows. If you look at it crudely, everything that can be done in DataStage can be done in SQL, so in theory we wouldn't need DataStage.

But the crux is that DataStage does it better, faster, and more easily, with optimum results. That's why we do it in DataStage.

So you can join two tables, or aggregate; basically anything SQL can do.

Posted: Mon Feb 19, 2007 12:30 pm
by DSguru2B
You can also use a before-job ExecDOS (OS command) subroutine to invoke your database's SQL utility, connect to your database, and execute your script. That is another way.
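For example (a hedged sketch: `sqlplus` is assumed as the client and the user/connection/path values are placeholders; substitute your own database's command-line utility), a small wrapper invoked from the before-job call could look like this:

```python
import subprocess

def build_sql_client_command(user, password, tns, script_path):
    """Assemble the command line a before-job ExecDOS/ExecSH call would run.
    sqlplus is an assumption here; swap in db2, isql, etc. as needed."""
    return ["sqlplus", "-s", f"{user}/{password}@{tns}", f"@{script_path}"]

def run_sql_script(user, password, tns, script_path):
    """Execute the script; raise if the client exits with a non-zero code."""
    cmd = build_sql_client_command(user, password, tns, script_path)
    subprocess.run(cmd, check=True)
```

Failing the job on a non-zero exit code (via `check=True`) matters here, since a silently failed script would leave the target table unbuilt.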

Posted: Mon Feb 19, 2007 12:32 pm
by narasimha
Everything has to have a limit; it looks like you have reached the limit on the number of characters allowed in that particular stage.
You could have wrapped the SQL script into a file and run it as a before/after script, but that is ruled out in your case, right?
Is loading the script into a file and using "Load SQL from a file at run time" an option?
Can you provide details on which database/stage you want to use this SQL script in?
What are you doing in these SQL statements?
Maybe you can implement the same logic in a job.

Posted: Mon Feb 19, 2007 12:50 pm
by Mun
narasimha wrote:Everything has to have a limit; it looks like you have reached the limit on the number of characters allowed in that particular stage.
You could have wrapped the SQL script into a file and run it as a before/after script, but that is ruled out in your case, right?
Is loading the script into a file and using "Load SQL from a file at run time" an option?
Can you provide details on which database/stage you want to use this SQL script in?
What are you doing in these SQL statements?
Maybe you can implement the same logic in a job.
----------------------------------------------------------------------------


There are about 500 lines of insert statements. A business user created this SQL script file and ran it successfully. They want to promote the script file into production, but the admin created a rule disallowing SQL scripts, so the quickest way to get the script into production is to put it in an ETL job and promote that. The script will create a table that the cube developer can use without much modification on his side.

I hope I am able to explain this a little clearer.

Posted: Mon Feb 19, 2007 12:56 pm
by narasimha
Extract the data from your insert script and put it in a file.
Design a job to read this file as input and insert into whatever database you want!
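One way to do that extraction (a hedged Python sketch, assuming the script is a series of simple single-row `INSERT ... VALUES (...)` statements with no commas inside quoted values; a real script may need a proper SQL parser):

```python
import csv
import re

# Matches simple single-row statements like:
#   INSERT INTO my_table VALUES ('a', 1, 'b');
INSERT_RE = re.compile(r"INSERT\s+INTO\s+\S+\s+VALUES\s*\((.*)\)\s*;",
                       re.IGNORECASE)

def inserts_to_rows(script_text):
    """Pull the VALUES lists out of a script of simple INSERT statements.
    Naive comma split -- assumes no commas inside quoted values."""
    rows = []
    for match in INSERT_RE.finditer(script_text):
        values = [v.strip().strip("'") for v in match.group(1).split(",")]
        rows.append(values)
    return rows

def write_seq_file(script_text, path):
    """Write the extracted rows to a delimited file that a Sequential File
    stage can then read as the job's input."""
    with open(path, "w", newline="") as fh:
        csv.writer(fh).writerows(inserts_to_rows(script_text))
```

The resulting delimited file becomes ordinary job data, so nothing promoted to production is a SQL script.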

Posted: Mon Feb 19, 2007 1:32 pm
by narasimha
Ok, if your issue is that you cannot move a file into production, break your insert script into two and run your job twice.
If breaking the insert statements into two does not help, break them further and run the job that many times.
I feel your 'Admin' should be more flexible about such things; explain the situation to him and he might come up with a workaround. :wink:

Posted: Mon Feb 19, 2007 1:53 pm
by Mun
narasimha wrote:Ok, if your issue is that you cannot move a file into production, break your insert script into two and run your job twice.
If breaking the insert statements into two does not help, break them further and run the job that many times.
I feel your 'Admin' should be more flexible about such things; explain the situation to him and he might come up with a workaround. :wink:
--------------------------

It looks like I can break the SQL into two smaller chunks: one loads the first 50 columns and the other loads the remaining 50 columns. How can I merge them into one table?

Posted: Mon Feb 19, 2007 1:55 pm
by ray.wurlod
Inserts on the first pass, updates on the second pass.
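That two-pass idea can be sketched like this (hedged: the table and column names are made up, and an in-memory sqlite database stands in for the real target; in DataStage you'd get the same effect by setting the first job's update action to insert and the second's to update, keyed on the same column):

```python
import sqlite3

# In-memory database stands in for the real target table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE target (id INTEGER PRIMARY KEY, col_a TEXT, col_b TEXT)")

# Pass 1: insert rows carrying the key plus the first chunk of columns.
conn.executemany("INSERT INTO target (id, col_a) VALUES (?, ?)",
                 [(1, "a1"), (2, "a2")])

# Pass 2: update the same rows (matched on the key) with the second chunk.
conn.executemany("UPDATE target SET col_b = ? WHERE id = ?",
                 [("b1", 1), ("b2", 2)])

rows = conn.execute(
    "SELECT id, col_a, col_b FROM target ORDER BY id").fetchall()
# rows is now [(1, 'a1', 'b1'), (2, 'a2', 'b2')]
```

The key column must be populated on the first pass so the second pass has something to match on.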