Passing parameters for columns
Hi,
I have 50-60 different code tables that all share the same structure: each has three columns with identical data types and lengths, but the column names differ from table to table.
We have DataStage version 6.0.0.21 on UNIX. Our target is Teradata.
If I could pass parameters for the table name and the column names, I could load all of my code tables with one single job (running it in multiple-instance mode). I can pass a parameter for the table name, but I get an error when I try to pass parameters for the column names.
Is it possible to pass parameters for the column names? Has anyone faced a similar situation before?
Any help will be appreciated.
Thanks and regards.
Biswajit
Have a play with custom SQL in your database stage. You should be able to modify the generated SQL statement to use job parameters for the column names, so that the DataStage column names are hard-coded but the database column names are dynamic.
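As a sketch only (the #...# job parameter names here are invented for illustration, not taken from the original job): DataStage substitutes #ParamName# references in user-defined SQL before the statement reaches the database, so the statement could carry the column names as parameters while the link columns stay fixed, something like:

```sql
-- Hypothetical job parameters: #TargetTable#, #CodeCol#, #DescCol#, #DateCol#.
-- The ? markers bind positionally to the (hard-coded) link columns.
INSERT INTO #TargetTable# (#CodeCol#, #DescCol#, #DateCol#)
VALUES (?, ?, ?)
```

Whether the particular stage type accepts this is untested; it depends on the stage allowing fully user-defined SQL.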
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn:Vincent McBurney LinkedIn
Nothing in DataStage. Can you create views with the "correct" column names for the three errant tables in Teradata, and access the views rather than the tables from DataStage?
```sql
CREATE VIEW GoodView (col1, col2, col3, col4, col5) AS
SELECT col1, col2, badcol3, col4, col5 FROM tablename;
```
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
My bad. I suggested you use user-defined SQL without testing whether it could be done. It's a pity it doesn't work.
You could try writing to bulk-load files and dynamically generating a different bulk-load control script for each lookup table using a routine. The .dat file would have the same structure each time; only the control file would change.
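As a rough, untested illustration of that idea (all table, column, and file names below are invented for the example), the routine would emit a MultiLoad control script along these lines for each code table, with only the target table and field names changing between runs while the .dat layout stays constant:

```
.LOGTABLE  #Database#.ML_#TableName#_LOG;
.LOGON     tdpid/#User#,#Password#;
.BEGIN IMPORT MLOAD TABLES #Database#.#TableName#;
.LAYOUT CODELAY;
  .FIELD in_code * VARCHAR(10);
  .FIELD in_desc * VARCHAR(50);
  .FIELD in_date * VARCHAR(19);
.DML LABEL INSCODE;
  INSERT INTO #Database#.#TableName#
  VALUES (:in_code, :in_desc, :in_date);
.IMPORT INFILE codes.dat LAYOUT CODELAY APPLY INSCODE;
.END MLOAD;
.LOGOFF;
```

The positional INSERT (no column list) is what makes the script reusable: the target column names never appear, so only the #...# placeholders need substituting per table.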
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn:Vincent McBurney LinkedIn
If all tables have the EXACT same columns - ordinally, data type, and length - but just a different NAME for each column...
Load a representative set of metadata and try user-defined SQL doing a "SELECT * FROM #tablename#". I don't think you're going to need to name the columns - user-defined SQL can be anything you want, as long as the result set returns the right number of columns with the right data types.
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Guys, I think the target here is Teradata, and the stage type is the bulk loader. So column names are needed, because they are used to create the control script. That's why I suggested that the single-table (and therefore updatable) view might work. (Can you bulk load a view? Never tried.)
Another thought: does Teradata support synonyms for columns? That would definitely solve the original problem.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Hi.
Having thought about this a little more, it now seems easier than I first thought. I have only used this method with the "manual" load, however, not with a direct insert, but I can't see why it wouldn't work.
As you are already using the "custom" script, you are probably aware of the structure of the MultiLoad control script: the first section has database and account information, the target table name, the error tables AND the names of the columns being passed from DataStage; the second section (from the line that begins ".dml") is the SQL insert text; and finally come the data-loading commands.
The first section can't be edited in DataStage (though it can in UNIX), and at the moment I don't think you need to alter it.
In the second part, which is the part you see in the "custom" tab, you can enter whatever you want (as long as it is valid). By default DataStage will use the column names that exist on the link to the stage, e.g. COL1, COL2. So the custom part of the script will be something like:

```sql
.dml <...>
insert into dbt_test.t_target (
    first_name,
    last_name
) values ( col1, col2 );
```

where dbt_test and t_target are replaced by parameters that you pass in.
Try changing the SQL to read something like:

```sql
insert into dbt_test.t_target values ( col1, col2 );
```

and then you should be able to use this in multiple instances.
Hope this helps.
Just an FYI... you can go back and edit your post to correct spelling or other contextual errors rather than post a correction. Trust me, it comes in handy. You'll see it as an option at the top of the posts that are yours, along with the other options like 'Reply with quote'.
-craig
"You can never have too many knives" -- Logan Nine Fingers