Hi
I have 2 questions for everyone.
1. I am trying to join data coming from an Oracle table with another Oracle table, using the hash file and Transformer stages. Is the key column in the hash file like a primary key in the database, or is it used only to join against columns from a different table in the Transformer stage? What if the key column has the same value across all records - what does the key column in the hash file do in that case?
oracle
  |
  V
hash file
  |
  V
Transformer (join here) -----> oracle
  ^
  |
oracle
2. I am using user-defined SQL in the Oracle stage to get my values. I need SYSDATE to be populated into a target table, and I am having a lot of difficulty converting and loading SYSDATE, plus another field holding a date in dd/mm/yyyy format, into Oracle format. What SQL data type would you use in the Oracle stage to convert SYSDATE into a format that can be loaded through another Oracle stage? I have tried different combinations of data types (Char, Date, Timestamp) and SQL functions like TO_CHAR and TO_DATE to make this work, but the database is still rejecting my records.
In what cases would you use Date, Timestamp, Char or Time?
Please inform
Thanks
Babu
Key value for hash file
-
- Participant
- Posts: 3337
- Joined: Mon Jan 17, 2005 4:49 am
- Location: United Kingdom
1.) In simple terms, a hash file is an "indexed sequential file with a key and its associated attributes, with the record-overwrite property switched on".
So if you insert 100 rows with the same key, only the last one will exist in the hash file.
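That last-write-wins behaviour can be sketched with an ordinary Python dict (just an illustration of the key semantics, not how DataStage stores the file - the keys and values below are made up):

```python
# Sketch: a hash file acts like a keyed store where a repeated
# key overwrites the earlier record (last write wins).
hash_file = {}

rows = [
    ("K1", "first"),
    ("K1", "second"),
    ("K1", "third"),   # same key written three times
    ("K2", "only"),
]

for key, value in rows:
    hash_file[key] = value   # a later row replaces an earlier one

print(hash_file)   # only the last "K1" row survives
```

So if every record carries the same key, the hash file ends up holding exactly one record, and a reference lookup in the Transformer will always return that one record.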
A hash file can be created on the fly simply by using it on your canvas: DataStage will create it at runtime if it is not already present and the create option is switched on. Alternatively, you can work out the sizing and structure your hash file needs and create it before use.
I will not go into too much detail on this for you at this time.
2.) A 'generated query' from the Oracle stage embeds a TO_CHAR function automatically. If you are using user-defined SQL, you may have to do this yourself: convert your dates into strings in the source SQL, then convert them back with TO_DATE on the target. To check the data, you can write it from intermediate Transformer stages into sequential files.
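As a rough sketch of that pattern (table and column names here are hypothetical, and the format mask is just one choice - the two masks must match):

```sql
-- Source Oracle stage, user-defined SQL: dates go out as strings
SELECT empno,
       TO_CHAR(hire_date, 'YYYY-MM-DD HH24:MI:SS') AS hire_date,
       TO_CHAR(SYSDATE,   'YYYY-MM-DD HH24:MI:SS') AS load_date
FROM   emp;

-- Target Oracle stage, user-defined SQL: strings come back as DATEs
INSERT INTO emp_hist (empno, hire_date, load_date)
VALUES (:1,
        TO_DATE(:2, 'YYYY-MM-DD HH24:MI:SS'),
        TO_DATE(:3, 'YYYY-MM-DD HH24:MI:SS'));
```

For a dd/mm/yyyy source field the mask would be 'DD/MM/YYYY' instead; a mismatch between the string and the TO_DATE mask is the usual reason the database rejects the rows.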
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
If the Oracle column is a date column, you can just include the word SYSDATE in the SQL where the value for that column is being provided.
No need, in that case, to deliver a value from DataStage, and therefore no need to worry about conversion and so on. For example:

INSERT INTO tablename (col1, col2, col3, col4, col5, col6)
VALUES (:1, :2, SYSDATE, :3, :4, :5);

Note that there are six columns in the row being inserted, but only five parameter markers in the INSERT statement, so only five columns are delivered from DataStage.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.