Teradata Enterprise with unknown destination columns
Moderators: chulett, rschirm, roy
Hello,
I have some trouble with a job using the Teradata Enterprise stage.
I have x columns in input and I want to load them into a destination table that has x + n columns (where the n extra columns are technical columns, unknown to DataStage; these columns may evolve).
How can I load my data using the destination table's default values, without declaring those columns in my job?
The stage uses a temporary orch_work table, created automatically by the Teradata Enterprise stage with this SQL:
CREATE TABLE ORCH_WORK_10af1daa AS ( SELECT * FROM mytable ) WITH NO DATA PRIMARY INDEX ( col1)
It would be great if DataStage created the orch_work table with:
CREATE TABLE ORCH_WORK_10af1daa AS mytable WITH NO DATA PRIMARY INDEX ( col1)
This second form keeps the default values.
Thanks for your answers.
If you want default values, then set them up at the database level and don't pass that column during the insert. The database will populate that column for every inserted record. This is commonly done for a current-timestamp column, which the database generates on its own if no value is passed. Is that what you want?
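That behaviour can be sketched in miniature, using SQLite through Python's sqlite3 module as a stand-in for Teradata (the table and column names here are invented for illustration; Teradata DDL syntax differs in detail, but the principle is the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A target table with two "technical" columns that carry defaults.
cur.execute("""
    CREATE TABLE dest (
        a INTEGER,
        b INTEGER,
        d INTEGER NOT NULL DEFAULT 0,
        e TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
    )
""")

# Omit d and e from the column list: the database fills in the defaults.
cur.execute("INSERT INTO dest (a, b) VALUES (1, 2)")

row = cur.execute("SELECT a, b, d, e FROM dest").fetchone()
print(row)  # d is 0, e is a generated timestamp
```

The key point is that a default only applies when the column is left out of the INSERT entirely; explicitly passing NULL is not the same thing.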
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
Thanks for your answer. I have already set up default values, and the problem is that I do not pass those columns in the insert.
The stage works like this.
My file's columns are a, b, c.
My destination table's columns are a, b, c, d, e, with default values such as 0 or current_timestamp.
step a:
At run time the stage creates an orch_work table with:
Code:
CREATE TABLE ORCH_WORK_10af1daa AS ( SELECT * FROM mytable ) WITH NO DATA PRIMARY INDEX ( col1)
The orch_work table then has columns a, b, c, d, e but no default values (verified with qrylog).
step b:
The data file is loaded into orch_work with values a, b, c, null, null in columns (a, b, c, d, e), because there are no default values.
step c:
The stage inserts the data into the destination table with:
Code:
INSERT INTO my_table SELECT * FROM ORCH_WORK_10af1daa
This step fails because of the NOT NULL constraint on my destination table. Without the NOT NULL constraint, NULL is inserted into my destination table and the default value is still not used.
Steps b and c are automatic.
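The failure described in steps a to c can be reproduced in miniature (again using SQLite via Python's sqlite3 as a stand-in for Teradata; Teradata's CREATE TABLE ... AS ( SELECT ... ) WITH NO DATA is emulated here with CREATE TABLE ... AS SELECT ... WHERE 0, which likewise copies the column list but drops default clauses):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("""
    CREATE TABLE dest (
        a INTEGER,
        b INTEGER,
        c INTEGER,
        d INTEGER NOT NULL DEFAULT 0
    )
""")

# Step a: a work table built from a SELECT inherits the columns
# but not the DEFAULT (or NOT NULL) clauses.
cur.execute("CREATE TABLE orch_work AS SELECT * FROM dest WHERE 0")

# Step b: the file only supplies a, b, c, so d ends up NULL in the work table.
cur.execute("INSERT INTO orch_work (a, b, c) VALUES (1, 2, 3)")

# Step c: the blind INSERT ... SELECT * passes d's NULL explicitly,
# so the default never applies and the NOT NULL constraint fires.
try:
    cur.execute("INSERT INTO dest SELECT * FROM orch_work")
    failed = False
except sqlite3.IntegrityError as exc:
    failed = True
    print(exc)
```

This mirrors the job's behaviour: the NULL travels through the work table as an explicit value, so the destination's default is never consulted.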
Well, in your case the default values would need to be set in the work table when it is created, since the data reaches your actual table by way of the work table.
Why are you using a work table in between? From what I can see, you are creating a work table and loading your dataset into it, just to unload it later into your actual table. Get rid of the work table. You don't need it.
Hmm, don't send those columns in at all. I mean, remove them from the Columns tab. That way they will not be present in the work table, so no value will be sent to your actual table, which will in turn populate the default values.
Now see, it should work even then. If the columns don't exist in the metadata, they will not be created in the orch table and hence not passed to the actual table.
I don't know why it's not working. What happens when you do an INSERT from your favourite SQL tool? Do the default values get generated?
In fact, the stage (I don't know why) automatically creates the temporary orch_work table from the destination table instead of using the metadata.
This creation is shown in the SQL code above.
When I manually insert data with SQL tools, without specifying the hidden columns, the default values are used.
Maybe the answer is in configuring the stage to use the metadata instead of the destination table, but I don't know which parameters to change.
Here's a sample of my log for information:
Code:
LOG REPORT FOR JOB: JxTTCHG_ROVoix_V2.7100004
8098 STARTED Thu Feb 8 17:28:13 2007
Starting Job JxTTCHG_ROVoix_V2.7100004. (...)
8099 INFO Thu Feb 8 17:28:13 2007
Environment variable settings: (...)
8100 INFO Thu Feb 8 17:28:15 2007
Parallel job initiated (...)
8101 INFO Thu Feb 8 17:28:15 2007
Advanced runtime options used: -nopartinsertion -nosortinsertion
8102 INFO Thu Feb 8 17:28:15 2007
main_program: Ascential DataStage(tm) Enterprise Edition 7.5.0.1 (...)
8103 INFO Thu Feb 8 17:28:16 2007
main_program: orchgeneral: loaded (...)
8104 INFO Thu Feb 8 17:28:17 2007
main_program: APT configuration file: /product/datastage/Ascential/DataStage/Configurations/sumo.apt (...)
8105 INFO Thu Feb 8 17:28:31 2007
Ci_ImportSchema,0: Field import complete. 0 records converted successfully, 0 rejected.
8106 INFO Thu Feb 8 17:28:31 2007
Sf_Source,0: Import complete. 3 records imported successfully, 0 rejected.
8107 INFO Thu Feb 8 17:28:31 2007
Sf_Rejet_Decodage,0: Export complete. 0 records exported successfully, 0 rejected.
8108 INFO Thu Feb 8 17:28:31 2007
Ci_ImportSchema,1: Field import complete. 1 records converted successfully, 0 rejected.
8109 INFO Thu Feb 8 17:28:31 2007
Sf_Rejet_Ci_ImportSchema,0: Export complete. 0 records exported successfully, 0 rejected.
8110 INFO Thu Feb 8 17:28:45 2007
APT_CombinedOperatorController(1),1: 1 rows written to reject link: Ln_StockageMarginal
8111 INFO Thu Feb 8 17:28:45 2007
TER_StockageReg,0: 0 total records processed.
8112 INFO Thu Feb 8 17:28:48 2007
TER_StockageMarginal,0: 1 total records processed.
8113 INFO Thu Feb 8 17:28:48 2007
TER_StockageMarginal,1: 0 total records processed.
8114 INFO Thu Feb 8 17:28:55 2007
TER_StockageReg,1: 0 total records processed.
8115 INFO Thu Feb 8 17:28:55 2007
TER_StockageMarginal,0: Orchestrate Teradata Write Operator Summary (...)
8116 INFO Thu Feb 8 17:29:02 2007
TER_StockageReg,0: Orchestrate Teradata Write Operator Summary (...)
8117 INFO Thu Feb 8 17:29:07 2007
TER_StockageReg: Orchestrate Teradata Write Operator Summary (...)
8118 FATAL Thu Feb 8 17:29:07 2007
TER_StockageMarginal: TeraUtils:DB Call Failure(success check) Info = 0, Code = 3604, Message = Cannot place a null value in a NOT NULL field.
8119 FATAL Thu Feb 8 17:29:07 2007
TER_StockageMarginal: DBCAREA msg_txt: Completed successfully.
8120 FATAL Thu Feb 8 17:29:07 2007
TER_StockageMarginal: TeraUtils:ExecuteImmediate failed for 'INSERT INTO PUITS_DEV.VROSRR_EXCP SELECT * FROM ORCH_WORK_6f700073'. Error Code = 3604
8121 FATAL Thu Feb 8 17:29:07 2007
TER_StockageMarginal: DBCAREA msg_txt: Completed successfully.
8122 FATAL Thu Feb 8 17:29:07 2007
TER_StockageMarginal: execDirect failed for "INSERT INTO PUITS_DEV.VROSRR_EXCP SELECT * FROM ORCH_WORK_6f700073" (...)
8123 INFO Thu Feb 8 17:29:07 2007
TER_StockageMarginal: Orchestrate Teradata Write Operator Summary (...)
8124 INFO Thu Feb 8 17:29:07 2007
main_program: Step execution finished with status = OK.
8125 INFO Thu Feb 8 17:29:09 2007
main_program: Startup Time 0:26 Production Run Time 0:24
8126 INFO Thu Feb 8 17:29:09 2007
Parallel job reports successful completion
8127 STARTED Thu Feb 8 17:29:09 2007
Finished Job JxTTCHG_ROVoix_V2.7100004.
8128 BATCH Thu Feb 8 17:29:09 2007
(JsTTCHG_ROVoix_V2) <- JxTTCHG_ROVoix_V2.7100004: Job under control finished.
8129 RESET Thu Feb 8 17:29:09 2007
Log prior to 2007-02-08 cleared by user
8130 RESET Thu Feb 8 17:29:09 2007
Auto-purged 85 log entries.
Maybe we are jumping ahead here. From the columns you are specifying, are you sending any NULLs to a NOT NULL column? Maybe it's complaining about that. Check the metadata in your dataset and in your target Teradata stage.
Well, I am running out of ideas now. Build a simple server job and try to do the same with the Teradata API stage. If you get the same problem there as well, then get in touch with support and see what they have to say.
Thanks for your time.
We have opened a case with support.
For now we think there's a bug in the "Select list" property: when we select col1 in this list, the insert request is
Code:
INSERT INTO PUITS_DEV.VROSRR_EXCP(col1) SELECT * FROM ORCH_WORK_6f700073
but it should be
Code:
INSERT INTO PUITS_DEV.VROSRR_EXCP(col1) SELECT col1 FROM ORCH_WORK_6f700073
Let's see what support says about it.
I'll post the solution here if there is one.
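If the stage really does generate INSERT INTO ... (col1) SELECT * against a multi-column work table, the mismatch is easy to demonstrate (once more using SQLite via Python's sqlite3 as a stand-in for Teradata; the table names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dest (col1 INTEGER)")
cur.execute("CREATE TABLE orch_work (col1 INTEGER, col2 INTEGER)")
cur.execute("INSERT INTO orch_work VALUES (1, 2)")

# The suspected buggy statement: one target column named,
# but SELECT * yields two values per row.
try:
    cur.execute("INSERT INTO dest (col1) SELECT * FROM orch_work")
    star_ok = True
except sqlite3.OperationalError:
    star_ok = False

# The corrected statement names the column explicitly and succeeds.
cur.execute("INSERT INTO dest (col1) SELECT col1 FROM orch_work")
print(star_ok, cur.execute("SELECT col1 FROM dest").fetchall())
```

Any SQL engine rejects a column-count mismatch like this, which is consistent with the suspicion that the generated SELECT list, not the data, is at fault.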