Hi,
I have an ODBC stage (Oracle DSN) for sourcing data. When I write to a target Sequential File stage, the following happens:
1. The job aborts.
2. It sources some records.
3. It gives me the following error:
src_ORA_Desc..Seq_Trg_Desc.Outto_Desc: read_delimited() - row 440, column CL_CORE_CD, required column missing.
4. The target file contains the records read so far.
I analysed the target and reached the following conclusions:
1. Whichever rows DataStage reads properly, it writes to the target.
2. If it cannot read a complete row (for example, it reads five columns and only part of the sixth), it aborts.
Any input on what the reason could be would be most appreciated.
Thanks in advance.
Regards,
Bala.
read_delimited() in ODBC stage
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Apparently not; it can also be thrown reading a stream of data.
Reading the entire error message, we see that in the job src_ORA_Desc (no Invocation ID) the stage Seq_Trg_Desc and link Outto_Desc are generating the error.
Can you please confirm your job design, with stage and link names? In particular, is there any active stage - such as a Transformer - in the job design?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
I realize what the posted messages say; it just doesn't really make sense to me given the explanation of the job design. It seems odd that this error would occur when reading from an Oracle database, but then I don't use ODBC, so perhaps that is a 'feature' of walking that path. [shrug]
Curious how this all works out now.
-craig
"You can never have too many knives" -- Logan Nine Fingers
Hi,
Thanks for the reply. It was definitely an issue with the source columns, which had a CLOB datatype. I had to substring the columns instead of type-casting them from CLOB to VARCHAR; once I did that, it could read all the records. Apologies if I was not clear in my previous post.
But I still suspect there are some data-related issues, as DataStage throws the same error for a few more rows. In fact, once I excluded those rows in the WHERE clause, it could read and write all the data properly.
I need to analyse the data further and frame a rule in the ETL.
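For anyone hitting the same thing: a minimal sketch of the substring approach, assuming Oracle's DBMS_LOB.SUBSTR (the table name and the 4000-byte limit here are illustrative; CL_CORE_CD is the column from the error message):

```sql
-- Pull at most 4000 bytes of the CLOB as VARCHAR2, starting at offset 1,
-- so the ODBC stage sees a plain character column instead of a LOB.
SELECT DBMS_LOB.SUBSTR(CL_CORE_CD, 4000, 1) AS CL_CORE_CD
FROM   src_table;
```

4000 is the classic VARCHAR2 limit in SQL; anything in the CLOB beyond that is truncated, so check your data lengths before relying on this.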
Regards,
Bala.