I created one parallel job, and that job just gets the data from the table. I am getting the output as a string, and I used the Column Import stage to split it into columns.
One of the columns comes in the format " 31-jun-99", but in the Column Import stage it is defined as TIMESTAMP.
While running the job I am getting the error: Input buffer overrun at field "DATE1", at offset: 34
Why am I getting this error message?
The problem is wrong metadata: you defined the column type as TIMESTAMP, but the data you are getting is not a timestamp.
When reading the data, DataStage tries to read 19 characters for the TIMESTAMP field, but it gets shorter data, so it consumes data from the next column and then looks for the next delimiter, which is not found at the expected position.
If you check, the actual end of the TIMESTAMP column would be the 35th character of your record.
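The mechanism can be illustrated with a small sketch (hypothetical Python, not DataStage code, with made-up field names and a comma delimiter): a fixed-size TIMESTAMP type makes the reader consume 19 characters unconditionally, so a short value like " 31-jun-99" swallows part of the next column and the delimiter check fails.

```python
# Hypothetical illustration of a fixed-width field read, like the one
# DataStage performs for a TIMESTAMP column (19 characters).

def read_record(record, field_specs):
    """Parse comma-delimited fields; fixed-size types consume a fixed width."""
    pos = 0
    fields = {}
    for name, width in field_specs:
        if width:  # fixed-width type, e.g. TIMESTAMP = 19 chars
            raw = record[pos:pos + width]
            pos += width
            # The delimiter must sit exactly at the end of the fixed field;
            # if the value was shorter, we have already eaten the next column.
            if record[pos:pos + 1] != ",":
                raise ValueError(
                    f'Input buffer overrun at field "{name}", at offset: {pos}'
                )
            pos += 1  # skip the delimiter
        else:  # variable-width: read up to the next delimiter
            end = record.find(",", pos)
            end = len(record) if end == -1 else end
            raw = record[pos:end]
            pos = end + 1
        fields[name] = raw
    return fields

# A valid TIMESTAMP is 19 characters: "1999-06-30 00:00:00"
good = "id-123,1999-06-30 00:00:00,next-col"
bad = "id-123,31-jun-99,next-col"  # only 9 chars where 19 are expected

specs = [("ID", 0), ("DATE1", 19), ("COL3", 0)]
print(read_record(good, specs))  # parses cleanly into three fields
try:
    read_record(bad, specs)
except ValueError as e:
    print(e)  # overrun: the 19-char read consumed part of "next-col"
```

The fix in the job is the same idea in reverse: either import the column as a string (VarChar) and convert it afterwards, or make the incoming data match the 19-character timestamp layout the metadata promises.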
Happy DataStaging