Issue in Reading Data in Sequential file
Moderators: chulett, rschirm, roy
Hi All,
I have loaded the data into the target Sequential File, but when I try to read it back I get the error below.
##E IIS-DSEE-TFOR-00089 13:01:15(704) <_ABORT_IDENT_,0> The runLocally() of the operator failed.
##E IIS-DSEE-TFOR-00089 13:01:15(706) <APT_CombinedOperatorController,0> The runLocally() of the operator failed.
##E IIS-DSEE-TFPM-00040 13:01:15(707) <APT_CombinedOperatorController,0> Operator terminated abnormally: runLocally did not return APT_StatusOk
##E IIS-DSEE-TFSC-00011 13:01:21(000) <main_program> Step execution finished with status = FAILED
I tried different ways to understand the issue by analyzing the data loaded from the source, and I suspect there is a problem with the source columns whose datatype is TIMESTAMP.
So I reloaded the data with the TIMESTAMP fields removed, tried View Data again, and this time it worked without any issue.
Could anyone share their thoughts on what the problem is when I try to view the data with the TIMESTAMP columns included?
Thanks in advance.
Manu
DataStage Developer
Have you checked the file on your system? Are you getting any warning messages in your logs? It sounds like your job may actually be rejecting the records due to an invalid type.
You get that set of error messages when no records are returned... rather annoying but doesn't seem to be one they are looking to fix any time soon.
Check whether the values in the TIMESTAMP field comply with the job's default timestamp format (in the job's default settings). If not, either change the default setting to match the source data, or import the TIMESTAMP columns as Varchar and convert them to the desired format in a Transformer using the StringToTimestamp() function.
I generally import as Varchar and convert in a later stage, and it works perfectly!
Regards,
Divya
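To illustrate the "import as Varchar, then convert" idea outside of DataStage, here is a minimal Python sketch. It assumes the job's default timestamp format is the common `%yyyy-%mm-%dd %hh:%nn:%ss` (i.e. `2024-01-31 13:01:15`); values that don't match that pattern are the ones a Sequential File read would typically reject. The sample rows and the `to_timestamp` helper are hypothetical, purely for illustration.

```python
from datetime import datetime

# Python equivalent of the assumed DataStage default timestamp format
# %yyyy-%mm-%dd %hh:%nn:%ss, e.g. "2024-01-31 13:01:15".
DEFAULT_FORMAT = "%Y-%m-%d %H:%M:%S"

def to_timestamp(value, fmt=DEFAULT_FORMAT):
    """Parse a Varchar timestamp; return a datetime on success, None on reject.

    This mirrors the StringToTimestamp() idea: read the column as a plain
    string first, then convert explicitly so bad values can be inspected
    instead of silently failing the whole read.
    """
    try:
        return datetime.strptime(value.strip(), fmt)
    except ValueError:
        return None

# Hypothetical sample column values
rows = [
    "2024-01-31 13:01:15",   # complies with the assumed default format
    "31-01-2024 13:01:15",   # day-first order: would be rejected
    "2024-01-31T13:01:15",   # ISO 'T' separator: also rejected by this format
]

for value in rows:
    ts = to_timestamp(value)
    print(("ok" if ts else "REJECT") + ": " + repr(value))
```

Running a check like this over the raw file quickly shows whether the TIMESTAMP values are the culprit, and which format string you would need to pass to StringToTimestamp() in the Transformer.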