
Urgent-Error reading from Sequential file

Posted: Fri Nov 19, 2004 1:45 pm
by monaveed
Hi,

I created a simple DataStage job which reads from a Sequential file that is located on UNIX. I am not able to view the data from this file, nor does the data get loaded into the table when I run the job.

Any help would be highly appreciated.

The error I get is given below.

Thanks.

##I TFCN 000001 13:42:49(000) <main_program>
Ascential DataStage(tm) Enterprise Edition 7.1r2
Copyright (c) 2004, 1997-2004 Ascential Software Corporation.
All Rights Reserved


##I TUTL 000031 13:42:49(001) <main_program> The open files limit is 2000; raising to 2147483647.
##I TOSH 000002 13:42:49(002) <main_program> orchgeneral: loaded
##I TOSH 000002 13:42:49(003) <main_program> orchsort: loaded
##I TOSH 000002 13:42:49(004) <main_program> orchstats: loaded
##I TFSC 000001 13:42:49(005) <main_program> APT configuration file: /apps/Ascential/DataStage/Configurations/ENT_DB2.apt
##W TCOS 000049 13:42:49(006) <main_program> Parameter specified but not used in flow: _APT_CONFIG_FILE
##W TCOS 000049 13:42:49(007) <main_program> Parameter specified but not used in flow: DSProjectMapName
##W TOIX 000000 13:42:50(000) <Sequential_File_1,0> Field "cr_cust_i" delimiter not seen, at offset: 81
##W TOIX 000154 13:42:50(001) <Sequential_File_1,0> Import warning at record 0:
##W TOIX 000018 13:42:50(002) <Sequential_File_1,0> Import unsuccessful at record 0:
##W TOIX 000000 13:42:50(003) <Sequential_File_1,0> Field "cr_cust_i" delimiter not seen, at offset: 81
##W TOIX 000154 13:42:50(004) <Sequential_File_1,0> Import warning at record 1:
##W TOIX 000018 13:42:50(005) <Sequential_File_1,0> Import unsuccessful at record 1:
##W TOIX 000000 13:42:50(006) <Sequential_File_1,0> Field "cr_cust_i" delimiter not seen, at offset: 80
##W TOIX 000154 13:42:50(007) <Sequential_File_1,0> Import warning at record 2:
##W TOIX 000018 13:42:50(008) <Sequential_File_1,0> Import unsuccessful at record 2:
##W TOIX 000000 13:42:50(009) <Sequential_File_1,0> Field "cr_cust_i" delimiter not seen, at offset: 91
##W TOIX 000154 13:42:50(010) <Sequential_File_1,0> Import warning at record 3:
##W TOIX 000018 13:42:50(011) <Sequential_File_1,0> Import unsuccessful at record 3:
##W TOIX 000000 13:42:50(012) <Sequential_File_1,0> Field "cr_cust_i" delimiter not seen, at offset: 91
##W TOIX 000154 13:42:50(013) <Sequential_File_1,0> Import warning at record 4:
##W TOIX 000018 13:42:50(014) <Sequential_File_1,0> Import unsuccessful at record 4:
##I TOIX 000157 13:42:50(015) <Sequential_File_1,0> (no further reports will be generated from this partition until successful import)
##I TOIX 000163 13:42:50(016) <Sequential_File_1,0> Import complete. 0 records imported successfully, 70 rejected.

Posted: Fri Nov 19, 2004 3:45 pm
by coolkhan08
The problem looks like your metadata is not in agreement with your sequential file. The offsets in the warnings are caused by that mismatch between the metadata and the file.
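A quick way to confirm this outside DataStage is to count how many delimited fields each record actually contains and compare that with the number of columns in the table definition. This is only a minimal sketch: the pipe delimiter, the expected field count, and the file path are assumptions, not taken from the original post.

# Minimal sketch: flag records whose field count differs from the
# column count defined in the Sequential File stage's table definition.
# DELIMITER, EXPECTED_FIELDS, and PATH are placeholders (assumptions).
EXPECTED_FIELDS = 12
DELIMITER = "|"
PATH = "/data/input/customers.dat"

with open(PATH, "r", encoding="utf-8") as fh:
    for lineno, line in enumerate(fh, start=1):
        fields = line.rstrip("\n").split(DELIMITER)
        if len(fields) != EXPECTED_FIELDS:
            print(f"record {lineno}: {len(fields)} fields "
                  f"(expected {EXPECTED_FIELDS})")

If most records report a different count, either the delimiter set on the stage's Format tab or the column list in the table definition does not match the file.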

Re: Urgent-Error reading from Sequential file

Posted: Fri Nov 19, 2004 5:09 pm
by DataStageCnu
Hi Mona,

It should be a metadata mismatch problem. Correct the length and scale of the first field.
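Since the log reports the missing delimiter at slightly different offsets (81, 81, 80, 91, 91), it can help to look at what actually sits at those positions in the first few records. The sketch below is only a diagnostic under assumptions: the offsets are copied from the log above, but the file path is a placeholder.

# Diagnostic sketch: show the bytes around the offsets the import
# reported for records 0-4. PATH is a placeholder (assumption).
PATH = "/data/input/customers.dat"
REPORTED = [81, 81, 80, 91, 91]  # offsets from the job log, records 0-4

with open(PATH, "rb") as fh:
    for recno, raw in enumerate(fh):
        if recno >= len(REPORTED):
            break
        off = REPORTED[recno]
        window = raw[max(off - 5, 0):off + 5]
        print(f"record {recno}: ...{window!r}... around offset {off}")

If the bytes at the reported offset still belong to an earlier column's data, the declared lengths (or the delimiter) for the columns before cr_cust_i do not match the file.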


Srini
BB

Re: Urgent-Error reading from Sequential file

Posted: Sun Nov 21, 2004 2:35 am
by legendkiller
This problem occurs due to a metadata mismatch, especially with character fields.
monaveed wrote: I created a simple DataStage job which reads from a Sequential file that is located on UNIX. I am not able to view the data from this file, nor does the data get loaded into the table when I run the job.

Posted: Sun Nov 21, 2004 7:26 pm
by vbeeram
Handle the null values properly (use default values) and check the metadata.
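If empty fields in the file are part of what is tripping the import, one option is to substitute a default value before the job reads the file. This is a pre-processing sketch under assumptions, not part of the original job or a DataStage feature: the delimiter, default value, and paths are placeholders.

# Pre-processing sketch (assumption): replace empty delimited fields
# with a default before DataStage reads the file. DELIMITER, DEFAULT,
# SRC, and DST are placeholders.
DELIMITER = "|"
DEFAULT = "0"
SRC = "/data/input/customers.dat"
DST = "/data/input/customers_clean.dat"

with open(SRC, "r", encoding="utf-8") as src, \
     open(DST, "w", encoding="utf-8") as dst:
    for line in src:
        fields = line.rstrip("\n").split(DELIMITER)
        fields = [f if f != "" else DEFAULT for f in fields]
        dst.write(DELIMITER.join(fields) + "\n")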


Regards
vBeeram