
Help: please has import error and no default value; data:

Posted: Tue Jun 06, 2006 1:26 pm
by mctny
Hi everyone,

I have a simple loading job, from flat files (5 of them) to an Oracle table. It compiles and does not give any run-time errors either; however, it produces some warnings at run time and does not load any records. Do you have any idea why?

The last portion of the log is below; the earlier parts are the same and repeat 3 times.

F_Response_From_Files,3: Field "STATE_AGENCY_KEY" has import error and no default value; data: {" 4 "}, at offset: 0
F_Response_From_Files,3: Import warning at record 4.
F_Response_From_Files,3: Import unsuccessful at record 4.
F_Response_From_Files,3: No further reports will be generated from this partition until a successful import.

Re: Help: please has import error and no default value; data

Posted: Tue Jun 06, 2006 2:57 pm
by sud
Hey,

What is the datatype of the field "STATE_AGENCY_KEY"? And can you please find record 4 and post it?

:)

Re: Help: please has import error and no default value; data

Posted: Tue Jun 06, 2006 2:59 pm
by mctny
sud wrote:Hey,

What is the datatype of the field "STATE_AGENCY_KEY"? And can you please find record 4 and post it?

:)
Decimal, length 15 and Nullable

Thank you for the reply

Posted: Tue Jun 06, 2006 3:05 pm
by kool78
Hi,

Edit the metadata for the column "STATE_AGENCY_KEY" and give it a default value in the Null field value property.

Posted: Tue Jun 06, 2006 3:14 pm
by ray.wurlod
The value " 4 " is reported. This suggests that the field has surrounding space characters, and perchance its data type is a numeric data type, such as Integer of some kind. You need to fix either the data or the metadata.
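
As a plain-Python illustration of Ray's point (this is not DataStage code, and the parser and sample values below are assumptions made for the sketch), a strict numeric importer that refuses surrounding whitespace behaves like this:

import re
from decimal import Decimal
from typing import Optional

# Strict pattern for a decimal field: digits with an optional sign and
# decimal point, and no surrounding spaces allowed.
STRICT_DECIMAL = re.compile(r"^[+-]?\d+(\.\d*)?$")

def import_field(raw: str, default: Optional[Decimal] = None) -> Decimal:
    """Convert a text field to Decimal the way a strict importer would."""
    if STRICT_DECIMAL.match(raw):
        return Decimal(raw)
    if default is not None:
        return default                      # a default value would rescue the record
    raise ValueError(f"import error and no default value; data: {{{raw}}}")

print(import_field("4"))                    # OK
print(import_field(" 4 ".strip()))          # "fix the data": strip the spaces first
try:
    import_field(" 4 ")                     # mirrors the warning in the job log
except ValueError as exc:
    print(exc)

Either trimming the spaces in the source data or defining a default/null handling for the column removes the failure; which one is right depends on whether the spaces are meaningful.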

Posted: Tue Jun 06, 2006 3:18 pm
by sud
Hi,

Please check the format tab of the sequential file. Go to the Field Defaults property and set the Null field value property to the default you want.

But even before that, what is the data in record 4? Please check that and see whether the problem is caused by a null value.

8)

Posted: Tue Jun 06, 2006 3:22 pm
by mctny
ray.wurlod wrote:The value " 4 " is reported. This suggests that the field has surrounding space characters, and perchance its data type is a numeric data type, such as Integer of some kind. You need to fix either the data or the metadata.
Actually there are 3 log records that are the same, so it is not exactly the 4th record. It tries to import the 1st, 2nd, 3rd and 4th records, none are successful, and it stops, but the job itself still completes without aborting. No records are extracted from the flat file and hence none are loaded into Oracle.

Posted: Tue Jun 06, 2006 3:26 pm
by sud
If it is a read error then you should try to do a View Data in the Sequential File stage and see whether the data is read properly in the stage indicated in the message. Anyway, can you post a few records of the file as they are, or check what Ray indicated ... the possibility of spaces et al. being read into a numeric datatype.

Posted: Tue Jun 06, 2006 3:35 pm
by mctny
sud wrote:If it is a read error then you should try to do a View Data in the Sequential File stage and see whether the data is read properly in the stage indicated in the message. Anyway, can you post a few records of the file as they are, or check what Ray indicated ... the possibility of spaces et al. being read into a numeric datatype.
I looked at the data and it looks OK. | is my field delimiter, and I use none for the quote character. By the way, I also use ASCII value 254 to handle null values; do you think that could cause any problem? 254 is not a commonly used character.
Thanks


Here are the first few records from the file:

1|25455|42|144|495554|34|1904|Certified|| | 00000000000000000000000000000000000000.| 0000000000000000000000000000.0000000000|555968_5925301_AL_1.4.3.1.xls|
1|25457|42|144|495555|34|1905|Certified|Please note that this state Dropout Rate rather than the Graduation Rate.| | 00000000000000000000000000000000000000.| 0000000000000000000000000000.0000000000|555970_1229573_AL_1.4.4.1.xls|
2|25455|42|144|495556|34|1906|Certified|| | 00000000000000000000000000000000000000.| 0000000000000000000000000000.0000000000|557791_2020905_Sch_Imp_Chart_1_4_3_1.xls|
2|25457|42|144|495558|34|1907|Certified|| | 00000000000000000000000000000000000000.| 0000000000000000000000000000.0000000000|557799_214551_Dist_Imp_Chart_1_4_4_1.xls|
4|25455|42|144|495559|34|1908|Certified|| | 00000000000000000000000000000000000000.| 0000000000000000000000000000.0000000000|559613_436153_FY_2005_CSPR_Sch_Impr_w_NCES_ID_KS_02172006.xls|
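
To see what those lines actually contain, here is a rough sketch in plain Python (not DataStage) that reads the first posted record with "|" as the delimiter, no quoting, and chr(254) standing in for NULL as described above; printing the repr of each field makes stray spaces visible:

NULL_MARKER = chr(254)        # the character used for nulls in this job

line = ("1|25455|42|144|495554|34|1904|Certified||"
        " | 00000000000000000000000000000000000000."
        "| 0000000000000000000000000000.0000000000"
        "|555968_5925301_AL_1.4.3.1.xls|")

parts = line.split("|")
if parts and parts[-1] == "":
    parts = parts[:-1]        # the trailing "|" is a final delimiter, not a 14th field

fields = [None if p in ("", NULL_MARKER) else p for p in parts]
print(len(fields), "fields")  # 13 for this record
for position, field in enumerate(fields, start=1):
    print(position, repr(field))

Run against these samples it shows 13 fields per record, with leading spaces still attached to the numeric-looking columns, which is the same shape of data that could produce the " 4 " warning above.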

Posted: Tue Jun 06, 2006 3:40 pm
by pavankvk
Try to read the sequential file with the following properties:

Delimiter: pipe
Datatype: Char
Nullable: Unknown
Don't specify any length

Posted: Tue Jun 06, 2006 3:40 pm
by ray.wurlod
Please verify that you have correctly specified the delimiter character as "|" and either 14 fields or 13 fields and a final delimiter of "|".
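
A quick way to run this check outside the job, sketched in plain Python (the file name below is only a placeholder for one of the five source files): with 13 fields plus a final delimiter, every good line should contain exactly 13 "|" characters.

from collections import Counter

EXPECTED_PIPES = 13            # 12 separating delimiters + 1 final delimiter

counts = Counter()
with open("response_file_1.txt", encoding="latin-1") as fh:
    for lineno, line in enumerate(fh, start=1):
        pipes = line.count("|")
        counts[pipes] += 1
        if pipes != EXPECTED_PIPES:
            print(f"line {lineno}: {pipes} delimiters instead of {EXPECTED_PIPES}")

print(dict(counts))            # distribution of delimiter counts across the whole file

Any line that reports a different count is either missing its final delimiter or carries extra or embedded delimiters, and that is where the record layout and the data disagree.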

Posted: Tue Jun 06, 2006 3:52 pm
by mctny
ray.wurlod wrote:Please verify that you have correctly specified the delimiter character as "|" and either 14 fields or 13 fields and a final delimiter of "|".
That's right, my staging job was not always putting delimiters. I am using "end" as the final delimiter; is that OK?

I will look at this now. Thank you all for your help.

Posted: Tue Jun 06, 2006 8:09 pm
by ray.wurlod
From your data listing I would recommend trying with "|" as the final delimiter, unless you have 14 fields defined in the record layout.

Posted: Wed Jun 07, 2006 12:10 pm
by mctny
ray.wurlod wrote:From your data listing I would recommend trying with "|" as the final delimiter, unless you have 14 fields defined in the record layout.
I am still having problems. My ET job writes to a sequential file and reports that it has written 118 rows, but when I read the file in the load job it reads 125 rows, of which 11 records go to the reject link (the rejects file for reading the sequential file). I went to Unix and checked the file with wc, and it shows 125 records. So the problem is with my ET job, which says it has written 118 rows but is actually writing 125.
Any thoughts?

Thank you very much in advance.

Posted: Wed Jun 07, 2006 3:22 pm
by ray.wurlod
Are there any line terminator characters in your data? That is, is any of the "comments" multi-line in the source data?
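
A small sketch of that check in plain Python (the file name and expected field count are assumptions based on the records posted earlier): if a comment contains an embedded line terminator, one logical record is written as two physical lines, so wc -l (125 here) can exceed the row count the writing job reported (118). Re-joining lines until a buffer holds all 13 delimiters recovers the logical record count.

EXPECTED_PIPES = 13

physical_lines = 0
logical_records = 0
buffer = ""
with open("response_file_1.txt", encoding="latin-1") as fh:
    for line in fh:
        physical_lines += 1
        buffer += line.rstrip("\n")
        if buffer.count("|") >= EXPECTED_PIPES:
            logical_records += 1   # a complete record has been assembled
            buffer = ""
        # otherwise keep reading: this physical line was cut short by an
        # embedded newline inside a field such as the comments column

print("physical lines:", physical_lines, "logical records:", logical_records)

If the logical record count comes out lower than the physical line count, Ray's multi-line-comment explanation fits the numbers.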