Reading the Copy Book in the Complex Flat File stage

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

dhavak
Participant
Posts: 12
Joined: Thu Mar 13, 2008 12:16 am
Location: Hyderabad

Reading the Copy Book in the Complex Flat File stage

Post by dhavak »

Hi,
I am reading a mainframe file with its copybook using the Complex Flat File stage.
We have almost 150 columns. While reading the file I am getting the below error.

"##W TOIX 000000 23:18:25(000) <Complex_Flat_File_0,0> Field "EASN_VENDOR_I" has import error and no default value; data: {0 0 0 0 0 0 0 0 0}, at offset: 0
##W TOIX 000154 23:18:25(001) <Complex_Flat_File_0,0> Import warning at record 0.
##W TOIX 000018 23:18:25(002) <Complex_Flat_File_0,0> Import unsuccessful at record 0.
##W TOIX 000000 23:18:25(003) <Complex_Flat_File_0,0> Field "EASN_VENDOR_I" has import error and no default value; data: {0a 0 0 0 0 0 0 0 0}, at offset: 0
##W TOIX 000154 23:18:25(004) <Complex_Flat_File_0,0> Import warning at record 1.
##W TOIX 000018 23:18:25(005) <Complex_Flat_File_0,0> Import unsuccessful at record 1.
##W TOIX 000000 23:18:25(006) <Complex_Flat_File_0,0> Field "EASN_VENDOR_I" has import error and no default value; data: {20 0a 0 0 0 0 0 0 0}, at offset: 0
##W TOIX 000154 23:18:25(007) <Complex_Flat_File_0,0> Import warning at record 2.
##W TOIX 000018 23:18:25(008) <Complex_Flat_File_0,0> Import unsuccessful at record 2.
##W TOIX 000000 23:18:25(009) <Complex_Flat_File_0,0> Field "EASN_VENDOR_I" has import error and no default value; data: {00 00 0a 0 0 0 0 0 0}, at offset: 0
##W TOIX 000154 23:18:25(010) <Complex_Flat_File_0,0> Import warning at record 3.
##W TOIX 000018 23:18:25(011) <Complex_Flat_File_0,0> Import unsuccessful at record 3.
##W TOIX 000000 23:18:25(012) <Complex_Flat_File_0,0> Field "EASN_VENDOR_I" has import error and no default value; data: {20 20 20 0a 0 0 0 0 0}, at offset: 0
##W TOIX 000154 23:18:25(013) <Complex_Flat_File_0,0> Import warning at record 4.
##W TOIX 000018 23:18:25(014) <Complex_Flat_File_0,0> Import unsuccessful at record 4.
##I TOIX 000193 23:18:25(015) <Complex_Flat_File_0,0> No further reports will be generated from this partition until a successful import.
##I TOIX 000156 23:18:25(016) <Complex_Flat_File_0,0> Progress: 10 percent.
##I TOIX 000156 23:18:25(017) <Complex_Flat_File_0,0> Progress: 20 percent.
##I TOIX 000156 23:18:25(018) <Complex_Flat_File_0,0> Progress: 30 percent.
##I TOIX 000156 23:18:25(019) <Complex_Flat_File_0,0> Progress: 40 percent.
##I TOIX 000156 23:18:25(020) <Complex_Flat_File_0,0> Progress: 50 percent.
##I TOIX 000156 23:18:25(021) <Complex_Flat_File_0,0> Progress: 60 percent.
##I TOIX 000156 23:18:25(022) <Complex_Flat_File_0,0> Progress: 70 percent.
##I TOIX 000156 23:18:25(023) <Complex_Flat_File_0,0> Progress: 80 percent.
##I TOIX 000156 23:18:25(024) <Complex_Flat_File_0,0> Progress: 90 percent.
>##E TOIX 000159 23:18:25(025) <Complex_Flat_File_0,0> Short read encountered on import; this most likely indicates one of the following possibilities:
>1) the import schema you specified is incorrect
>2) invalid data (the schema is correct, but there is an error in the data).
##I TOIX 000094 23:18:25(026) <Complex_Flat_File_0,0> Output 0 produced 0 records.
>##E TOIX 000043 23:18:25(027) <Complex_Flat_File_0,0> Expected 700 bytes, got 99.
>##E TOIX 000179 23:18:25(028) <Complex_Flat_File_0,0> Import error at record 7810.
>##E TOIX 000089 23:18:25(029) <Complex_Flat_File_0,0> The runLocally() of the operator failed."

Please help me understand what this error is about and how it can be avoided.
Thanks in Advance.


Thanks,
Dhaval
devidotcom
Participant
Posts: 247
Joined: Thu Apr 27, 2006 6:38 am
Location: Hyderabad

Post by devidotcom »

Make sure you have flattened all the arrays when you load the metadata.
If not, do so.

Also, try reading the file with a Sequential File stage if the copybook does not contain clauses such as OCCURS, REDEFINES, etc.

The error you obtained occurs because the record length in the file does not match the metadata you have loaded; you can verify this outside DataStage, as in the sketch below.
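A minimal check, assuming fixed-length records (the path and the 700-byte record length are placeholders taken from the "Expected 700 bytes" message in your log): a fixed-length file must be an exact multiple of the record length, and a binary mainframe transfer should contain no stray ASCII line feeds. The 0a bytes in your log's data dumps look like exactly that.

import os

RECORD_LENGTH = 700             # expected LRECL, from "Expected 700 bytes" in the log
PATH = "/data/easn_vendor.dat"  # hypothetical path; substitute your file

# A fixed-length file must be an exact multiple of the record length.
size = os.path.getsize(PATH)
records, leftover = divmod(size, RECORD_LENGTH)
print(f"{size} bytes = {records} records of {RECORD_LENGTH}, {leftover} bytes left over")

# Stray line feeds (0x0a) usually mean the file was transferred in text
# mode, or that each record carries a delimiter the schema ignores.
with open(PATH, "rb") as f:
    first = f.read(RECORD_LENGTH)
print("LF bytes in first record:", first.count(b"\n"))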

Devi
dhavak
Participant
Posts: 12
Joined: Thu Mar 13, 2008 12:16 am
Location: Hyderabad

Post by dhavak »

Hi,
We have a REDEFINES clause in the copybook, hence we are using the Complex Flat File stage.

Also, we flattened the arrays before loading the metadata.
Is there any other option to make it work?

Thanks,
Dhaval
devidotcom
Participant
Posts: 247
Joined: Thu Apr 27, 2006 6:38 am
Location: Hyderabad

Post by devidotcom »

Is it a fixed-width or variable-length record file?

What are the settings on the File Options tab? Make sure you have selected the correct record type option there.

Also check the Record Options tab.
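For background (an illustrative sketch, not DataStage code, with a hypothetical path): a variable-length mainframe file (RECFM=V/VB) prefixes every record with a 4-byte Record Descriptor Word, while a fixed-length file (RECFM=F) has none, so picking the wrong record type shifts every field in every record.

import struct

def read_fixed(path, lrecl):
    # Fixed-length: every record is exactly lrecl bytes, no prefix.
    with open(path, "rb") as f:
        while rec := f.read(lrecl):
            yield rec

def read_variable(path):
    # RECFM=V: a 4-byte RDW precedes each record; its first halfword is
    # the record length, big-endian, and it includes the RDW itself.
    with open(path, "rb") as f:
        while len(rdw := f.read(4)) == 4:
            (length,) = struct.unpack(">H", rdw[:2])
            yield f.read(length - 4)

for rec in read_fixed("/data/easn_vendor.dat", 700):  # hypothetical path
    pass  # process each 700-byte record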

Devi
dhavak
Participant
Posts: 12
Joined: Thu Mar 13, 2008 12:16 am
Location: Hyderabad

Post by dhavak »

Hi,
I have checked the record option; it is set to Fixed Length.
Can you help me with how to set the nullable option for the copybook fields?
devidotcom
Participant
Posts: 247
Joined: Thu Apr 27, 2006 6:38 am
Location: Hyderabad

Post by devidotcom »

Sure...

For every column in the CFF stage, the properties options include a Nullable option. Add an appropriate default value for the field. For example, if the column is Integer(5), then add five zeros.
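To illustrate what that default buys you (a toy stand-in, not the stage's actual code): when a field fails to import, for instance because it holds low-values instead of digits, the default is substituted so the record is not rejected.

def import_numeric(raw: bytes, default: str = "00000") -> int:
    # Decode the raw field; fall back to the default ("five zeros" for
    # an Integer(5) column) when the bytes are not valid digits.
    text = raw.decode("ascii", errors="replace")
    return int(text) if text.isdigit() else int(default)

print(import_numeric(b"00042"))                 # 42
print(import_numeric(b"\x00\x00\x00\x00\x00"))  # 0, default used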

Devi
dhavak
Participant
Posts: 12
Joined: Thu Mar 13, 2008 12:16 am
Location: Hyderabad

Post by dhavak »

Hi,
The nullable option is not working for the dataset; it gives an error if we set it to Yes.
Also, when we fetch the data, the values in the second row are shifted by one position, and by 2, 3, 4 and so on in each subsequent row.
Can you please help us out in this situation?

The data should look like this:
Col1 Col2 col3
000000000. zzzzzzzzz xxxxxxxxx
000000000. zzzzzzzzz xxxxxxxxx
000000000. zzzzzzzzz xxxxxxxxx
000000000. zzzzzzzzz xxxxxxxxx
000000000. zzzzzzzzz xxxxxxxxx

but we are getting the data in the following format:
Col1 Col2 col3
000000000. zzzzzzzzz xxxxxxxxx
000000000. 0zzzzzzzz zxxxxxxxx
000000000. 00zzzzzzz zzxxxxxxx
000000000. 000zzzzzz zzzxxxxxx
000000000. 0000zzzzz zzzzxxxxx
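We notice the shift grows by one position per row, which looks as if every record on disk is one byte longer than the schema expects (perhaps a trailing line feed after each fixed-length record, given the 0a bytes in the log above). A toy sketch with made-up data reproduces what we are seeing:

# Five 4-byte records, each followed by a line feed the schema ignores.
data = (b"AAAA" + b"\n") * 5

declared = 4  # what the (wrong) schema expects per record
for i in range(5):
    print(data[i * declared:(i + 1) * declared])
# b'AAAA', b'\nAAA', b'A\nAA', ... each record shifted one more byte

actual = 5    # true on-disk length including the delimiter
for i in range(5):
    print(data[i * actual:(i + 1) * actual][:-1])
# b'AAAA' every time, once the extra byte is accounted for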


Thanks,
Dhaval