Column Import Failure

samyamkrishna
Premium Member
Posts: 258
Joined: Tue Jul 04, 2006 10:35 pm
Location: Toronto

Column Import Failure

Post by samyamkrishna »

Hi,

The Column Import stage is giving the error "Failure during execution of operator logic." with this schema file:


record{final_delim=end, record_delim='\n', delim='|', null_field='', quote=double}
(

But when I change the schema file to

record{final_delim=end, record_delim='\n', delim='|', null_field='', quote=none}
(


the job runs fine and just gives an import warning.

After the fatal error

Failure during execution of operator logic.

it also gives an error like:

0: Internal Error: (len % APT_UString::sizeCharType == 0): impexp/string.C: 780
Traceback: Could not obtain stack trace; check that 'gdb' and 'sed' are installed and on your PATH


I don't really know what has to be done.

My source file has double quotes in it, so I have to use quote=double in the schema file.
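
For illustration, with placeholder field names (not my real columns), the full import schema looks something like this:

record{final_delim=end, record_delim='\n', delim='|', null_field='', quote=double}
(
  cust_id: int32;
  cust_name: string[max=50];
  comments: nullable string[max=200];
)

and a matching input record would be something like:

101|"Smith, John"|"Regular customer"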

Let me know what has to be done.

Thanks,
Samyam
samyamkrishna
Premium Member
Posts: 258
Joined: Tue Jul 04, 2006 10:35 pm
Location: Toronto

Re: Column Import Failure

Post by samyamkrishna »

In the Column Import stage, the input column for import was defined as Unicode.

I removed it and now the job is working super fine.
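
In schema terms that change is roughly the difference between a ustring and a string field declaration, e.g. (placeholder field name):

comments: ustring[max=200];

versus

comments: string[max=200];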

But what do double quotes have to do with Unicode, and why was it causing this problem?

Thanks,
Samyam