
Schema file error

Posted: Fri Feb 01, 2013 10:56 am
by TonyInFrance
Hi all

I've found that quite a few people here have encountered the problem I'm about to describe but unfortunately there's been no comprehensive solution as yet.

I have a schema file which is like:

record {delim=';', final_delim=none} (
COETF:nullable string[max=3] {null_field=' '};
MTDVI5:nullable decimal[18,2] {default=0};
MTDVI6:nullable decimal[18,2] {default=0};
CSIGN:nullable string[max=1] {null_field=' '};
)

Now my two decimal fields are the ones causing problems. Initially I didn't have the default clause on them, just null_field=''. On reading recommendations here I decided to include this clause. However, although I have valid decimals in my input file, my job refuses to read them and inserts the default 0 for these fields instead.

What's puzzling is when I didn't have the default clause in my schema file I got a warning in the Director as follows:

Field has import error and no default value; data: {2 4 8 5 5 5 . 0 0}, at offset: 65

I must state here that my data is 248555.00, so it's clear that spaces are being added between the digits: 248555.00 is becoming 2 4 8 5 5 5 . 0 0

I'm pretty sure this is why, once I added the default clause, the field is interpreted as invalid and the default 0 is inserted.

Can you guys please give me some ideas as to why spaces are being added?

Any other solutions / ideas are definitely welcome

Cheers

Tony

Re: Schema file error

Posted: Fri Feb 01, 2013 11:38 am
by chulett
TonyInFrance wrote:I must state here that my data is 248555.00, so it's clear that spaces are being added between the digits: 248555.00 is becoming 2 4 8 5 5 5 . 0 0
Just wanted to point out that no spaces are being added to the data, that's just how that particular message represents the data, i.e. with all of the characters nicely spread out like that. I guess someone thought that would be a swell diagnostic aid or something.

Posted: Fri Feb 01, 2013 11:43 am
by FranklinE
That nullable VarChar(1) at the end looks odd to me. Is it possible that it could be interfering with reading the prior decimal fields?

Posted: Fri Feb 01, 2013 2:52 pm
by TonyInFrance
Craig... Are you sure about that?

Franklin - I sure hope that the nullable VarChar(1) isn't the issue because I need that there. It represents the sign '+' or '-' of the monetary fields. Furthermore, that data is being parsed without any problem. It's only the two amounts that are being interpreted as invalid.

Ray - Do you have any ideas?

Cheers guys

Tony

Posted: Fri Feb 01, 2013 3:18 pm
by ray.wurlod
The spacing of characters in output is deliberate, so that non-printing characters can be indicated by code values, for example an embedded tab would be shown as {P A T T Y 09 C A K E} (so Craig is correct).

Another thing I have is a liking for my sleep. You posted at 3:56am my time.

You can, of course, read your decimal fields as VarChar (so that null_field="" will work) and convert them to decimal downstream of the Sequential File stage.

To see why, write to a text file using this schema file. Then inspect the text file using a tool outside of DataStage, such as the cat command.
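To sketch Ray's suggestion against Tony's schema (the string widths below are assumptions, not from the original post): read the two amounts as strings, then convert them downstream of the Sequential File stage, for example with StringToDecimal in a Transformer.

record {delim=';', final_delim=none} (
COETF:nullable string[max=3] {null_field=' '};
MTDVI5:nullable string[max=21] {null_field=''};
MTDVI6:nullable string[max=21] {null_field=''};
CSIGN:nullable string[max=1] {null_field=' '};
)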

Posted: Fri Feb 01, 2013 3:34 pm
by TonyInFrance
Sorry to have woken you up Ray...:-)

Thanks for the confirmation about the spaces.

As for reading my decimal field as VarChar, I'd prefer not to do so since I need to reject the entire record when any field doesn't conform to the requisite datatype. Of course, if I read it as a VarChar then anything will pass through.

Are you saying that there is no way to declare the field as decimal in a schema file?

Posted: Fri Feb 01, 2013 4:47 pm
by jwiles
Look into the "zoned" option of the decimal datatype... you can find its description in the IS Information Center.
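As an illustration only (the field name is hypothetical, and the Information Center documents further sub-options such as leading/trailing sign placement), a zoned decimal field might be declared like this:

myamount:nullable decimal[18,2] {zoned};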

Regards,

Posted: Fri Feb 01, 2013 4:50 pm
by ray.wurlod
You can declare it, but the field had better contain a valid decimal value. See Chapter 2 of the Parallel Job Developer's Guide to learn how values are represented, or the reference that James provided.

Given that you are in France, is the decimal placeholder defined as ","? That might also invalidate your data, which uses "." as the decimal placeholder.
Check the project properties; you can override at job or column level.
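For instance (reusing one of Tony's field names), the separator can also be pinned at column level in the schema itself:

MTDVI5:nullable decimal[18,2] {decimal_separator='.'};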

Posted: Fri Feb 01, 2013 5:29 pm
by TonyInFrance
Will check on these two leads on Monday.

Thanks Ray and jwiles...:-)

RESOLVED

Posted: Mon Feb 04, 2013 12:24 pm
by TonyInFrance
Guys...
Thanks a lot for your advice. In the end it was a '.' versus ',' decimal separator confusion, which has been rectified...

...but I have a new problem regarding schema files, which I'll explain in a new thread since it concerns a DATE

Posted: Mon Feb 04, 2013 3:15 pm
by ray.wurlod
Vive la France!

Posted: Mon Jul 20, 2015 1:26 pm
by koti9
Hi Tony,

May I know how you handled the decimals in the schema file for the above issue?


Thanks & Regards
Koti

Posted: Tue Jul 21, 2015 5:32 am
by Klaus Schaefer
Not sure if this is what he did, but as per my experience you have to add a clause like {text, width=15} to get it to work.

i.e.: mydecfield:decimal[15,0] {text, width=15};

Best regards
Klaus

Posted: Thu Jul 23, 2015 7:04 am
by koti9
Hello Klaus,

The syntax you provided didn't work; it was failing... but here is what Tony might have meant. This worked too:

record { final_delim=end , delim=',', null_field='', quote=DOUBLE, charset='ISO8859-1' } (
field1:nullable string[max=50];
field2:nullable string[max=20];
field3:nullable string[max=50];
field4:nullable string[max=6];
field5:nullable decimal[11,0] {decimal_separator='.'};
field6:nullable decimal[11,0] {decimal_separator='.'};
field7:nullable decimal[11,0] {decimal_separator='.'};
field8:nullable decimal[21,6] {decimal_separator='.'};
field9:nullable decimal[21,6] {decimal_separator='.'}
)