Schema file error
Moderators: chulett, rschirm, roy
-
- Premium Member
- Posts: 288
- Joined: Tue May 27, 2008 3:42 am
- Location: Luxembourg
Hi all
I've found that quite a few people here have encountered the problem I'm about to describe, but unfortunately there's been no comprehensive solution as yet.
I have a schema file like this:
record {delim=';', final_delim=none} (
COETF:nullable string[max=3] { null_field=' '};
MTDVI5:nullable decimal[18,2] { default=0};
MTDVI6:nullable decimal[18,2] { default=0};
CSIGN:nullable string[max=1] { null_field=' '};
)
Now my 2 decimal fields are the ones causing problems. Initially I didn't have the default clause on them and just had null_field=''. On reading recommendations here I decided to include that clause. However, although I have valid decimals in my input file, my job refuses to read them and puts the default 0 in these fields.
What's puzzling is when I didn't have the default clause in my schema file I got a warning in the Director as follows:
Field has import error and no default value; data: {2 4 8 5 5 5 . 0 0}, at offset: 65
I must state here that my data is 248555.00, so it's clear that spaces are being added between the digits: 248555.00 is thus becoming 2 4 8 5 5 5 . 0 0.
I'm pretty sure that's why, once I put in the default clause, the field is interpreted as invalid and the default 0 is inserted.
Can you guys please give me some ideas as to why spaces are being added?
Any other solutions / ideas are definitely welcome.
Cheers
Tony
Tony
BI Consultant - Datastage
Re: Schema file error
TonyInFrance wrote: I must state here that my data is 248555.00 so its clear that spaces are being added in between digits. and 248555.00 is thus becoming 2 4 8 5 5 5 . 0 0
Just wanted to point out that no spaces are being added to the data; that's just how that particular message represents the data, i.e. with all of the characters nicely spread out like that. I guess someone thought that would be a swell diagnostic aid or something.
-craig
"You can never have too many knives" -- Logan Nine Fingers
That nullable VarChar(1) at the end looks odd to me. Is it possible that it could be interfering with reading the prior decimal fields?
Franklin Evans
"Shared pain is lessened, shared joy increased. Thus do we refute entropy." -- Spider Robinson
Using mainframe data FAQ: viewtopic.php?t=143596 Using CFF FAQ: viewtopic.php?t=157872
Craig... Are you sure about that?
Franklin - I sure hope the nullable VarChar(1) isn't the issue, because I need it there: it represents the sign '+' or '-' of the monetary fields. Furthermore, that data is being parsed without any problem. It's only the 2 amounts that are being interpreted as invalid.
Ray - Do you have any ideas?
Cheers guys
Tony
Tony
BI Consultant - Datastage
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
The spacing of characters in output is deliberate, so that non-printing characters can be indicated by code values, for example an embedded tab would be shown as {P A T T Y 09 C A K E} (so Craig is correct).
Another idea I have is a liking for my sleep. You posted at 3:56am my time.
You can, of course, read your decimal fields as VarChar (so that null_field="" will work) and convert them to decimal downstream of the Sequential File stage.
To see why, write to a text file using this schema file. Then inspect the text file using a tool outside of DataStage, such as the cat command.
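For what it's worth, Ray's approach might look something like this (just a sketch; the string widths here are assumptions, picked to be wide enough for an 18,2 decimal with a sign and separator):
record {delim=';', final_delim=none} (
COETF: nullable string[max=3] {null_field=' '};
MTDVI5: nullable string[max=21] {null_field=''};
MTDVI6: nullable string[max=21] {null_field=''};
CSIGN: nullable string[max=1] {null_field=' '};
)
Downstream of the Sequential File stage, a Transformer could then use IsValid('decimal', link.MTDVI5) to route malformed records to a reject link and StringToDecimal(link.MTDVI5) to do the actual conversion, so the reject-the-whole-record requirement is still met.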
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Sorry to have woken you up Ray...
Thanks for the confirmation about the spaces.
As for reading my decimal fields as VarChar, I'd prefer not to, since I need to reject the entire record when any field doesn't conform to the requisite datatype. Of course, if I read it as a VarChar then anything will pass through.
Are you saying that there is no way to declare the field as decimal in a schema file?
Tony
BI Consultant - Datastage
Look into the "zoned" option of the decimal datatype... you can find its description in the IS Information Center.
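In schema-file terms, a zoned declaration would be along these lines (a sketch only; whether the sign is leading or trailing, and whether zoned applies at all, depends on how the source system wrote the data):
MTDVI5: nullable decimal[18,2] {zoned, trailing};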
Regards,
- james wiles
All generalizations are false, including this one - Mark Twain.
You can declare it, but the field had better contain a valid decimal value. See Chapter 2 of the Parallel Job Developer's Guide to learn how values are represented, or the reference that James provided.
Given that you are in France, is the decimal placeholder defined as ","? That might also invalidate your data, which uses "." as the decimal placeholder.
Check the project properties; you can override at job or column level.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
RESOLVED
Guys...
Thanks a lot for your advice. In the end it was a '.' versus ',' decimal-separator confusion, which has been rectified...
...but I have a new problem regarding schema files, which I'll explain in a new thread since it concerns a DATE.
Tony
BI Consultant - Datastage
-
- Participant
- Posts: 94
- Joined: Wed May 08, 2002 8:44 am
- Location: Germany
- Contact:
Hello Klaus,
The syntax you provided didn't work; it's failing. But here is what Tony might have meant. This worked too:
record {final_delim=end, delim=',', null_field='', quote=DOUBLE, charset='ISO8859-1'} (
field1:nullable string[max=50];
field2:nullable string[max=20];
field3:nullable string[max=50];
field4:nullable string[max=6];
field5:nullable decimal[11,0] {decimal_separator='.'};
field6:nullable decimal[11,0] {decimal_separator='.'};
field7:nullable decimal[11,0] {decimal_separator='.'};
field8:nullable decimal[21,6] {decimal_separator='.'};
field9:nullable decimal[21,6] {decimal_separator='.'}
)