Schema file error

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Post Reply
TonyInFrance
Premium Member
Posts: 288
Joined: Tue May 27, 2008 3:42 am
Location: Luxembourg

Schema file error

Post by TonyInFrance »

Hi all

I've found that quite a few people here have encountered the problem I'm about to describe but unfortunately there's been no comprehensive solution as yet.

I have a schema file which is like:

record {delim=';', final_delim=none} (
    COETF:nullable string[max=3] {null_field=' '};
    MTDVI5:nullable decimal[18,2] {default=0};
    MTDVI6:nullable decimal[18,2] {default=0};
    CSIGN:nullable string[max=1] {null_field=' '};
)

Now my two decimal fields are the ones causing problems. Initially I didn't have the default clause on them, just null_field=''. On reading recommendations here I decided to add the default clause. However, although I have valid decimals in my input file, my job refuses to read them and puts the default 0 in these fields.

What's puzzling is when I didn't have the default clause in my schema file I got a warning in the Director as follows:

Field has import error and no default value; data: {2 4 8 5 5 5 . 0 0}, at offset: 65

I must state here that my data is 248555.00, so it's clear that spaces are being added between the digits: 248555.00 is thus becoming 2 4 8 5 5 5 . 0 0.

I'm pretty sure that's why, once I added the default clause, the field is interpreted as invalid and the default 0 is inserted.

Can you guys please give me some ideas as to why spaces are being added?

Any other solutions or ideas are definitely welcome.

Cheers

Tony
Tony
BI Consultant - Datastage
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Re: Schema file error

Post by chulett »

TonyInFrance wrote:I must state here that my data is 248555.00 so its clear that spaces are being added in between digits. and 248555.00 is thus becoming 2 4 8 5 5 5 . 0 0
Just wanted to point out that no spaces are being added to the data, that's just how that particular message represents the data, i.e. with all of the characters nicely spread out like that. I guess someone thought that would be a swell diagnostic aid or something.
-craig

"You can never have too many knives" -- Logan Nine Fingers
FranklinE
Premium Member
Posts: 739
Joined: Tue Nov 25, 2008 2:19 pm
Location: Malvern, PA

Post by FranklinE »

That nullable VarChar(1) at the end looks odd to me. Is it possible that it could be interfering with reading the prior decimal fields?
Franklin Evans
"Shared pain is lessened, shared joy increased. Thus do we refute entropy." -- Spider Robinson

Using mainframe data FAQ: viewtopic.php?t=143596 Using CFF FAQ: viewtopic.php?t=157872
TonyInFrance
Premium Member
Posts: 288
Joined: Tue May 27, 2008 3:42 am
Location: Luxembourg

Post by TonyInFrance »

Craig... Are you sure about that?

Franklin - I sure hope the nullable VarChar(1) isn't the issue, because I need it there. It represents the sign '+' or '-' of the monetary fields. Furthermore, that data is being parsed without any problem. It's only the two amounts that are being interpreted as invalid.

Ray - Do you have any ideas?

Cheers guys

Tony
Tony
BI Consultant - Datastage
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

The spacing of characters in output is deliberate, so that non-printing characters can be indicated by code values, for example an embedded tab would be shown as {P A T T Y 09 C A K E} (so Craig is correct).

Another idea I have is a liking for my sleep. You posted at 3:56am my time.

You can, of course, read your decimal fields as VarChar (so that null_field="" will work) and convert them to decimal downstream of the Sequential File stage.

To see why, write to a text file using this schema file. Then inspect the text file using a tool outside of DataStage, such as the cat command.
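
[Editor's note: a hedged sketch of Ray's VarChar approach, reusing Tony's field names. The string width of 21 is an assumption, chosen to comfortably hold an 18,2 decimal plus sign and separator; the conversion back to decimal would then happen in a downstream Transformer or Modify stage.]

Code: Select all

record {delim=';', final_delim=none} (
    COETF:nullable string[max=3] {null_field=' '};
    MTDVI5:nullable string[max=21] {null_field=''};
    MTDVI6:nullable string[max=21] {null_field=''};
    CSIGN:nullable string[max=1] {null_field=' '};
)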
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
TonyInFrance
Premium Member
Posts: 288
Joined: Tue May 27, 2008 3:42 am
Location: Luxembourg

Post by TonyInFrance »

Sorry to have woken you up Ray...:-)

Thanks for the confirmation about the spaces.

As for reading my decimal fields as VarChar, I'd prefer not to do that, since I need to reject the entire record when any field doesn't conform to the requisite datatype. Of course, if I read it as a VarChar then anything will pass through.

Are you saying that there is no way to declare the field as decimal in a schema file?
Tony
BI Consultant - Datastage
jwiles
Premium Member
Posts: 1274
Joined: Sun Nov 14, 2004 8:50 pm
Contact:

Post by jwiles »

Look into the "zoned" option of the decimal datatype... you can find its description in the IS Information Center.
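
[Editor's note: for illustration, a zoned decimal declaration in a schema file would look something like the following. This is a sketch only; whether zoned representation applies here depends on how the source system actually writes the amounts.]

Code: Select all

MTDVI5:nullable decimal[18,2] {zoned};
MTDVI6:nullable decimal[18,2] {zoned};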

Regards,
- james wiles


All generalizations are false, including this one - Mark Twain.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

You can declare it, but the field had better contain a valid decimal value. See Chapter 2 of the Parallel Job Developer's Guide to learn how values are represented, or the reference that James provided.

Given that you are in France, is the decimal placeholder defined as ","? That might also invalidate your data, which uses "." as the decimal placeholder.
Check the project properties; you can override at job or column level.
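
[Editor's note: by way of example, a per-field override in the schema file might look like this — a sketch, assuming the file always uses '.' as the decimal separator regardless of the project locale.]

Code: Select all

MTDVI5:nullable decimal[18,2] {decimal_separator='.'};
MTDVI6:nullable decimal[18,2] {decimal_separator='.'};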
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
TonyInFrance
Premium Member
Posts: 288
Joined: Tue May 27, 2008 3:42 am
Location: Luxembourg

Post by TonyInFrance »

Will check on these two leads on Monday.

Thanks Ray and jwiles...:-)
Tony
BI Consultant - Datastage
TonyInFrance
Premium Member
Posts: 288
Joined: Tue May 27, 2008 3:42 am
Location: Luxembourg

RESOLVED

Post by TonyInFrance »

Guys...
Thanks a lot for your advice. In the end it was a '.' versus ',' decimal separator confusion, which has been rectified...

...but I have a new problem regarding schema files, which I'll explain in a new thread since it concerns a DATE.
Tony
BI Consultant - Datastage
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Vive la France!
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
koti9
Participant
Posts: 52
Joined: Wed Nov 09, 2005 10:51 am

Post by koti9 »

Hi Tony,

May I know how you handled the decimals in the schema file for the above issue?


Thanks & Regards
Koti
Klaus Schaefer
Participant
Posts: 94
Joined: Wed May 08, 2002 8:44 am
Location: Germany
Contact:

Post by Klaus Schaefer »

Not sure if this is what he did, but in my experience you have to add a clause like {text, width=15} to get it to work.

i.e.: mydecfield:decimal[15,0] {text, width=15};

Best regards
Klaus
koti9
Participant
Posts: 52
Joined: Wed Nov 09, 2005 10:51 am

Post by koti9 »

Hello Klaus,

The syntax you provided didn't work; it's failing. But here is what Tony might have meant... this worked too:

record { final_delim=end , delim=',', null_field='', quote=DOUBLE, charset='ISO8859-1' } (
field1:nullable string[max=50];
field2:nullable string[max=20];
field3:nullable string[max=50];
field4:nullable string[max=6];
field5:nullable decimal[11,0] {decimal_separator='.'};
field6:nullable decimal[11,0] {decimal_separator='.'};
field7:nullable decimal[11,0] {decimal_separator='.'};
field8:nullable decimal[21,6] {decimal_separator='.'};
field9:nullable decimal[21,6] {decimal_separator='.'}
)
Post Reply