
Binary Zeros

Posted: Wed Aug 04, 2004 5:17 am
by fridge
I want to read a binary zero out to a flat file and am having problems.

Does anyone have any information on how to approach this?

Thanks.

Posted: Wed Aug 04, 2004 5:53 am
by mleroux
Use some stages and links.

Posted: Wed Aug 04, 2004 7:20 am
by fridge
Surprisingly enough, I have tried this! I have a field that, when it contains a particular value, I need to set to a binary zero and pass to a flat file. In DS it appears that @NULL is not a binary zero, and when I try to use the MY/MB type functions to achieve a binary zero, it does not work.

Posted: Wed Aug 04, 2004 7:41 am
by mleroux
OK, good! I really don't do well without specifics. :wink: I tried the following:

Code:

Oconv("0", "MB0C")
And it seemed to work fine. The output for zero in binary (ASCII encoding) was 00110000, which is correct (try Alt+048, it'll give you a 0).
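The distinction between the two byte values can be checked quickly outside DataStage; a minimal Python sketch (independent of any DataStage function):

```python
# ASCII '0' is code point 48, i.e. the bit pattern 00110000 --
# a printable character, not a binary zero.
assert ord("0") == 0b00110000 == 48

# The binary zero (NUL) byte is a different value entirely:
assert b"\x00"[0] == 0b00000000 == 0
```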

Posted: Wed Aug 04, 2004 7:57 am
by fridge
Thanks for the response - your solution returns an ASCII '0', whereas I need to return not 0011 0000 but 0000 0000!

Any ideas?

Posted: Wed Aug 04, 2004 11:12 am
by ogmios
To my knowledge it's impossible with DataStage... It's a shortcoming in 99% of all applications on UNIX: NUL (0x00) marks the end of a string in the C library, and most applications that use the standard string functions will cut a string off at that byte.

What reason do you have to write NUL to a file?

Ogmios.
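The C-string behaviour Ogmios describes can be demonstrated outside DataStage. A minimal Python sketch (the file name is made up for the example): writing the byte is fine with binary-mode I/O; it is NUL-terminated string handling that truncates.

```python
import ctypes
import os
import tempfile

data = b"AB\x00CD"  # five bytes, with a NUL in the middle

# Binary-mode file I/O has no problem with the NUL byte:
path = os.path.join(tempfile.gettempdir(), "nul_demo.bin")  # hypothetical name
with open(path, "wb") as f:
    f.write(data)
with open(path, "rb") as f:
    assert f.read() == b"AB\x00CD"  # all five bytes round-trip intact

# C-style (NUL-terminated) string handling stops at 0x00,
# which is the truncation described above:
truncated = ctypes.create_string_buffer(data).value
assert truncated == b"AB"
```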

Posted: Wed Aug 04, 2004 4:25 pm
by ketfos
Hi,
If the idea is to write a null value, use the DataStage system variable @NULL.

Ketfos

Posted: Thu Aug 05, 2004 2:15 am
by fridge
Thanks for the help Ogmios/Ketfos.

The reason I need to write a binary zero to a file is that I am interested in converting a value to packed decimal. My code works for any value unless one or more of the packed bytes is a binary zero/null.

So for example the value 6476 packs to the correct ASCII chars 'dv': 0x64 is the hex value of ASCII 'd' and 0x76 is the hex value of ASCII 'v'. But the conversion of the value 4000 grinds to a halt: the 0x40 byte converts fine to ASCII '@', but every time I hit a null byte (0x00 in hex) the process fails.
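The packing described above can be sketched in Python. This is a simplified unsigned pack (two BCD digits per byte, high nibble first), without the trailing sign nibble that mainframe COMP-3 packed decimal uses:

```python
def pack_bcd(digits: str) -> bytes:
    # Pair up decimal digits, two per byte, high nibble first.
    # Pad with a leading zero if the digit count is odd.
    if len(digits) % 2:
        digits = "0" + digits
    out = bytearray()
    for i in range(0, len(digits), 2):
        hi, lo = int(digits[i]), int(digits[i + 1])
        out.append((hi << 4) | lo)
    return bytes(out)

print(pack_bcd("6476"))  # b'dv'    -- 0x64 ('d'), 0x76 ('v')
print(pack_bcd("4000"))  # b'@\x00' -- 0x40 ('@'), then the NUL byte
```

The second result shows exactly where the trouble starts: 0x40 is the printable '@', while 0x00 is the NUL byte that string-based handling cuts off.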