
Unable to write to a sequential file

Posted: Wed Feb 22, 2012 8:22 am
by hiral.chauhan
Hello everyone,
I tried searching this forum for a post/topic similar to my problem, but did not have much luck. Please forgive me if I have missed anything :)

Ok so, I have a job that reads data from a sequential file. It then does some lookups and transformations and then writes to another sequential file. I have an ExecSH command that strips out nulls from the generated output file (see below):

tr -d '\000' < /myprod_targetdirectory/xfilename.txt > /myprod_targetdirectory/filename.txt | rm /myprod_targetdirectory/xfilename.txt

For some reason, the first two times it ran in production the file did not get created and I got the message below. But the file was created the third time it ran (i.e. today).

"Executed command: tr -d '\000' < /myprod_targetdirectory/xfilename.txt > /myprod_targetdirectory/filename.txt | rm /myprod_targetdirectory/xfilename.txt
*** Output from command was: ***
SH: /myprod_targetdirectory/xfilename.txt: No such file or directory"

The job completed successfully without any error, and the message above is NOT a warning. There are two more jobs that execute the same kind of ExecSH command for a different file, and their target directory is the same, so I rule out a permission issue here, but maybe I am wrong.

I can work around the issue by making the job check whether the file was created and re-running it until it is, since the file did get created on the third run. But I need to understand why the file was not created the first two times.

Dear experts, I humbly request your help in finding out why the job did not create the file, and what should be done when something like this happens. Should I remove the ExecSH, handle the nulls within the job, and see whether the rm command had anything to do with it? Or should I check whether the job created the file and re-run it if not?

I appreciate your help in this!

Thanks,
Hiral

Posted: Wed Feb 22, 2012 8:34 am
by chulett
Haven't given this much thought, but the first thing I noticed was that you are piping the two commands together. Replace the pipe with && for a proper conditional break between the two.
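
With the paths from your post, that would look something like this (the rm then only runs if the tr completes successfully):

Code: Select all

tr -d '\000' < /myprod_targetdirectory/xfilename.txt > /myprod_targetdirectory/filename.txt && rm /myprod_targetdirectory/xfilename.txt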

Posted: Wed Feb 22, 2012 8:49 am
by hiral.chauhan
Thanks Craig, let me implement this and check.

Actually, I have trouble re-creating the scenario... it almost always creates the file in the Development/QA environments. (Wonder why things behave funny in Production :( )

Posted: Wed Feb 22, 2012 10:28 am
by pandeesh
First, check whether your Unix command is working as expected.

Posted: Wed Feb 22, 2012 6:17 pm
by qt_ky

Code: Select all

echo "Some OSs will let you separate commands by semicolon also" ; echo "What's the plural of OS, anyway?"

Posted: Wed Feb 22, 2012 6:20 pm
by chulett
I believe that all will. I specifically mentioned && as it is a conditional separator, meaning the second command will only execute if the first 'succeeds'. With a semi-colon the second one executes regardless of the fate of the first. Use whichever is appropriate.
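
A quick sketch of the difference, using the standard false command (which always exits with a non-zero status):

Code: Select all

false && echo "never printed - && stops because the first command failed"
false ; echo "printed anyway - the semicolon ignores the first command's exit status"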

OSes. :wink:

Posted: Wed Feb 22, 2012 6:26 pm
by qt_ky
Got it... thanks.

Posted: Wed Feb 22, 2012 7:15 pm
by ray.wurlod
There's also || which executes the downstream command only if the upstream command fails.
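
For example, again with the standard true and false commands:

Code: Select all

false || echo "printed - the first command failed"
true || echo "never printed - the first command succeeded"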

OSs (there's no second e in Systems except in French)

Posted: Tue Feb 28, 2012 1:36 pm
by hiral.chauhan
Hello Everyone,

I implemented your advice of replacing the pipe character with "&&" and it is working like a charm.
But my problem is that the business wants me to handle the [NUL] or 0x000 characters without using the Unix command.

DataStage by default pads with 0x0 when a sequential file is generated. I am trying to remove this padding. So far I have tried routes like:

1. Set APT_STRING_PADCHAR = 0x20, then tried to strip the spaces later with the Trim function, but failed since my target is Char.
2. Set APT_STRING_PADCHAR = 0x20, then used the Convert() function, Convert("Char(20)","",Myfieldname), to remove the spaces, but failed again.

I am not sure what I am doing wrong, or maybe I am completely on the wrong track here. It would be a great help if you could shed some light or advise on how to get this issue resolved...

I am running out of ideas now...

Posted: Tue Feb 28, 2012 2:40 pm
by ray.wurlod
0x20 is 32 in decimal. You should have used Char(32) rather than Char(20), but " " is actually more efficient. None of these will handle ASCII NUL, however, which is Char(0) - and is not the same as NULL (unknown value).

Code: Select all

Convert(" ", "", NullToEmpty(MyFieldValue))

Posted: Tue Feb 28, 2012 2:46 pm
by Mike
hiral.chauhan wrote:my target is Char
Char is a fixed-width data type. Your attempt to use Trim and Convert is fruitless since the result will just be padded back to the required width.

If you don't want the pad characters, then use the Varchar data type.

Mike

Posted: Wed Feb 29, 2012 3:37 pm
by hiral.chauhan
Hi Ray, I implemented

Code: Select all

Convert(" ", "", NullToEmpty(MyFieldName))
after setting APT_STRING_PADCHAR = 0x20, but I still see spaces padded in the file. :(
Am I doing something wrong?

I also tried - Convert(Char(32), "", MyFieldName) but still no luck...

Hi Mike, I used Varchar instead of Char but it is still padding spaces (or nulls, depending on the pad char environment variable) between the columns...

For example, below is my required output:

00010562012021600001755963F0000333

but in Notepad++ it shows me the output below when I specify APT_STRING_PADCHAR = 0x20... it shows me [NUL] when APT_STRING_PADCHAR is left at its default.

0001056 20120216 00001755963 F 0000333

I cannot see the spaces or [NUL] in regular Notepad...
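
On the Unix side, something like the od command can at least show exactly which characters end up between the columns (same target file as above):

Code: Select all

od -c /myprod_targetdirectory/filename.txt | head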

Posted: Wed Feb 29, 2012 10:10 pm
by ray.wurlod
Have you defined " " as the field delimiter on the Format properties of the sequential file stage?

Posted: Thu Mar 01, 2012 9:47 am
by hiral.chauhan
Hi Ray,

No I have not defined " " as the field delimiter on the Format properties.

Should I?

Thanks,
Hiral

Posted: Thu Mar 01, 2012 10:10 am
by Mike
No... Ray asked that because your output looks as if that could've been the case.

Are you by chance moving a Decimal to a Varchar?

Mike