issue with extra line feeds

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

hobocamp
Premium Member
Posts: 98
Joined: Thu Aug 31, 2006 10:04 am

issue with extra line feeds

Post by hobocamp »

I've seen some replies to problems similar to this, but haven't yet been quite able to make anything work.

We have a sequential file coming from a vendor in which a particular column occasionally contains an LF character (X'0A'), even though the end of the row has not actually been reached. Of course that causes my DS job to treat the data as two rows.

I've seen suggestions to use the Ereplace and Convert functions to replace it with a space, but so far nothing I've tried gets rid of it. Due to the nature of the data I can always recognize where the extra LF is, but I haven't come up with a way that actually replaces it.

Thanks for any suggestions.

Tom
major
Premium Member
Posts: 167
Joined: Mon Nov 26, 2007 12:21 am

Post by major »

hi,

The Ereplace and Convert functions should work.
Can you post your syntax here, and also check whether any other columns contain extra LFs?
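
Something along these lines normally does it in a Transformer derivation (the link and column names below are only examples):

   * Replace each embedded LF with a space
   Ereplace(InLink.MyCol, Char(10), " ")

   * Or as a character-for-character substitution
   Convert(Char(10), " ", InLink.MyCol)

Note that Convert() takes the characters to replace first, the replacement characters second, and the string last, at least in the flavour I've always used.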

Thanks
Major
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Scroll to the right in the Columns grid in your Sequential File stage and set the "Contains Terminators" property for the field(s) in question.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
hobocamp
Premium Member
Posts: 98
Joined: Thu Aug 31, 2006 10:04 am

Post by hobocamp »

Thanks for the suggestions so far.

I think I'm getting closer, but in effect my ultimate goal is to rejoin two rows that have been artificially broken apart by the wayward LF.

The file is delimited, so I use the Count function to identify the rows with the extra LF. For those rows, my syntax is Convert(FromSrc.FullRow, Char(10), " ").
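
Spelled out a bit more, the kind of derivation I've been experimenting with looks roughly like this (the comma delimiter and the count of 12 are just placeholders for the real layout, and I've tried both Ereplace and Convert here):

   * Only touch rows that come up short on delimiters, i.e. rows split by a stray LF
   If Count(FromSrc.FullRow, ",") < 12 Then Ereplace(FromSrc.FullRow, Char(10), " ") Else FromSrc.FullRow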

My thought was that if I removed the LF from the input rows in question, the output target would then contain the broken records joined back together. Does it sound like I'm on the right track? I'm getting the feeling that I may be approaching this incorrectly.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

You'll first need to implement what Ray noted, simply so you can read each one as a single record. After that you can do whatever is needed to remove the embedded LFs, if you still feel the need, including the Convert() that you posted.
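
If you do still want to scrub them once the rows are coming in whole, something along these lines would do it (the column name is just an example):

   * Delete any embedded LF or CR characters outright
   Convert(Char(10) : Char(13), "", InLink.TheColumn)

With an empty to-list, Convert() simply drops the characters rather than substituting anything.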
-craig

"You can never have too many knives" -- Logan Nine Fingers
hobocamp
Premium Member
Posts: 98
Joined: Thu Aug 31, 2006 10:04 am

Post by hobocamp »

Thanks Craig. I'll give Ray's suggestion a try. I had been attempting to read each row in as one long string, thinking that might be a better way to deal with the extra LFs. Instead, I'll use the actual file definition with the "Contains Terminators" option set to 'Yes'. I'll update with my results.

Thanks again.