Hi All,
I am extracting approximately 50 columns from a database to a fixed-width flat file. According to my requirement I have to add a header and a trailer record. The header record contains 2 columns (type and creation date) and this should be the first row in the file. The trailer record contains 2 columns: type 'T' and a record count (including header and trailer). A carriage return should be added to the end of each record in the file.
Is there a way to add Char(10) to all 50 columns at once for the carriage return?
I hope someone can help with some ideas.
Thanks
Adding head and tail recs
Moderators: chulett, rschirm, roy
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
In the controlling job, or in a before-job subroutine, write the header record to the file.
Within the job itself use "append to file" as the rule in the target Sequential File stage.
In the controlling job, or in an after-job subroutine, write the trailer record to the file. The count can be obtained using DSGetStageInfo or DSGetLinkInfo functions.
Why do you want to add a newline character to every column?!!
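As a sketch of the approach above, here is what the three steps look like outside DataStage (file name and record layouts are hypothetical; in the real jobs the count would come from DSGetStageInfo/DSGetLinkInfo rather than re-counting the file):

```shell
#!/bin/sh
# Hypothetical sketch of the header/detail/trailer approach.
FILE=target.dat

# Before-job: write the header record (type 'H' plus creation date).
printf 'H%s\n' "$(date +%Y%m%d)" > "$FILE"

# The job itself appends the fixed-width detail records; simulate two.
printf 'D000000001\n' >> "$FILE"
printf 'D000000002\n' >> "$FILE"

# After-job: trailer record -- type 'T' plus a count that includes
# the header and the trailer itself (current lines + 1 for the trailer).
COUNT=$(( $(wc -l < "$FILE") + 1 ))
printf 'T%09d\n' "$COUNT" >> "$FILE"
```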
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Participant
- Posts: 3593
- Joined: Thu Jan 23, 2003 5:25 pm
- Location: Australia, Melbourne
- Contact:
Within the output Sequential File stage you will see a combo box called "Line Termination" with the options "Unix Style (LF)" and "DOS style (CR LF)"; that may fix your line termination problems. It sounds like you are having trouble reading a Unix-style file on a Windows system, so just switch the job across to DOS style.
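For anyone unsure what the two settings actually produce, here is a small shell illustration (the stage does this conversion for you; `sed` with a literal carriage return is just used here to show the difference):

```shell
#!/bin/sh
# Write two records Unix-style (LF only), then make a DOS-style
# copy (CR LF) by appending a carriage return to each line.
printf 'REC1\nREC2\n' > unix.txt

CR=$(printf '\r')                 # a literal carriage-return character
sed "s/\$/$CR/" unix.txt > dos.txt

# od -c shows \n line endings for unix.txt and \r \n for dos.txt.
od -c unix.txt
od -c dos.txt
```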
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn: Vincent McBurney
Thanks for the replies.
Ray, can you please explain in detail, as I am new to BASIC routines, how I am going to add the header record with hardcoded value 'H' and the creation date to the first line of my target flat file and leave the rest of the columns empty? The detail rows should fill with the 50 column values from the second row onwards, and finally the trailer with hardcoded value 'T' and the number of records including the header and trailer.
Coming to the carriage return, I don't know why they want it that way, but my requirement clearly says a CR is needed after every record.
Would it work if I extract 3 streams from the Transformer, with header, detail and trailer in 3 files, and use a Link Collector to collect them all at the end?
thanks,
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
What's the format of the target, which, because of "their" requirement, I am assuming is a text file?
You don't need to write any BASIC for the header row. You could use ExecSH as the before-job subroutine, to execute a command such as
Code: Select all
echo H`date +"%Y%m%d"` > #FileName#
where #FileName# is a reference to a job parameter.
The job itself extracts information from the database and appends lines to the file referred to as #FileName#. The Format tab on the Sequential File stage allows you to specify the line terminator character (though it is unusual to have line terminator characters with fixed-width data). Choose "UNIX style" to append a Char(10) to each line automatically.
The job itself could also keep a row count, as a separate output stream through an Aggregator stage with Last as its aggregate rule, writing this to another file, say File2. Perhaps generate the value using a stage variable, initialized to 2 (for the header and trailer) and incremented by 1 for every row processed.
Then, in an after-job subroutine, again use ExecSH to execute a command such as
Code: Select all
echo -n T >> #FileName# && cat File2 >> #FileName#
The -n on echo suppresses generation of a line terminator; if your UNIX doesn't support the -n option, find an equivalent.
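Put together, with the detail rows and File2 simulated, the before-job and after-job commands behave like this (`FileName.dat` stands in for the #FileName# job parameter, and `printf 'T'` is used as a portable equivalent of `echo -n T`):

```shell
#!/bin/sh
FILE=FileName.dat     # stands in for the #FileName# job parameter

# Before-job subroutine (ExecSH): header record, type 'H' plus date.
echo "H$(date +%Y%m%d)" > "$FILE"

# The job appends its detail rows; simulate three of them here.
for i in 1 2 3; do echo "DETAIL$i" >> "$FILE"; done

# The job's second output stream writes the count to File2:
# 3 detail rows + 2 for the header and trailer records.
echo 5 > File2

# After-job subroutine (ExecSH): trailer record, type 'T' plus count.
printf 'T' >> "$FILE" && cat File2 >> "$FILE"
```

Writing the count from within the job (File2) rather than re-counting the file afterwards means the trailer stays correct even if something else touches the file between steps.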
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.