
How to ignore lines at the beginning and end of a file

Posted: Fri Aug 05, 2005 11:20 am
by ncsridhar
I was wondering how we could skip a predetermined number of lines in a sequential file. We have integrity checks, such as the number of rows of data and/or a checksum of the file, marked up at the very end (or sometimes even at the beginning) of the file. How can we make the Sequential File stage skip a few lines and read from a given line number?

I know we can ask the stage to skip the first line if it contains the column names...


Thanks.

Re: How to ignore lines at the beginning and end of a file

Posted: Fri Aug 05, 2005 11:43 am
by chitra
You can handle it separately: use a shell script to remove the unwanted lines, then invoke the dsjob.
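As a rough sketch of that pre-processing idea (not DataStage code — the sample data and counts below are hypothetical), a small script can strip a fixed number of header and trailer lines before the job ever reads the file:

```python
# Strip a fixed number of header and trailer lines from a file's
# contents before the DataStage job reads it. Counts are hypothetical.

def strip_header_trailer(lines, n_header, n_trailer):
    """Return lines with the first n_header and last n_trailer removed."""
    end = len(lines) - n_trailer if n_trailer else len(lines)
    return lines[n_header:end]

if __name__ == "__main__":
    # Hypothetical file with a checksum header and a row-count trailer.
    raw = ["CHECKSUM 1234", "row1", "row2", "row3", "ROWCOUNT 3"]
    print(strip_header_trailer(raw, 1, 1))  # only the data rows remain
```

The same effect could be had with sed or awk in the wrapper script; the point is simply to hand the Sequential File stage a file that contains only data rows.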

ncsridhar wrote:I was wondering how we could skip a predetermined number of lines in a sequential file. We have integrity checks, such as the number of rows of data and/or a checksum of the file, marked up at the very end (or sometimes even at the beginning) of the file. How can we make the Sequential File stage skip a few lines and read from a given line number?

I know we can ask the stage to skip the first line if it contains the column names...


Thanks.

Posted: Fri Aug 05, 2005 11:46 am
by DaleK
Run the data through a Transformer.
If the records include a record type, then only pass the data records down an output link. We do this for files that have header and trailer records.

If you don't have a record type associated with each record, I would still run the file through a Transformer and then use @INROWNUM in the constraint, e.g. @INROWNUM > 5.
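In Python terms, a constraint like @INROWNUM > 5 behaves roughly like the filter below (an illustrative analog only — @INROWNUM is a DataStage system variable, and 5 here is just an assumed header-line count):

```python
# Rough analog of the Transformer constraint @INROWNUM > 5:
# a row passes to the output link only once the 1-based input
# row counter exceeds the number of header lines.

def filter_rows(rows, skip=5):
    out = []
    for inrownum, row in enumerate(rows, start=1):  # @INROWNUM is 1-based
        if inrownum > skip:  # the constraint expression
            out.append(row)
    return out
```

Note this skips a leading block only; trailer rows would still need a record-type test or a separate pass, since the stage does not know the row count in advance.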

Posted: Fri Aug 05, 2005 11:50 am
by kumar_s
Hi,
you can use the @INROWNUM / @OUTROWNUM system variables in the Transformer to count the number of rows passed into or out of it.
You can set the constraint or condition in such a way that it restricts output to the desired range or number of rows.
There is also another way to skip the later part of the file: abort the flow once your target number of records has been reached. This option, too, is available in the Transformer constraints.

regards
kumar

Posted: Fri Aug 05, 2005 11:57 am
by kumar_s
DaleK wrote:Run the data through a Transformer.
If the records include a record type, then only pass the data records down an output link. We do this for files that have header and trailer records.

If you don't have a record type associated with each record, I would still run the file through a Transformer and then use @INROWNUM in the constraint, e.g. @INROWNUM > 5.
Oh! I hadn't refreshed my page,
so this just repeats Dale's answer.

regards
kumar

Posted: Fri Aug 05, 2005 12:51 pm
by ncsridhar
Great responses, guys! I haven't tried the solution yet because of another hold-up, but I am sure it will work.

Though I was just wondering: what was your source for this type of information when you first encountered this problem? The built-in help is often not all that helpful! What's the best place to get information?


Thanks.

Posted: Fri Aug 05, 2005 1:05 pm
by kumar_s
ncsridhar wrote:Great responses, guys! I haven't tried the solution yet because of another hold-up, but I am sure it will work.

Though I was just wondering: what was your source for this type of information when you first encountered this problem? The built-in help is often not all that helpful! What's the best place to get information?


Thanks.
For me, my best teacher is DSXchange.
Hats off to DSX.

regards
kumar

Grab date from report header and put in table

Posted: Tue Jan 17, 2006 12:33 pm
by shyamsrp
I am trying to pull data from a report file into a table (Oracle 9). The report file has a header and a trailer. I managed to filter out the header and trailer rows (based on @INROWNUM) and pull the data rows I needed into the Oracle table.

I have a requirement to grab the DATE from the header and put it into the TARGET_DATE column of the table for all the rows coming in from the file. I know the row number and column of the report file that gives me the DATE. Is there any way to do this in the same Transformer where I am filtering the header/trailer?

thanks in advance.
SRP



Posted: Tue Jan 17, 2006 2:33 pm
by kcbland
Use a stage variable to hold the value from the row that contains the date.

Posted: Tue Jan 17, 2006 3:04 pm
by shyamsrp
Thanks a lot for your time and effort.

Actually, I am throwing out the row that contains the DATE, as it is header data. Also, since this is a fixed-width file, I have some IDs in the same column as the DATE where rownum > 10 (which I need to bring over to the DB).

Is there a way to pick COL2 WHERE ROWNUM = 4, store the value somewhere, and map it to DATE in the target for all the incoming rows? COL2 of the other rows contains different data.

This is the report-generated date, which should be added to all the rows before inserting into the table.

Thanks a lot.

kcbland wrote:Use a stage variable to hold the value from the row that contains the date.

Posted: Tue Jan 17, 2006 3:16 pm
by kcbland
shyamsrp wrote:I am throwing out the row that contains the DATE, as it is header data
Don't throw it out, parse it and update a stage variable. Use the stage variable in the derivation of the target column.
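The stage-variable idea can be sketched in Python like this (an illustrative analog, not Transformer code — the row position and column slice are assumptions for the example):

```python
# Sketch of kcbland's suggestion: as rows stream through, remember the
# date parsed from the header row (the "stage variable") and attach it
# to every data row. Header row position and slice are hypothetical.

def add_report_date(lines, date_row=1, date_slice=(0, 10)):
    report_date = None  # plays the role of the stage variable
    out = []
    for rownum, line in enumerate(lines, start=1):
        if rownum == date_row:
            # Parse the header and update the "stage variable";
            # the header row itself is not sent to the output link.
            report_date = line[date_slice[0]:date_slice[1]].strip()
            continue
        out.append((line, report_date))  # derivation uses the variable
    return out
```

This works because stage variables in a Transformer are evaluated per row and retain their value across rows, so a value captured from an early row is still available when later rows are processed.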

Posted: Tue Jan 17, 2006 3:53 pm
by shyamsrp
Thanks for the quick reply, kcbland.
I am loading that particular column of that row into a hash file and doing a lookup to bring that value into the target. It's working; this value is different for different source files. Thanks again.
kcbland wrote:
shyamsrp wrote:I am throwing out the row that contains the DATE, as it is header data
Don't throw it out, parse it and update a stage variable. Use the stage variable in the derivation of the target column.
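The hash-file variant shyamsrp landed on can be modeled in Python with a plain dictionary (again an analog only — the file name and date value below are made up for the example):

```python
# Rough analog of the hash-file lookup: the header date is written to a
# hash file keyed by source file, then looked up when writing each data
# row. The key and value here are hypothetical.

hash_file = {"report_20060117.txt": "2006-01-17"}  # key -> report date

def target_rows(source_file, data_rows):
    report_date = hash_file[source_file]  # the lookup
    return [(row, report_date) for row in data_rows]
```

Compared with the stage-variable approach, this costs an extra pass to populate the hash file, but it decouples the date capture from the row filtering, which is why it copes naturally with a different date per source file.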