Processing input data in batches

Hi, I have a requirement to process input records in batches of 100,000 records into the final output file, in order to resolve a performance issue. The source holds 15 months' worth of records, so we are extracting all the transaction records and limiting the final output file to 100,000 records per day. Please suggest an approach.
Do you mean that you wish to process your input in batches of 100,000 or that you wish to parcel out your output to files limited to 100,000 lines each? If you create one output file you could use the UNIX command "split" to effect that.
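To illustrate ArndW's suggestion, here is a minimal sketch of `split` limiting a file to 100,000-line chunks. The filename `transactions_extract.txt` and the `chunk_` prefix are assumptions; the sample data is generated with `seq` just so the commands run end to end.

```shell
# Stand-in for the real 15-month extract: 250,000 numbered lines.
# (transactions_extract.txt is an assumed filename.)
seq 250000 > transactions_extract.txt

# Split into chunks of at most 100,000 lines each:
# produces chunk_aa, chunk_ab, chunk_ac in the current directory.
split -l 100000 transactions_extract.txt chunk_

# Sanity check: two full 100,000-line chunks plus one 50,000-line remainder.
wc -l chunk_*
```

Each chunk can then be fed to a separate daily run, or archived until its scheduled day.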
Thanks for the quick response.

ArndW wrote: Do you mean that you wish to process your input in batches of 100,000 or that you wish to parcel out your output to files limited to 100,000 lines each? If you create one output file you could use th ...
I mean I wish to process my input in batches of 100,000. Could you please help me achieve this?
I assume your input file is bigger than 100,000 rows. Isn't doing several runs, each processing 100,000 [different] rows to separate output files the same as doing one run and then splitting the output into chunks of 100,000? The latter would be so much easier and less susceptible to errors.
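Following the "one run, then split" approach, a daily job could then consume one chunk per day. The sketch below assumes the chunks already exist with a `chunk_` prefix; `process_batch` is a hypothetical placeholder for the real per-batch job, and the `.done` marker files are an assumed bookkeeping convention.

```shell
# Hypothetical per-batch job; replace the body with the real processing step.
process_batch() {
    echo "processing $1 ($(wc -l < "$1") records)"
}

# One run per day: pick up the next unprocessed chunk, then stop.
for chunk in chunk_*; do
    [ -e "$chunk.done" ] && continue    # already handled on a previous day
    process_batch "$chunk"
    touch "$chunk.done"                 # marker so the next run skips this chunk
    break                               # at most one 100,000-row batch per run
done
```

Running the script once a day works through the backlog at the required rate of 100,000 records per day, without the main extract job needing any batching logic of its own.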
Monaz, please - why are you starting over again on the same subject as your previous post?
-craig
"You can never have too many knives" -- Logan Nine Fingers