sequential:50 out of 200
-
- Participant
- Posts: 46
- Joined: Mon Sep 24, 2007 12:37 am
- Location: INDIA
Hello Everybody,
I have a particular job in which I need to send only the first 50 rows of records (out of 200) from the source plugin to the other plugins. Can anyone tell me the simplest way to do this?
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
-
- Participant
- Posts: 46
- Joined: Mon Sep 24, 2007 12:37 am
- Location: INDIA
Hi,
ray.wurlod wrote: If it's in sequential mode use a Filter command that invokes the head command. Otherwise use a downstream Head stage. ...
Thanks for your reply. I tried this, but I'm not getting it to work correctly. Can you walk me through the steps?
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
-
- Participant
- Posts: 46
- Joined: Mon Sep 24, 2007 12:37 am
- Location: INDIA
Ramani wrote: I think this has something to do with the partitioning, because the property Number of Rows to Copy from input to output is per partition. So what is the partition type here? Is it "Entire"?
Thanks,
Ramani.
Thanks for your reply. The partitioning used here is round robin. I don't think Entire partitioning will work for this job...
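Ramani's point about the per-partition row count can be illustrated outside DataStage. The sketch below (plain shell, not DataStage code; the filename and the 4-node configuration are hypothetical) uses awk to mimic a round-robin split of 200 rows across 4 nodes, then applies a 50-row head on each partition:

```shell
# Mimic round-robin partitioning of 200 rows across 4 "nodes",
# then take the first 50 rows of EACH partition, as a per-partition
# Head/"rows to copy" setting would.
seq 1 200 > rows.txt          # stand-in for the 200-row source
for n in 0 1 2 3; do
  # awk keeps every 4th line for this node (round-robin style split),
  # head -50 then limits that partition to 50 rows
  awk -v n="$n" 'NR % 4 == n' rows.txt | head -50 | wc -l
done
# Each of the four nodes reports 50 rows: 200 rows in total, not the
# 50 the job actually wants, which is why a global limit must be
# applied before (or instead of) partitioning.
```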
-
- Premium Member
- Posts: 99
- Joined: Mon Sep 03, 2007 7:49 am
- Location: Stockholm, Sweden
If I am not misunderstanding this entirely, you want to read the first 50 rows from a sequential file?
Using a Sequential File stage (in sequential mode, with one reader) and a filter with the specification head -50 will give you a single node containing the first 50 records of your file. If you later decide to partition this data for some processing, that's fine: you will only be processing those 50 rows spread across your configuration.
You could also use an External Source stage with the command head -50 <filename> for the same effect (remember not to read the data using more than one node, because that would probably give you 50 records per node).
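You can sanity-check the head -50 filter outside DataStage before wiring it into the stage. A minimal sketch (the filenames here are hypothetical stand-ins, not from the original job):

```shell
# Verify that the same filter the Sequential File stage would invoke
# keeps exactly the first 50 of 200 rows.
seq 1 200 > source_rows.txt           # stand-in for the 200-row source file
head -50 source_rows.txt > first_50.txt
wc -l < first_50.txt                  # prints 50
tail -1 first_50.txt                  # prints 50, i.e. the 50th source row
```

The same command string goes into the Filter property of the Sequential File stage (or, with the filename appended, into an External Source stage).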
-------------------------------------
http://it.toolbox.com/blogs/bi-aj
my blog on delivering business intelligence using agile principles