I have an urgent requirement where I have to split a record into multiple records based on the value present in one column, called 'Quantity'.
e.g.

Input
-----
col1  Quantity  Price
----  --------  -----
1     2         p1
2     5         p2

Output
------
col1  Quantity  Price
----  --------  -----
1     1         p1/2
1     1         p1/2
2     1         p2/5
2     1         p2/5
2     1         p2/5
2     1         p2/5
2     1         p2/5
What is happening here is: if a product is sold 2 times, we need to insert two entries for that product in the target, and the price value of each entry would be Price/Quantity.
I don't know how to accomplish this task in DataStage, or through a SQL query.
If anybody has any idea, kindly help.
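To make the requirement concrete, here is a minimal Python sketch of the transformation being asked for (the function name and the sample prices are my own, chosen to match the example above):

```python
# Minimal sketch of the required split: each input record with
# quantity q becomes q output records, each with quantity 1 and
# price divided by q. Column layout follows the example above.
def split_by_quantity(rows):
    out = []
    for col1, quantity, price in rows:
        for _ in range(quantity):
            out.append((col1, 1, price / quantity))
    return out

rows = [(1, 2, 10.0), (2, 5, 25.0)]
print(split_by_quantity(rows))
```

With quantities 2 and 5 this yields 7 output rows, matching the example.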
Splitting 1 record into multiple record based on Quantity
This is an all-volunteer site, so words like "urgent", "asap", "immediate" etc. do not have any effect.
All threads posted are urgent to the respective individuals.
If you want an immediate response, you can sign up for 24x7 support from your official support provider.
By the way, to achieve what you want, you can use the LEVEL functionality in Oracle (CONNECT BY LEVEL) or write a BuildOp in DataStage.
If the quantity will always be less than 10, you can include that many output links and funnel them together.
You can also use a dummy join to a Row Generator to generate the required number of rows.
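The "dummy join to a row generator" idea can be sketched in portable SQL. This is not Oracle's CONNECT BY LEVEL syntax; it is the equivalent numbers-table join, demonstrated here with SQLite through Python, with table and column names assumed from the question:

```python
import sqlite3

# Sketch of the row-generator join: a small numbers table is joined
# to the source on n <= quantity, producing quantity copies of each
# row, each priced at price/quantity. Table and column names are
# illustrative assumptions, not from the original poster's schema.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src (col1 INTEGER, quantity INTEGER, price REAL);
INSERT INTO src VALUES (1, 2, 10.0), (2, 5, 25.0);
CREATE TABLE numbers (n INTEGER);
INSERT INTO numbers VALUES (1),(2),(3),(4),(5),(6),(7),(8),(9),(10);
""")
rows = con.execute("""
SELECT s.col1, 1 AS quantity, s.price / s.quantity AS price
FROM src s
JOIN numbers n ON n.n <= s.quantity
ORDER BY s.col1, n.n
""").fetchall()
print(rows)
```

The numbers table caps the split at 10 rows per record, mirroring the "less than 10" assumption above; extend it (or generate it) if quantities can be larger.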
A fork-join approach would work too.
Split your stream into two. One side runs through an Aggregator to count the rows per key. The two streams are then combined using a Join stage to effect an inner join.
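That fork-join pattern (aggregate on one leg, join the count back on the other) can be sketched outside DataStage like this; the data and names are my own for illustration, not DataStage stage syntax:

```python
from collections import Counter

# Fork-join sketch: one "stream" is aggregated to a per-key row
# count (the Aggregator leg), then inner-joined back to the
# original rows (the Join stage) so every row carries its count.
rows = [("A", 10.0), ("A", 20.0), ("B", 5.0)]

counts = Counter(key for key, _ in rows)   # Aggregator: count per key
joined = [(key, value, counts[key])        # Join: attach count to each row
          for key, value in rows]
print(joined)
```

Once each row knows its key's count, a downstream stage can use that value to drive the split and the Price/Quantity division.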
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.