
How to split a single row of records into two rows of output

Posted: Mon Apr 02, 2007 6:48 pm
by Prabhakar
Hi Gurus,

I am having a problem splitting a single row of records into two output rows.

e.g.:

Input:
CustId  Name  Lastname  Age
1       Prab  Raj       23
2       Sam   Rob       24

Output (two row formats, interleaved by CustId):
CustId  Name  Lastname
CustId  Age

1       Prab  Raj
1       23
2       Sam   Rob
2       24


Can anyone suggest which stage to choose to do this?

Posted: Mon Apr 02, 2007 6:52 pm
by kumar_s
There are several ways of doing it. Use a Transformer stage and extend two links: one with "Cust id, Name, Lastname" and the other with "Cust id, Age".
You can use a Link Collector stage to merge them, or you can write directly to two separate Sequential File stages, one with the Append option. You can sort on Cust id if required.
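For illustration only, the same two-link split can be sketched outside DataStage with awk (the file names here are made up, and the input is assumed to be space-delimited with a header row):

Code:

# Skip the header (NR > 1), then write each input row out twice,
# in the two row formats keyed on CustId.
awk 'NR > 1 {
    print $1, $2, $3 > "names.txt"   # CustId, Name, Lastname
    print $1, $4     > "ages.txt"    # CustId, Age
}' input.txt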

Posted: Mon Apr 02, 2007 7:05 pm
by DSguru2B
Welcome Aboard :)
You can do a similar thing if you're on a PX job, except that you won't have a Link Collector to collect it. In that case, add an after-job subroutine ExecDOS command to concatenate the two files and sort on id.
Something like

Code:

cat file1 file2 | sort > file3
I supplied Unix commands because you are using PX on Windows, which ships with the MKS Toolkit.
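Assuming the id is the first whitespace-delimited field in both files, a key-based numeric sort makes the ordering explicit (a sketch, same hypothetical file names as above):

Code:

# Concatenate both outputs, then sort numerically on the first field (the id).
cat file1 file2 | sort -k1,1n > file3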

Posted: Mon Apr 02, 2007 7:10 pm
by kumar_s
Using the Append option on one of the files will take care of this.

Posted: Mon Apr 02, 2007 7:13 pm
by Prabhakar
But there is no Collector in my DataStage. Which stage should I choose to merge the files? A Sequential File stage can't have multiple inputs.

Posted: Mon Apr 02, 2007 7:19 pm
by chulett
kumar_s wrote: Using the Append option on one of the files will take care of this.
:? Err... two links writing to the same file at the same time, one with the append option? No can do, my friend. If you could manage that in a serial fashion, where the overwrite link completes before the append link starts, only then would that be an appropriate answer.

You may not be able to have 'multiple inputs', but you can write to two separate files and then combine them after the job, or in another job, depending on the nature of the 'combining' that needs doing.

Posted: Mon Apr 02, 2007 7:25 pm
by kumar_s
There is a stage called "Link Collector"; maybe the project you are using is not customized for that stage. You can add it by customizing the palette. But the inputs to this stage must have the same metadata.
So if you are going to use it, you need to make the metadata the same on both links; you may need to concatenate the fields. Or you can use the other approach.
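As a sketch of the "same metadata" idea: both links can be reduced to two columns, CustId plus one concatenated data column, so the rows collect cleanly (awk again, purely illustrative, with the same hypothetical input file):

Code:

# Emit two rows per input record, both with the layout: CustId, Data,
# then sort numerically on CustId to interleave them.
awk 'NR > 1 { print $1, $2 " " $3; print $1, $4 }' input.txt | sort -k1,1n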

Posted: Mon Apr 02, 2007 7:28 pm
by kumar_s
Missed Craig's point. Even a Sequential File stage would need row-level commit. :wink: Should go back to DSguru2B's option.

Posted: Tue Apr 03, 2007 8:18 am
by vipshiva
Use the Pivot stage in Designer.


Regards,
Siva.

Posted: Tue Apr 03, 2007 8:52 am
by prnath78
How about this:

Input -> TX -> (link1: Custid, Name, Lastname) -> |
            -> (link2: Custid, Age)            -> | -> Funnel (on Custid) -> o/p

Thanks & Regards,
Ranga

Posted: Tue Apr 03, 2007 9:25 am
by michaeld
How about using a Copy stage and only selecting the columns that you need for each stream? If you need to filter them later on, then use a Filter stage.

Transformer stages are kind of slow and should be avoided in high volume jobs if possible.

Posted: Tue Apr 03, 2007 10:18 am
by us1aslam1us
prnath78 wrote:
Input -> TX -> (link1: Custid, Name, Lastname) -> |
            -> (link2: Custid, Age)            -> | -> Funnel (on Custid) -> o/p
The Funnel stage needs the same metadata from both streams to work. DSguru2B has given a clear description; follow that post.

Posted: Tue Apr 03, 2007 10:20 am
by DSguru2B
Transformers are not slow, folks. Someone needs to get going, in writing, on busting this myth.

Re: How to split a single row of records into two rows of output

Posted: Tue Apr 03, 2007 11:31 am
by prnath78
us1aslam1us wrote: The Funnel stage needs the same metadata from both streams to work. DSguru2B has given a clear description; follow that post.
Sorry.
I agree with you.

Posted: Tue Apr 03, 2007 9:50 pm
by kumar_s
I tried the following scenario.
Two links from a Transformer, both to the same Sequential File stage, one set to Overwrite and the other to Append. Interestingly, I didn't get any errors or warnings, and the output is as expected!!!
Note: in a Server job.