Split one row to multiple rows

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

venkatesan
Premium Member
Posts: 10
Joined: Sat Dec 25, 2010 11:35 pm
Location: Bangalore

Split one row to multiple rows

Post by venkatesan »

Hi,

I am calling a routine in the Transformer, and the output of the routine is:
a:12,b:123,c:12345,d:9 ..........

The job design is: Source Table --> Transformer --> Hashed File

Once the job has run successfully, the output in the hashed file is:

ColumnA, ColumnB

JOBNAME, a:12,b:123,c:12345,d:9 ..........


Now I want to change the output of the hashed file to:

ColumnA, ColumnB

JOBNAME, a:12
JOBNAME, b:123
JOBNAME, c:12345
JOBNAME, d:9

Can you guys please let me know how to solve this issue.
samyamkrishna
Premium Member
Posts: 258
Joined: Tue Jul 04, 2006 10:35 pm
Location: Toronto

Re: Split one row to multiple rows

Post by samyamkrishna »

You can use a Pivot stage to do this.

In the Derivation field of the Pivot stage's output column, give Col1,Col2,Col3.
svhari76
Participant
Posts: 48
Joined: Tue Nov 16, 2010 5:04 pm

Post by svhari76 »

You can use something like:

Job1:
Source Table --> Transformer --> Sequential File (not a hashed file)

Job2:
Sequential File --> Pivot stage --> Hashed File

You can even combine Job1 and Job2 into a single job.

If your sequential file has something like a:1,b:12,c:123 ... and you know the maximum number of fields, you can use Col1,Col2,Col3,...,Col20 as the derivation in the Pivot stage.

I am not sure how this works for an indefinite number of columns.
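As a minimal sketch of the Transformer derivations that would feed the Pivot stage, assuming the delimited string arrives in an input column named In.RoutineOutput and a maximum of three fields (hypothetical names), each output column picks off one comma-delimited field with the BASIC Field() function:

    Col1:  Field(In.RoutineOutput, ",", 1)
    Col2:  Field(In.RoutineOutput, ",", 2)
    Col3:  Field(In.RoutineOutput, ",", 3)

The Pivot stage's output column would then carry the derivation Col1,Col2,Col3 to turn those columns back into separate rows.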
Hari
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Convert the commas to the @VM character and write the result to a hashed file. When reading back from the same hashed file, normalize on the second column.
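A minimal sketch of the Transformer derivation for the second column, assuming the routine output arrives in an input column named In.RoutineOutput (a hypothetical name):

    ColumnB:  Convert(",", @VM, In.RoutineOutput)

Convert() replaces each comma with a value mark, so the hashed file stores ColumnB as a multivalued field that can then be normalized when it is read back.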
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
venkatesan
Premium Member
Posts: 10
Joined: Sat Dec 25, 2010 11:35 pm
Location: Bangalore

Post by venkatesan »

Thanks DSguru2B,

I have converted the commas using the @VM character and it is giving the total number of records split by the commas. But can you please throw some light on how to normalize on the second column?
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

When you go into the Output tab / Columns tab of the hashed file stage you will see a column called 'Type'. Double-click it and, from the drop-down, choose 'MV'. The 'Normalize On' option, previously greyed out, should now be available. Choose the column that you set to 'MV' from that drop-down as well. Hit View Data; your data should now be normalized.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

FYI... you need to use an 'Account-based' hashed file to have this option, from what I recall.
-craig

"You can never have too many knives" -- Logan Nine Fingers
DSguru2B
Charter Member
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

An externally pathed hashed file works too.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

As long as you create a VOC record for it or even without one?
-craig

"You can never have too many knives" -- Logan Nine Fingers
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Even without one.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Even without one.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

So... even without one?
-craig

"You can never have too many knives" -- Logan Nine Fingers
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

:) That's funny.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
venkatesan
Premium Member
Posts: 10
Joined: Sat Dec 25, 2010 11:35 pm
Location: Bangalore

Post by venkatesan »

DSguru2B wrote: When you go into the Output tab / Columns tab of the hashed file stage you will see a column called 'Type'. Double-click it and, from the drop-down, choose 'MV'. The 'Normalize On' option, previously greyed out, should now be available. Choose the column that you set to 'MV' from that drop-down as well. Hit View Data; your data should now be normalized.
Thanks a lot. It worked :D
venkatesan
Premium Member
Posts: 10
Joined: Sat Dec 25, 2010 11:35 pm
Location: Bangalore

Post by venkatesan »

How can I implement the same logic if the design consists of UniVerse Table --> Transformer --> Target Table?

Here, how do I apply the normalization on the target table?