Multiple Links to Same Sequential File
I am trying to write to a sequential file with multiple links, without success. I am using DataStage 6.0 on AIX.
I have a Transformer stage with two output links to the same sequential file. If I have 15 rows on the input link, I would expect 15 rows for each output link. The objective is to get 30 rows in the sequential file.
The actual result is that only 15 rows show up in the sequential file. The log in Director shows that 15 records went in, and 15 records went out of each output link. But it is as though one link is "clobbering" the other link.
Any ideas?
I have a similar problem: I have a single input file, and for every row on this input file I need to write 6 rows to an output file. These output rows all have the same total length, but different record layouts.
I have played around with various possibilities, none of which work.
Any assistance would be greatly appreciated. I have sent this off to Ascential Tech Support, who are looking at it for me, and I will post back any responses that I receive. In the meantime I started looking for appropriate forums...
You could design a job with one input to the Transformer and 6 outputs to 6 different text files. You would then run an after-job routine that concatenates the 6 files into one. This will mix up the record order a bit.
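Outside DataStage, the effect of that after-job routine can be sketched in plain Python (the file names here are hypothetical stand-ins for the six link outputs):

```python
import shutil

# Hypothetical file names; in the real job these would be the six
# per-link text files produced by the Transformer's output links.
part_files = [f"output_link_{i}.txt" for i in range(1, 7)]
for i, name in enumerate(part_files, start=1):
    with open(name, "w") as f:            # stand-in for the six job outputs
        f.write(f"record from link {i}\n")

# The after-job step: append each part file, in link order, to the target.
target = "combined_output.txt"
with open(target, "wb") as out:
    for name in part_files:
        with open(name, "rb") as part:
            shutil.copyfileobj(part, out)
```

As Vincent notes, concatenating after the fact preserves each link's internal order but interleaves nothing, so records arrive grouped by link rather than by input row.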
Vincent McBurney
Data Integration Services
www.intramatix.com
This is an unusual situation, since Johnno is trying to write out records with different record layouts to the same file. I don't know why he'd be trying to do this, but if the records are different it makes it hard to push them into a hash file.
Vincent McBurney
Data Integration Services
www.intramatix.com
Difficult, but not impossible! :)
If the hashed file is defined as having the key and one field, one can stick anything at all into that field, even if it contains field marks (@FM) so that the hashed file ends up having N fields per row, where N may vary between rows. It's not Codd, folks!
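For anyone outside the DataStage/UniVerse world, Ray's trick can be simulated in Python: treat `chr(254)` as the @FM field mark and pack a variable number of sub-fields into a single data field. The dictionary here is only a stand-in for a hashed file, and all names are illustrative:

```python
# UniVerse/DataStage field mark (@FM) is character 254.
FM = chr(254)

store = {}  # stand-in for a hashed file: key -> one data field

def write_record(key, fields):
    # Pack N sub-fields into the single data field; N may differ per record.
    store[key] = FM.join(str(f) for f in fields)

def read_record(key):
    # Unpack back into a list whose length varies by record.
    return store[key].split(FM)

write_record("A", ["x", "y"])            # 2 fields in this row
write_record("B", ["p", "q", "r", "s"])  # 4 fields in this row
```

Each row ends up with its own field count, which is exactly the non-first-normal-form behaviour Ray describes.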
Ray Wurlod
Education and Consulting Services
ABN 57 092 448 518
Codd, patron saint (as it were) of normal forms.
Thou shalt have no repeating groups, no partial dependencies on the key, no dependencies between attributes, so help you Codd. Also, every row must have an identical structure.
None of that has to apply in hashed files, which is one of the reasons that hashed files are used to store objects in the DataStage repository. How many different kinds of object are there in the DS_JOBOBJECTS hashed file, for example? (Answer: lots!)
Thank you all for your help, and I'm sure over time I will try out all of the suggestions given.
The Ascential Tech Support offered: "...re-engineer the job, this time making use of the Link Partitioner and Link Collector in order to get the desired results."
But after playing around, the simplest solution I found was to write each required output row to an output column and then use the Pivot stage to flip these around to rows.
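As a rough illustration of that Pivot idea (with hypothetical column names, and three pivoted columns rather than six for brevity): each input row first carries its required outputs as separate columns, and the pivot then turns every such column into its own output row:

```python
# Hypothetical input: each row holds its required outputs as columns
# (in the real job there would be six output columns, one per record layout).
input_rows = [
    {"key": 1, "out1": "1-a", "out2": "1-b", "out3": "1-c"},
    {"key": 2, "out1": "2-a", "out2": "2-b", "out3": "2-c"},
]

pivot_columns = ["out1", "out2", "out3"]

# The Pivot stage effect: one output row per pivoted column, per input row.
output_rows = [row[col] for row in input_rows for col in pivot_columns]
```

With 2 input rows and 3 pivoted columns this yields 6 output rows, which is the multiplication John was after, all landing in a single sequential file.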
Hope this may help someone out in the future.
Cheers
John