Dumping Hash File
Moderators: chulett, rschirm, roy
Is it possible to dump the contents of a hash file to a sequential file using scripts (DS BASIC or a UNIX shell script)? What I want to do is something like a database table export. Many thanks.
Sure. I can't supply something already written, but it can certainly be done in DS Basic. Very high level:
Use OPEN and OPENSEQ for the hash file and the flat file, respectively. Loop through the hash file with READNEXT and write to the flat file using WRITESEQ. Also be aware of WEOFSEQ and CLOSESEQ. If you don't have hard-copy manuals, there should be PDF versions of all the manuals (including the BASIC manual) installed with the client software. Also check the online help. [:)]
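Those steps might look roughly like this in DataStage BASIC (a minimal sketch; 'MyHash' and '/tmp/MyHash.dump' are placeholder names, and error handling is kept to a bare minimum):

```basic
* Sketch: dump a hash file to a sequential file.
OPEN 'MyHash' TO HashFile ELSE STOP 'Cannot open hash file'
OPENSEQ '/tmp/MyHash.dump' TO SeqFile ELSE
   CREATE SeqFile ELSE STOP 'Cannot create dump file'
END

SELECT HashFile
LOOP
   READNEXT Key ELSE EXIT          ;* no more keys
   READ Rec FROM HashFile, Key THEN
      * Write key plus record; turn field marks into pipes for readability.
      WRITESEQ Key : '|' : CONVERT(@FM, '|', Rec) TO SeqFile ELSE
         STOP 'Write to dump file failed'
      END
   END
REPEAT

WEOFSEQ SeqFile
CLOSESEQ SeqFile
CLOSE HashFile
```

The CONVERT of @FM to a visible delimiter is optional; leave the record as-is if you intend to reload it into a hash file later.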
-craig
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Thank you, both of you.
What we are trying to do is create some backup points of the data staging area between our programs, so that we can re-run the programs from those backup points in case of failure.
The idea is this: we have hundreds of jobs, all linked by two job sequencers. The two sequencers are executed in sequence, and between them we would like to insert a job that backs up all the intermediate hash files and sequential files, so that if the part 2 sequencer fails, we can restore the backup files and restart part 2 instead of the whole ETL process.
Any idea how this can be achieved?
Many thanks
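For what it's worth, the backup job between the two sequencers could be as simple as an after-job routine that shells out to tar. A rough sketch (the paths and routine name are made up; DSExecute is the DataStage BASIC call for running a UNIX command from a routine):

```basic
* Sketch of an after-job routine that snapshots the staging area
* between the two sequencers. Paths are placeholders.
Cmd = "tar -cf /backup/staging_" : OCONV(@DATE, "D-YMD[4,2,2]")
Cmd = Cmd : ".tar -C /data/staging ."
Call DSExecute("UNIX", Cmd, ShellOutput, ReturnCode)
If ReturnCode <> 0 Then
   Call DSLogFatal("Staging backup failed: " : ShellOutput, "BackupStaging")
End
* Restore is the same idea with tar -xf, run before re-starting part 2.
```

Because tar copies hash files byte-for-byte, this avoids writing a dump job per table definition; the trade-off is that the snapshot is only valid if no job is writing to the staging area while it runs.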
Our concern is performance. Can I say that backing up with scripts (UNIX or UV) may generate less overhead than a DS job?
Another concern is the effort of developing hundreds of jobs that all perform the same task, i.e. copying from a hash file to a sequential file or vice versa, each with a different table definition.
Please advise. Thanks a lot.