Hi,
I have to extract more than 400 million records from an Oracle table to sequential file(s) and FTP the file(s) to a mainframe.
Can someone suggest the most feasible solution?
Thanks
Debasis
The best way would be to extract to multiple files, then pass a cat command as an after-job subroutine to create a single file, and then FTP it. The cat and the FTP can be combined in a simple few-line shell script as well.
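A minimal sketch of such a script, assuming the extract jobs write part files matching extract_part*.dat into /data/extracts; the directory, host name, login, and target dataset name are all placeholders you would replace with your own:

#!/bin/sh
# Concatenate the part files produced by the parallel extract jobs
# into a single sequential file.
EXTRACT_DIR=/data/extracts
OUTFILE=$EXTRACT_DIR/full_extract.dat
cat "$EXTRACT_DIR"/extract_part*.dat > "$OUTFILE"

# Non-interactive FTP push to the mainframe; the user, password,
# host, and 'MAINFRAME.DATASET.NAME' are hypothetical values.
ftp -n mainframe.host.example <<EOF
user ftpuser ftppassword
binary
put $OUTFILE 'MAINFRAME.DATASET.NAME'
bye
EOF

Hook the script in as the after-job subroutine (ExecSH) so it runs once all the extract files have been written.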
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
Hi,
Can you tell me what the maximum size limit is for creating a sequential file through DataStage? Is there any way to increase the sequential file size limit?
Because if I extract the records from the table based on partitions, I expect each file will be more than 3 GB. Is it feasible to create a 3 GB file through DataStage?
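As an aside, the effective cap usually comes from the operating system and filesystem rather than DataStage itself; on a Unix host you can check the per-process file size limit with ulimit (a general OS check, not DataStage-specific):

# Show the per-process maximum file size; 'unlimited' means the OS
# itself will not cap the file (units are blocks and vary by shell).
ulimit -f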