I am processing a file that contains some fields holding packed data. For example, when the file is viewed within DS, the values look like the following: { 5A 00{ M L, whereas the values of these fields should be as follows: 0 51 000 -4 -3. Is there a way within DS of converting these packed fields in...
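Those characters look like trailing-overpunch (zoned decimal) signs rather than true packed decimal: '{' is +0, A to I are +1 to +9, '}' is -0 and J to R are -1 to -9, which would decode { 5A 00{ M L to exactly 0, 51, 000, -4, -3. Purely as a sketch of that decoding outside DS (the function below is illustrative, not a DS routine):

```python
# Hypothetical sketch: decode trailing-overpunch (zoned decimal) values
# such as "{", "5A", "00{", "M", "L"  ->  0, 51, 0, -4, -3.
# Assumes ASCII overpunch: '{'=+0, 'A'-'I'=+1..+9, '}'=-0, 'J'-'R'=-1..-9.

POSITIVE = {'{': 0, **{chr(ord('A') + i): i + 1 for i in range(9)}}
NEGATIVE = {'}': 0, **{chr(ord('J') + i): i + 1 for i in range(9)}}

def decode_overpunch(field: str) -> int:
    """Convert a zoned-decimal string whose last character carries the sign."""
    body, last = field[:-1], field[-1]
    if last in POSITIVE:
        return int(body + str(POSITIVE[last]))
    if last in NEGATIVE:
        return -int(body + str(NEGATIVE[last]))
    return int(field)          # no overpunch: plain numeric text

if __name__ == "__main__":
    for raw in ["{", "5A", "00{", "M", "L"]:
        print(raw, "->", decode_overpunch(raw))
```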
I have a Master Job Sequence that calls other smaller sequences within the Job. The Master Job Sequence has the "Add checkpoints so sequence is restartable on failure" tick box ticked (as do all the smaller sequences). When the job starts, 3 sequences and a cleanse job all kick off at the ...
The majority of our DS Jobs are run via a Unix Script which calls the DS Sequence and runs the jobs. Occasionally, and particularly on Load Jobs, should the job abort due to hitting 50 warnings, the error is not always relayed back to the Script and, as such, on our scheduling program, the job shows as compl...
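One way to make the abort visible to the scheduler is to check the status that dsjob reports when invoked with -run -jobstatus. The wrapper below is only a sketch, and the exit-code mapping it assumes (1 = finished OK, 2 = finished with warnings, anything else = failure) should be verified against the dsjob documentation for your release:

```python
# Hedged sketch: wrap the dsjob call so an aborted job (e.g. one that hit the
# warning limit) is reported back to the calling script as a failure.
# Assumption to verify locally: with "-run -jobstatus", dsjob waits for the
# job and sets its own exit status from the job's finishing status.
import subprocess
import sys

def run_ds_job(project: str, job: str) -> int:
    result = subprocess.run(
        ["dsjob", "-run", "-jobstatus", project, job],
        capture_output=True, text=True,
    )
    print(result.stdout)
    # Treat anything other than "OK" or "finished with warnings" as a failure
    # so the Unix script (and the scheduler behind it) sees a non-zero exit.
    if result.returncode not in (1, 2):
        print(f"Job {job} did not finish cleanly (dsjob status {result.returncode})",
              file=sys.stderr)
        return 1
    return 0

if __name__ == "__main__":
    sys.exit(run_ds_job("MYPROJECT", "LoadJob"))  # hypothetical project/job names
```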
I am currently working on a job where I have a list of IDs that relate to certain bank accounts. From these IDs I am attempting to obtain, via a lookup, the Customer IDs of the account holder. For sole accounts this is okay, as there is a 1:1 relationship. However, for joint accounts, there will b...
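For illustration only (the account and customer IDs below are hypothetical), this is the one-to-many expansion the joint-account case needs: each driving Account ID fans out to one output row per matching Customer ID, which a single-row lookup cannot produce on its own:

```python
# Minimal sketch of the one-to-many expansion described above: a joint
# account maps one Account ID to several Customer IDs.
from collections import defaultdict

# Reference data: (account_id, customer_id) pairs; joint accounts repeat the key.
reference = [("ACC1", "CUST10"), ("ACC2", "CUST20"), ("ACC2", "CUST21")]

customers_by_account = defaultdict(list)
for account_id, customer_id in reference:
    customers_by_account[account_id].append(customer_id)

# Driving rows: one per account; emit one output row per matching customer.
for account_id in ["ACC1", "ACC2"]:
    for customer_id in customers_by_account.get(account_id, ["<no match>"]):
        print(account_id, customer_id)
```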
I have created a hash file of 2 columns containing different IDs. The first column contains Key IDs and the second column contains the subsidiary IDs. So, for example, there could be several occurrences of an ID in Column 1, but all the IDs in Column 2 would be unique.
Col1, Col2
1,3
1,4
1,5
2,6
2,...
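A small sketch of the behaviour in question, using hypothetical data and assuming Column 1 is the hashed file key: repeated keys overwrite one another, so only one subsidiary ID per key survives unless the subsidiary IDs are first collapsed into a single record per key:

```python
# Keyed writes vs. grouped writes for the Col1/Col2 data shown above
# (the second "2" row is hypothetical, since the post is truncated).
rows = [(1, 3), (1, 4), (1, 5), (2, 6), (2, 7)]

# Keyed write: later rows with the same key overwrite earlier ones.
keyed = {}
for col1, col2 in rows:
    keyed[col1] = col2
print(keyed)                      # {1: 5, 2: 7}

# Alternative: concatenate the subsidiary IDs so nothing is lost.
grouped = {}
for col1, col2 in rows:
    grouped.setdefault(col1, []).append(str(col2))
print({k: ",".join(v) for k, v in grouped.items()})   # {1: '3,4,5', 2: '6,7'}
```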
I have a Sequential file that consists of a Branch Number followed by 10 occurrences of an Account Number (if there are not 10 valid Account Numbers then the remaining ones are set to 00000000), all related to the initial Branch Number.
4001000016644800154568001422680000000000000000000000000000000000...
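As a rough sketch of flattening one of these records outside DS (the branch and account field widths below are assumptions, and the sample record is reconstructed; substitute the real layout), each populated 8-character slot becomes its own (branch, account) row and the 00000000 fillers are dropped:

```python
# Hedged sketch: normalise "branch + 10 account slots" into one row per account.
BRANCH_LEN = 4          # assumed branch width
ACCOUNT_LEN = 8         # assumed account width
FILLER = "0" * ACCOUNT_LEN

def split_record(record: str):
    branch = record[:BRANCH_LEN]
    accounts = [
        record[i:i + ACCOUNT_LEN]
        for i in range(BRANCH_LEN, BRANCH_LEN + 10 * ACCOUNT_LEN, ACCOUNT_LEN)
    ]
    # Emit one (branch, account) pair per populated slot, skipping fillers.
    return [(branch, acct) for acct in accounts if acct != FILLER]

if __name__ == "__main__":
    sample = "4001" + "00016644" + "80015456" + "80014226" + FILLER * 7
    for pair in split_record(sample):
        print(pair)
```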
You might be getting a timeout error in your job; DataStage really doesn't care about the number of rows. Is your "tmp" file a named pipe or a real disk file? About how long does this job run? What are your DB/2 timeouts? Arnd
It is a real disk file and the job runs for about 50 mins. The...
I am running a Job in DS using a flat file as the source file containing around 8 million rows. One of the fields is used as a Natural Key passed into a Shared Container to do a lookup and retrieve the corresponding Surrogate Key. When I run the job, the 8 million rows run through the initial transf...
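Purely as an illustration of what the lookup does per row (names and data are hypothetical, and the new-key handling may differ from how the shared container behaves): with the reference keyed in memory, each natural key costs one dictionary probe, and unseen keys are shown being assigned the next surrogate value:

```python
# Illustrative sketch of a natural-key -> surrogate-key lookup over many rows.
surrogate_by_natural = {"CUST-0001": 1, "CUST-0002": 2}   # pre-loaded reference
next_surrogate = max(surrogate_by_natural.values(), default=0) + 1

def lookup_surrogate(natural_key: str) -> int:
    """Return the existing surrogate key, or assign a new one for unseen keys."""
    global next_surrogate
    if natural_key not in surrogate_by_natural:
        surrogate_by_natural[natural_key] = next_surrogate
        next_surrogate += 1
    return surrogate_by_natural[natural_key]

for row_key in ["CUST-0001", "CUST-0003", "CUST-0002"]:
    print(row_key, "->", lookup_surrogate(row_key))
```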