Search found 64 matches

by SonShe
Fri Jun 24, 2005 3:13 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Trapping job abort in batch control
Replies: 1
Views: 1020

Trapping job abort in batch control

I have several sub batches (not sequencers) and one master batch. When a job aborts, the master batch still continues to run. How do I trap the abort status of the jobs in the master batch? I would appreciate your help.

Thanks.
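For anyone searching this topic later: a minimal DataStage BASIC sketch of one way a master batch can trap a sub-batch abort, assuming the standard job-control functions (the batch and job names here are placeholders, not the actual jobs):

   * Attach the sub batch, run it, wait for it, then check its finishing status
   hJob = DSAttachJob("SubBatch1", DSJ.ERRFATAL)
   ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
   ErrCode = DSWaitForJob(hJob)
   Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
   If Status = DSJS.RUNFAILED Or Status = DSJS.CRASHED Then
      * Logging a fatal message makes the master batch itself abort
      Call DSLogFatal("SubBatch1 aborted - stopping master batch", "MasterBatch")
   End
   ErrCode = DSDetachJob(hJob)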
by SonShe
Fri May 13, 2005 9:45 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Clearing hash file with CLEAR.FILE command
Replies: 6
Views: 1983

Thanks again, Sainath. I liked the idea!
by SonShe
Fri May 13, 2005 8:00 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Clearing hash file with CLEAR.FILE command
Replies: 6
Views: 1983

Sainath, thanks for the reply. There are three input links writing data to the hash file. That is why I clear the file using the CLEAR.FILE command before the job starts. I believe in this case I cannot check the clear-file box in the hash stage. Please clarify.
by SonShe
Fri May 13, 2005 7:58 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: pass values to job parameters in a sequencer
Replies: 16
Views: 6170

If you write a job that reads from Oracle and writes to either a hash file or a sequential file, both are easy to read or write in BASIC from either a routine or a batch job. I usually start jobs from batch jobs rather than routines. As to the routine returning start and end dates as one field ...
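For anyone following the thread, a small sketch of the kind of BASIC read this describes (the hashed file name, key and field layout are made up for illustration):

   * Open the hashed file (it needs a VOC entry) and read one record by key
   Open "PARAM_DATES" To DatesFile Else
      Call DSLogFatal("Cannot open hashed file PARAM_DATES", "GetDates")
   End
   Read Rec From DatesFile, "EXTRACT_WINDOW" Then
      StartDate = Rec<1>   ;* field 1 of the record
      EndDate = Rec<2>     ;* field 2 of the record
   End Else
      StartDate = "" ; EndDate = ""
   End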
by SonShe
Fri May 13, 2005 7:07 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Clearing hash file with CLEAR.FILE command
Replies: 6
Views: 1983

Clearing hash file with CLEAR.FILE command

I want to clear a hash file with a before-job command "CLEAR.FILE hashfilename". When the hash file was created in the default project directory, this worked. However, when the hash file is created in a separate directory other than the default project directory, it looks l...
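If the cause is what it looks like, CLEAR.FILE only recognises files that have a VOC pointer in the project, which a hashed file created in an external directory does not get automatically. A possible sequence, sketched with a placeholder path, is to create the pointer once with SETFILE and then clear through it:

   SETFILE /some/other/directory/hashfilename hashfilename OVERWRITING
   CLEAR.FILE hashfilename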
by SonShe
Thu May 05, 2005 7:27 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: pass values to job parameters in a sequencer
Replies: 16
Views: 6170

Thanks, Sainath. I thought of doing that, but I am using those values as part of the names of the sequential files that I build. I don't know how or where I can parse the delimited string to use in creating the file name.

Thanks.
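One possible approach, sketched with made-up names: split the delimited value with the BASIC Field() function at the point where the file name is built, for example in a routine or a Transformer derivation:

   * pDates might arrive from the sequence as e.g. "20050501,20050513"
   StartDate = Field(pDates, ",", 1)
   EndDate = Field(pDates, ",", 2)
   FileName = "extract_" : StartDate : "_" : EndDate : ".dat"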
by SonShe
Thu May 05, 2005 6:57 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: pass values to job parameters in a sequencer
Replies: 16
Views: 6170

Kim, thanks for the reply. I definitely have a few challenges here, like figuring out how to access a DB table from a routine, how to write to a hash file, etc. But I believe there are enough postings in this forum to give me the idea. However, I still can't understand how I can set the valu...
by SonShe
Thu May 05, 2005 6:17 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: pass values to job parameters in a sequencer
Replies: 16
Views: 6170

Describe how you do this in a shell script or post the code. Kim, in the shell script I select a date from an Oracle table and store it in a variable. The second value is another date that is either the current date or a date supplied by the user, stored in a Unix file. I capture this d...
by SonShe
Tue May 03, 2005 1:53 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: pass values to job parameters in a sequencer
Replies: 16
Views: 6170

pass values to job parameters in a sequencer

I have a sequencer with a few job parameters. Three of them I need to pass after determining their values in a shell script. I can do that; however, my boss does not want me to use a script for this. I don't know much BASIC programming. I would appreciate any ideas or code from anyone. Thanks.
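For reference, a minimal sketch of a controlling batch job that passes derived values into the sequencer through DSSetParam (the job and parameter names are illustrative only, and the derivation of StartDate and EndDate is assumed to happen earlier in the batch):

   hJob = DSAttachJob("MySequencer", DSJ.ERRFATAL)
   ErrCode = DSSetParam(hJob, "pStartDate", StartDate)
   ErrCode = DSSetParam(hJob, "pEndDate", EndDate)
   ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
   ErrCode = DSWaitForJob(hJob)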
by SonShe
Wed Apr 20, 2005 1:46 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Very Slow performance - Table to table data transfer
Replies: 2
Views: 900

Very Slow performance - Table to table data transfer

I have a job that moves data from one Oracle table to another Oracle table. The DataStage server and the Oracle server are on physically different machines. I am getting a speed of only 100 rows per second even though it is a simple transfer of data. I have OCI -> Transformer -> OCI stages. The transaction size has be...
by SonShe
Tue Mar 29, 2005 6:18 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: How to run jobs in a sequencer based on input value
Replies: 4
Views: 782

Hi, the starting point should be a Nested Condition stage with triggers that start the job activities depending on the conditions you mention. IHTH. Thanks for the reply. I could develop this using a Nested Condition and control the running of the independent jobs. However, if I have two jobs - the first on...
by SonShe
Mon Mar 28, 2005 10:40 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: How to run jobs in a sequencer based on input value
Replies: 4
Views: 782

How to run jobs in a sequencer based on input value

I have a sequencer with a few jobs in it. The jobs are not dependent on each other. I would like to run only some of the jobs in the sequencer. For example, out of jobs A, B, C, and D I want only A and C to run, based on an input parameter value. Can I do something like this? If so, I would apprecia...
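One common pattern, sketched with a made-up parameter name: give the sequence a parameter such as pJobList (e.g. "A,C"), route it through a Nested Condition stage, and let each output link's trigger expression fire only when its job appears in the list:

   Index(pJobList, "A", 1) > 0   ;* trigger for the link that runs job A
   Index(pJobList, "C", 1) > 0   ;* trigger for the link that runs job C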
by SonShe
Tue Feb 22, 2005 10:47 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Can't run any job
Replies: 3
Views: 1046

It may help to delete the hashed file completely and then to re-create it. The aborted process may still be holding locks. Possibly the quickest "fix" is to recycle the DataStage services (shut down and re-start). Obviously no jobs can be running when you do this, but if you can't start a...
by SonShe
Tue Feb 22, 2005 6:28 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Can't run any job
Replies: 3
Views: 1046

Can't run any job

Can anyone please help me here? A few hours ago a long-running job aborted with the "ds_uvput() - failed to write to hash file" error message. I searched the forum for possible reasons for the ds_uvput() message, and sure enough we had filled our hash file with more than 2 GB of data. Later I cleared t...
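In case the 2 GB figure is the clue it appears to be: 32-bit hashed files stop at roughly 2 GB, so apart from clearing or re-creating the file, a hashed file that genuinely needs to hold that much data is usually converted to 64-bit addressing, e.g. (file name is a placeholder):

   RESIZE hashfilename * * * 64BIT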
by SonShe
Sat Feb 12, 2005 4:06 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Performance while reading from hash file
Replies: 1
Views: 505

Performance while reading from hash file

I have a job that has been designed as below: Oracle Stage ---> Transformer ---> Transformer ---> Transformer ---> HashFile ---> Transformer ---> Transformer ---> Oracle Stage. These various transformers have look-ups from hash files. When I run the job, data moves from the first Oracle stage...