I tried both options -- UNIX and UV...
Neither is working.
Search found 25 matches
- Wed Jan 07, 2009 7:56 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Sub Routine
- Replies: 4
- Views: 1774
- Wed Jan 07, 2009 7:44 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Sub Routine
- Replies: 4
- Views: 1774
Sub Routine
Hi All, I have a subroutine that retrieves data from one table and inserts it into another, along with other job-statistics info. When I use it as an after-job subroutine, the job executes successfully, but the data is not inserted. The routine used is: $INCLUDE DSINCLUDE JOBCONTROL.H E...
- Tue Jan 06, 2009 12:07 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: release job locks from datastage administrator
- Replies: 6
- Views: 11282
Unlocking jobs from DataStage Administrator
To get the list of jobs in use, select the project, choose the Command option, and type:
LIST.READU
This gives the inode of the process I want to clear. Then unlock the job using:
UNLOCK INODE #node_number# ALL
- Mon Dec 01, 2008 6:29 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Hash File error
- Replies: 2
- Views: 947
Hash File error
Hi All,
I am trying to write data to a hash file. It successfully writes the data, but when I try to view the data it throws an error: "Warning: Data that exceeds the maximum display length has been truncated." Can anyone help me resolve this issue?
Thanks
Harish
- Mon Feb 04, 2008 3:59 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: CONVERSION
- Replies: 6
- Views: 2223
- Mon Feb 04, 2008 2:57 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Bulk loading of data from source to target
- Replies: 2
- Views: 831
Bulk loading of data from source to target
I have the source data, and I have to load the complete data from source to target using the bulk load (manual). Every time we load, the previous data should be deleted before the complete new data is inserted. Using the manual bulk-load strategy, I have to delete and then insert the data into an Oracle table. For this I...
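The full-refresh pattern described in this post (wipe the previous load, then insert the complete new set) can be sketched generically. This is only an illustration, not the poster's actual job: Python's built-in sqlite3 stands in for the Oracle target, and the table and column names are made up.

```python
import sqlite3

# In-memory SQLite database standing in for the Oracle target (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")
conn.execute("INSERT INTO target VALUES (1, 'previous load')")

def full_refresh(conn, rows):
    """Delete the previous load, then insert the complete new snapshot.

    Running both statements in one transaction means a failed insert
    rolls back the delete as well, so the target is never left empty.
    """
    with conn:  # sqlite3 commits on success, rolls back on exception
        conn.execute("DELETE FROM target")
        conn.executemany("INSERT INTO target VALUES (?, ?)", rows)

full_refresh(conn, [(1, "new"), (2, "new")])
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # prints 2
```

With a real Oracle target, TRUNCATE is usually preferred over DELETE for a full refresh, but TRUNCATE is DDL and cannot be rolled back in the same transaction as the load.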
- Thu Jan 31, 2008 12:30 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Bulk loading of data from source to target
- Replies: 3
- Views: 1198
Bulk loading of data from source to target
I have the source data, and I have to load the complete data from source to target using the bulk load (manual). Every time we load, the previous data should be deleted before the complete new data is inserted. Using the manual bulk-load strategy, I have to delete and then insert the data into an Oracle table. For this I...
- Tue Jan 29, 2008 7:43 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: bulk loading
- Replies: 5
- Views: 1765
- Tue Jan 29, 2008 7:03 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: bulk loading
- Replies: 5
- Views: 1765
bulk loading
Which bulk load performs better:
1. Oracle 7 Load
2. Oracle OCI Load
I have to delete the table contents and insert the records every time, and with option 2 we don't have the option to delete the table in Oracle.
- Tue Dec 18, 2007 6:43 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Performance improvement needed in DataStage 7.x
- Replies: 4
- Views: 1354
Performance improvement needed in DataStage 7.x
We are updating 8 lakh (800,000) records based on 4 keys. We are also using a lookup and then updating the target. So far the records are updating at a rate of 13 rows/sec. To increase performance we created an index on those 4 keys, yet performance is still too slow (13 rows/sec)....
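To put the reported throughput in perspective, a quick back-of-the-envelope calculation (using only the figures quoted in the post above) shows why 13 rows/sec is unworkable at this volume:

```python
rows = 800_000   # 8 lakh records to update
rate = 13        # observed throughput, rows per second

seconds = rows / rate
hours = seconds / 3600
print(f"{hours:.1f} hours")  # roughly 17.1 hours for a single run
```

At that rate a single load takes most of a day, which is why posters on such threads typically look beyond indexing, at things like array size, transaction size, or switching from per-row updates to a bulk approach.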