Search found 153 matches
- Tue Apr 04, 2017 3:06 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to reject records having Extra Fields in Seq file stage
- Replies: 6
- Views: 6807
If you configure the Sequential File stage with the properties below, and set everything else to the defaults or however your file needs them, any record with more (or fewer) than the specified number of columns will be rejected. And you can actually capture these rejected records in a reject file with er...
- Wed Sep 29, 2010 1:49 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Update SQL
- Replies: 12
- Views: 8219
Re: Update SQL
I had pasted the same query in the Target stage (Update SQL in upsert mode); the job finished but the records are not updated. Perhaps you can try running the script in an Enterprise stage as a target, in sequential mode, using the delete method. In the same situation it worked for me. Use a Row Generator stage a...
- Mon Feb 04, 2008 11:38 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Unable to start ORCHESTRATE Process
- Replies: 8
- Views: 16479
Re: Unable to start ORCHESTRATE Process
3 streams in the same job? That means an increased number of operators. As the number of operators per job increases, the amount of scratch memory/buffer that must be allotted for the job also increases, since each operator requires its own memory pool. Each operator shares its operation among nodes. ...
Re: Document
The documentation that comes with the software pretty much covers all you need, except how to organize your projects, configure file management, and create categories and jobs. The "Install and Upgrade Guide" (a PDF file installed as part of the software installation) has all the documentation needed to a...
- Wed Dec 26, 2007 5:20 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Regarding Environmental variable
- Replies: 8
- Views: 3702
The name of the variable is $APT_NO_JOBMON. Why would anyone want to move this environment variable? It pertains to the job monitoring service, with which one can control the performance statistics in Director and Designer. There is supposed to be a Java application ca...
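For illustration only (a sketch, assuming you want to suppress the job monitor; the value shown and the idea of placing it in the project's dsenv file are assumptions, not from the post), the variable is exported like any other environment variable:

```shell
# Setting APT_NO_JOBMON suppresses Job Monitor statistics for parallel jobs.
# The value "1" is an assumption for illustration; any non-empty value is
# typically treated as "set".
export APT_NO_JOBMON=1
```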
- Wed Dec 26, 2007 5:02 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Multiple commands in Execute command stage
- Replies: 2
- Views: 5936
Re: Multiple commands in Execute command stage
You can have a small shell script that executes those multiple UNIX commands. It seems like you are trying to print the file size with "grep #TargetPath#/TargetFileName# | awk '{print $5}'". Did you mean doing a listing with the -l argument and then searching for the file, like below? ls -l | gre...
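A runnable sketch of that idea (the file name `myfile.txt` is a placeholder assumption, not from the original post): long-list the directory, filter for the file, and let awk print the fifth field, which is the size in bytes.

```shell
# Long listing, filtered to one file; field 5 of `ls -l` output is the
# file size in bytes. "myfile.txt" is a placeholder name for illustration.
ls -l | grep 'myfile.txt' | awk '{print $5}'
```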
- Thu Dec 13, 2007 2:10 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: routine inactive
- Replies: 2
- Views: 3001
Re: routine inactive
Not sure about the user hierarchy in version 8.0, but usually you have three types of users: Developer, Manager and Operator. What type of user are you? In other words, does the "user name" you are using to access DS have developer privileges? If it is set up as the Operator type, the...
- Thu Dec 13, 2007 1:55 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Common Module
- Replies: 3
- Views: 1942
Re: Common Module
The problem we face is that multiple sequences may call the same program simultaneously, and DS doesn't like this. Does anyone have suggestions on how we may accomplish this without creating multiple duplicate programs? It seems odd that DS doesn't handle reusable modules too well. The special pr...
- Wed Oct 31, 2007 5:48 pm
- Forum: General
- Topic: how can i do it ,when import the table definitions
- Replies: 1
- Views: 1752
Re: how can i do it ,when import the table definitions
Do a search with the key phrase "Unable to initialize plug-in".
- Wed Oct 31, 2007 5:36 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: IPC STAGE AND LINK PARTITIONER COLLECTOR
- Replies: 1
- Views: 1836
Re: IPC STAGE AND LINK PARTITIONER COLLECTOR
Check out the post below.
http://dsxchange.com/viewtopic.php?t=10 ... e0b3f57959
And also try to find Ray's white paper.
- Wed Oct 31, 2007 4:19 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: ODBC - Delete failed with SET option error
- Replies: 3
- Views: 2852
Is it working now?
Are you using user defined SQL?
Take a look at this post and see if it can be of any help.
http://dsxchange.com/viewtopic.php?t=11 ... e0b3f57959
- Wed Oct 31, 2007 3:37 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: filetransfer using server job
- Replies: 4
- Views: 2792
- Mon Oct 29, 2007 3:33 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: Source Loading Stage DB2 Load or DB2 API or DSDB2
- Replies: 1
- Views: 1705
Re: Source Loading Stage DB2 Load or DB2 API or DSDB2
You cannot use the DB2/UDB Load stage as a source. It is a bulk-load utility provided for UDB and can only be used as a target. The DB2/UDB API stage, also known as the DB2 Plug-in stage, is used for reading and writing data from or into DB2. The API stage should do well for reading or writing 1 million rows. But certainl...
- Mon Oct 29, 2007 3:09 pm
- Forum: IBM® InfoSphere DataStage Server Edition
- Topic: filetransfer using server job
- Replies: 4
- Views: 2792
Re: filetransfer using server job
The FTP Plug-in stage does the job.
I would use a DS job for this kind of task only if other DS functionality (transformations) is needed as part of the file transfer.
But if you are looking only for file transfer on UNIX, secure copy (SCP, the remote file copy program) is more reliable and faster as well.
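As a hedged sketch of the SCP alternative (the helper name `build_scp_cmd`, the host, and the paths below are hypothetical, purely for illustration), you could assemble the command in a small shell function so a wrapper script can log or inspect it before running:

```shell
# Build an scp command line for a given local file and remote destination.
# -p preserves modification times and modes. The function only assembles the
# command string; "etluser", "remote.example.com" and the paths are
# hypothetical placeholders.
build_scp_cmd() {
  src="$1"
  dest="$2"
  printf 'scp -p %s %s' "$src" "$dest"
}

# Example: prints the command that would copy the file if executed.
build_scp_cmd '/data/out/target.dat' 'etluser@remote.example.com:/incoming/'
```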
- Fri Oct 26, 2007 3:01 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Writing data into Teradata table
- Replies: 10
- Views: 5520
It has nothing to do with Teradata. Yeah, I know. Project-level parameters do not work in all stages. Go figure. That's not true. They do work in all stages. Resolving project defaults ($PROJDEF) or environment ($ENV) variables in jobs has nothing to do with the actual databases. As long as you...