Search found 459 matches

by T42
Wed Nov 17, 2004 7:43 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: reading the records from the sorted dataset using transformer
Replies: 1
Views: 952

One suggestion:

Run a before-job routine that does a "head -200 #inputfile# > #outputfile#"

Read the output file in the main stream.
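
If it helps, a rough sketch of that before-job call (ExecSH is the built-in shell-execution routine; #inputfile# and #outputfile# stand for your own job parameters):

    Before-job subroutine:  ExecSH
    Input value:            head -200 #inputfile# > #outputfile#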
by T42
Wed Nov 17, 2004 7:24 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Performance issue with Funnel Stage
Replies: 2
Views: 1821

Contact Ascential Support and inform them of this problem. There have been several patches and fixes for this stage throughout EE 7.x, and you may have to upgrade to get past these bottlenecks.

You can always append a target file with more data, just not at the same time.
by T42
Wed Nov 17, 2004 7:17 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Error in preserve partitioning when Joiner Stage Used
Replies: 3
Views: 2136

The framework actually inserts partitioning tasks where necessary, or as set by default. Read the online help for each stage, and use $APT_DUMP_SCORE to observe this behavior. The information it shows is a bit hard to read, though.
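
For example, a minimal way to turn the score dump on from the shell (it can equally be added as a job-level environment variable parameter set to True):

    export APT_DUMP_SCORE=1    # writes the score (operators, partitioners, inserted sorts) to the job log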
by T42
Mon Nov 15, 2004 11:33 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: What is scratchdisk?
Replies: 7
Views: 3729

Is that the sound of your fingers creaking, Ray? :lol:

Bear in mind -- "scratchdisk" (as seen in your configuration file) is totally separate from "disk" -- Datasets disks, which are used exclusively for... Datasets!
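
A minimal sketch of a one-node configuration file showing the two resources side by side -- "disk" is where persistent Datasets live, "scratchdisk" is temporary space for sorts and buffering (names and paths are just placeholders):

    {
      node "node1"
      {
        fastname "yourserver"
        pools ""
        resource disk "/data/datasets" {pools ""}
        resource scratchdisk "/data/scratch" {pools ""}
      }
    }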
by T42
Mon Nov 15, 2004 11:25 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Error while executing parallel job
Replies: 9
Views: 6302

Behind the scenes, when DataStage compiles a transformer stage, it actually uses your specified C++ compiler. /opt/SUNWspro/bin/CC -KPIC -O -I/opt/dsadm/Ascential/DataStage/PXEngine/include -dalign -O -PIC -library=iostream -c If your paths are wrong, then your compiler simply will not work, and ther...
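
A quick sanity check of those paths from the DataStage user's shell, reusing the compiler path from the example above ($APT_COMPILER and $APT_COMPILEOPT are the environment variables DataStage reads for this):

    ls -l /opt/SUNWspro/bin/CC       # does the compiler exist at the expected path?
    /opt/SUNWspro/bin/CC -V          # and does it run for this user?
    echo $APT_COMPILER $APT_COMPILEOPT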
by T42
Mon Nov 15, 2004 11:20 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Loading Data into a PX Job from Oracle
Replies: 1
Views: 1395

Re: Loading Data into a PX Job from Oracle

Oracle_Enterprise_8: Access to sys.dba_extents required but not available. Please see your dba for select privileges

Do exactly what it suggested. Go to your DBA. Tell them, "HEY! Stop restricting my account so tightly, and let me have access to sys.dba_* tables. Kthxbaibai." DataStage PX...
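
For the DBA, the request boils down to something like this (run from a privileged account; "dsuser" is only a placeholder for your DataStage login):

    sqlplus "/ as sysdba"
    GRANT SELECT ON sys.dba_extents TO dsuser;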
by T42
Mon Nov 15, 2004 11:13 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Job parameters to BuildOP
Replies: 2
Views: 1687

Yes, you can use the job's parameters within the buildop. Think of the buildop as a function call. You need to set up the stage parameters on the buildop stage to obtain values from those job parameters. See the following pages in the Parallel Job Guide (my copy is 7.0.1, so yours may be different) -...
by T42
Mon Nov 15, 2004 11:07 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: FTP connection
Replies: 8
Views: 3976

Did you attempt to run a very simple 1-column file using the FTP stage with source from the same location? Have you been able to FTP to it directly from the DataStage server successfully (i.e. using the right paths, the same username, and so forth)? Have you ever been successful using a diff...
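
In other words, take DataStage out of the picture first and test from the server's command line with the same credentials (host, path and file name below are placeholders):

    ftp remote.host.com          (log in as the same user the FTP stage uses)
    cd /path/the/stage/points/to
    ls
    get somefile.txt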
by T42
Mon Nov 15, 2004 11:03 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Connection problem!
Replies: 5
Views: 3359

Are you using DB2? If so, there is a known issue with DB2 libraries using the same name as your DataStage's libraries. In setting up your paths, make sure that your DataStage paths come FIRST before anything else. Especially in "bin/dsenv". Bounce the server, and see if this problem conti...
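
A rough sketch of that ordering inside dsenv (the variable name depends on your platform -- LIBPATH on AIX, SHLIB_PATH on HP-UX, LD_LIBRARY_PATH elsewhere -- and the DB2 instance path is a placeholder):

    LD_LIBRARY_PATH=$DSHOME/lib:/home/db2inst1/sqllib/lib:$LD_LIBRARY_PATH
    export LD_LIBRARY_PATH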
by T42
Mon Nov 15, 2004 10:11 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Error Message
Replies: 6
Views: 2486

You can also use the Administrator's option to limit the number/days of job runs, and allow DataStage to automatically purge old log files for you every time you run a job.

You will get an informative message that you can safely ignore.
by T42
Mon Nov 15, 2004 10:02 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Defining Constraints in Transformer Stage
Replies: 13
Views: 15996

This is yet another perfect example of why it is important to remember: No matter what we use, it must be PRECISE. If you want a flexible rule-based comparison, check out QualityStage. DataStage is superior when it is dealing with precise data comparison. Leading/trailing spaces, extra whitespaces, ...
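
As a concrete illustration, a comparison in a constraint usually needs to normalize both sides first (the link and column names here are made up):

    Trim(UpCase(in_link.STATUS_CODE)) = Trim(UpCase(in_link.EXPECTED_CODE))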
by T42
Mon Nov 15, 2004 9:42 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Exception handling
Replies: 4
Views: 3256

It really depends on what you are looking for. Are there any specific situations you wish to find answers for? We could easily answer those types of questions for you. For general purposes, you are welcome to read the DataStage Parallel Job Guide PDF file which comes with your installation of DataStage.
by T42
Fri Nov 12, 2004 8:35 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Oracle 8i and 9i clients on same DS sever?
Replies: 10
Views: 3518

Wooooooooooookay.

*chuckles* And I thought it was a nice way to say, "upgrade! Avoid the evil feds!"
by T42
Fri Nov 12, 2004 8:31 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SQL0911
Replies: 13
Views: 7362

Be careful. Configuring db2nodes.cfg will take effect across the entire server. Standard practice is to include the $APT_CONFIG_FILE parameter on every job to allow for granularity in controlling the number of nodes. One job flow may work best with 4 nodes and 2 DB2 nodes. Another job flow may not eve...
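
For instance, keep a small family of configuration files and point each job at the one it needs via the parameter (file names and paths are placeholders):

    Job parameter:  $APT_CONFIG_FILE
    Job A value:    /opt/dsadm/Ascential/DataStage/Configurations/4node_2db2.apt
    Job B value:    /opt/dsadm/Ascential/DataStage/Configurations/1node.apt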
by T42
Fri Nov 12, 2004 8:11 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Parallel job reports failure
Replies: 8
Views: 6091

Hi All, I am facing a strange error which is aborting my jobs. The log file shows "Parallel job reports failure (code Internal error - missing script file RT_SC35/OshExecuter.sh)". I am running a few jobs in sequence and sometimes this error occurs and the job aborts. If you rerun the job again...