Search found 89 matches
- Fri Dec 21, 2012 4:06 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Data getting truncated in DB2 Connector
- Replies: 1
- Views: 2072
Data getting truncated in DB2 Connector
Hi, I have a job that runs a query on DB2 and dumps the data into a dataset. One of the columns has data like this - " abc & def " - and when extracting, DataStage is truncating the data after the & character. Another strange thing is that it is not always truncating the data that come...
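A quick, tool-agnostic way to confirm the symptom described above (a hedged sketch - the function and sample values are illustrative and not part of the DB2 Connector API) is to compare the source value with the extracted value and flag rows where the extract is exactly the source cut off at the first & character:

```python
def truncated_at_amp(source: str, extracted: str) -> bool:
    """True if `extracted` equals `source` cut off at the first '&'."""
    amp = source.find("&")
    # If there is no '&' in the source, this failure mode cannot apply.
    return amp != -1 and extracted == source[:amp].rstrip()

# The value " abc & def " coming back as " abc" would be flagged:
print(truncated_at_amp(" abc & def ", " abc"))         # True
print(truncated_at_amp(" abc & def ", " abc & def "))  # False
```

Running a check like this over a dump of source and target rows would show whether the truncation is really tied to the & character, or to something else such as a column-length limit.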
- Tue Jun 19, 2012 1:41 pm
- Forum: General
- Topic: Server job not running through job sequence
- Replies: 2
- Views: 2008
Server job not running through job sequence
I have a server job that gets a list of file names from a given directory and writes them to a hashed file. This job works fine when run individually. But when I run it through a job sequence, it runs and finishes but does not create any records in the hashed file. Immediately, if I just kick of...
- Tue Mar 20, 2012 12:25 pm
- Forum: IBM<sup>®</sup> SOA Editions (Formerly RTI Services)
- Topic: WS Transformer failure
- Replies: 5
- Views: 6468
One useful debugging trick is to try it in a Server Job and turn on tracing (at the run dialog there is a trace option; select your stage and then click all four options on the right). You will get a whole lot of log data, and at the center will be the XML SOAP request... and the XML SOAP ...
- Sun Mar 18, 2012 11:50 am
- Forum: IBM<sup>®</sup> SOA Editions (Formerly RTI Services)
- Topic: WS Transformer failure
- Replies: 5
- Views: 6468
Ernie, no, this WS Transform has not exactly worked so far. I am still developing this job. When I was passing fewer columns and tried to capture the reply from the WS Transform in a file, it seemed like it was working, except that it was throwing an error message in the XML itself, but a...
- Fri Mar 16, 2012 4:23 pm
- Forum: IBM<sup>®</sup> SOA Editions (Formerly RTI Services)
- Topic: WS Transformer failure
- Replies: 5
- Views: 6468
WS Transformer failure
I have a job that takes XML input and invokes a web service. The job has been aborting with the below message. I made sure that the XML is OK and was able to run it through SoapUI to get a reply from the same web service. Just the web service transformer is not liking the input and not giving much infor...
- Fri Sep 09, 2011 12:44 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Parallel job reports failure (code 139) from Sequence
- Replies: 2
- Views: 3674
Parallel job reports failure (code 139) from Sequence
Hi, I am getting the "Parallel job reports failure (code 139)" error, with the message: 30571 Segmentation fault (core dumped) $APT_ORCHHOME/bin/osh "$@" -f $oshscript >$oshpipe 2>&1. My job runs fine individually, but it's giving me this error when I try to run the job within a job sequence. An...
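Some context on the error above: "code 139" is the shell exit status of the osh process, not a DataStage-specific code. 139 = 128 + 11, meaning the process was killed by signal 11 (SIGSEGV), which matches the "Segmentation fault (core dumped)" text in the log. A minimal shell demonstration:

```shell
# A process killed by SIGSEGV exits with status 128 + 11 = 139.
sh -c 'kill -s SEGV $$'
echo "exit status: $?"   # prints: exit status: 139

# Look up which signal corresponds to number 11:
kill -l 11               # prints: SEGV
```

So the sequence itself is usually not the culprit; it is simply reporting that the child osh process crashed.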
- Fri Jan 28, 2011 1:04 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Reading Multiple DataSets using File Pattern
- Replies: 5
- Views: 6639
- Thu Jan 27, 2011 10:55 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Reading Multiple DataSets using File Pattern
- Replies: 5
- Views: 6639
Reading Multiple DataSets using File Pattern
Hi, is there a way to read multiple DataSets with a similar pattern in a single stage? For example, I have multiple datasets with the same metadata and the same partitioning: TestData_1.ds TestData_2.ds TestData_3.ds. I would want to pick up all these datasets as TestData_*.ds, like we can do for flat files. I don...
- Thu Jan 27, 2011 10:30 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Multiple Jobs Reading a single dataset concurrently
- Replies: 7
- Views: 7024
Wrong. Separate images are loaded for each link. So if you have three links referring to the same Data Set, you get three copies of it in memory. Note that using shared memory for Entire partitioning in an SMP architecture applies to each reference link - you still get one copy (in shared memory th...
- Thu Jan 27, 2011 4:24 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Multiple Jobs Reading a single dataset concurrently
- Replies: 7
- Views: 7024
No, but Job2 and Job3 can concurrently read the Data Set created by Job1. There's no limit until you run out of memory - each loads the Data Set into virtual memory when it's on a reference input link to a Lookup stage. What happens if it is a Merge or Join stage I am using for looking up? In this c...
- Wed Jan 26, 2011 11:23 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Multiple Jobs Reading a single dataset concurrently
- Replies: 7
- Views: 7024
Multiple Jobs Reading a single dataset concurrently
Hi, I am working on a job design which looks like below. Job 1 loads dataset A; Job 2 reads dataset A and does a lookup against input data 1; Job 3 reads dataset A and does a lookup against input data 2. I am planning to run Job1 first and then run Job2 and Job3 in parallel. My question is: Ca...
- Wed Jan 12, 2011 6:50 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Lookup stage performance Versus Merge stage performance
- Replies: 5
- Views: 3836
What is the slow part of your job, the lookup/merge or parsing the XML file? If you have the new XML assembly that can be added to DataStage 8.5, you should get massive XML processing improvements. If you are parsing it using a Sequential File stage, you could try multiple readers. Throughput after ...
- Wed Jan 12, 2011 5:14 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Lookup stage performance Versus Merge stage performance
- Replies: 5
- Views: 3836
Lookup stage performance Versus Merge stage performance
Hi, I have a job which parses an XML file and looks up against a dataset (a table dump); if the keys exist it returns the key, and if not it generates a new key and writes to a dataset. My data volumes are really huge, so once the lookup dataset got close to 3.5 GB the job was failing due to l...
- Thu Dec 16, 2010 5:24 pm
- Forum: General
- Topic: Sequencer ALL or Any - I need both
- Replies: 11
- Views: 7729
Here is the approach I have now:

JobAct1 -----> JobAct2 -----> NestCondition -----> Seq1
                                                   Seq2 --> JobAct3 --> JA4
                                                   Seq3

Seq1, Seq2 and Seq3 get kicked off from the NestCondition, and after the completion of ANY or ALL of them, JobAct3 and JA4 should get kicked off. And sometimes only Seq1 may run but n...
- Thu Dec 16, 2010 11:36 am
- Forum: General
- Topic: Sequencer ALL or Any - I need both
- Replies: 11
- Views: 7729