Search found 8 matches
- Thu Jul 16, 2009 7:30 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Want an XML format from a Sequential file
- Replies: 2
- Views: 1329
Want an XML format from a Sequential file
Empno,Empname,Place,Phone_number
1,kumar,bangalore,456
1,kumar,tirupati,789
2,krishna,mumbai,111
2,krishna,chennai,222
I have a requirement where one person has multiple addresses and phone numbers, and I want to display them in XML as below:
1 -- kumar -- bangalore tirupati 456 789
2 -- krishna -- mumbai chenn...
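Outside of DataStage, the grouping the poster describes can be prototyped in a few lines of Python. This is a minimal sketch; the element and attribute names (`employees`, `employee`, `place`, `phone`) are illustrative assumptions, since the target XML schema is not shown in the post:

```python
import csv
import io
import xml.etree.ElementTree as ET

data = """Empno,Empname,Place,Phone_number
1,kumar,bangalore,456
1,kumar,tirupati,789
2,krishna,mumbai,111
2,krishna,chennai,222
"""

# Group the flat rows by employee number so each <employee>
# collects all of that person's places and phone numbers.
employees = {}
for row in csv.DictReader(io.StringIO(data)):
    emp = employees.setdefault(
        row["Empno"], {"name": row["Empname"], "places": [], "phones": []}
    )
    emp["places"].append(row["Place"])
    emp["phones"].append(row["Phone_number"])

# Emit one <employee> element per group (names are hypothetical).
root = ET.Element("employees")
for empno, emp in employees.items():
    e = ET.SubElement(root, "employee", empno=empno, name=emp["name"])
    for place in emp["places"]:
        ET.SubElement(e, "place").text = place
    for phone in emp["phones"]:
        ET.SubElement(e, "phone").text = phone

print(ET.tostring(root, encoding="unicode"))
```

Within DataStage itself the same shape is usually produced by sorting on the key and letting the XML output stage group repeating elements, but the sketch above shows the intended grouping independently of any stage configuration.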
- Wed May 27, 2009 5:34 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Want to add hours to Timestamp in Parallel Job
- Replies: 2
- Views: 1524
Want to add hours to Timestamp in Parallel Job
Hi all,
I have a requirement like this:
Input is like 2009-07-04 03:34:00.000
I want output like 2009-07-04 15:34:00.000
I want to add 12 hours to get the output.
Thanks in advance.....
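In a parallel job this would normally be done with the Transformer's timestamp arithmetic functions; as a language-neutral sanity check of the expected result, the shift can be sketched in Python:

```python
from datetime import datetime, timedelta

# Format matching the DataStage-style timestamp in the post,
# with a fractional-seconds component.
FMT = "%Y-%m-%d %H:%M:%S.%f"

def add_hours(ts: str, hours: int) -> str:
    """Parse the timestamp string, shift it by `hours`, re-format it."""
    shifted = datetime.strptime(ts, FMT) + timedelta(hours=hours)
    # strftime's %f yields microseconds (6 digits); trim to milliseconds.
    return shifted.strftime(FMT)[:-3]

print(add_hours("2009-07-04 03:34:00.000", 12))  # 2009-07-04 15:34:00.000
```

Note that a shift of 12 hours can roll the date forward past midnight, which `timedelta` handles automatically; any hand-rolled hour arithmetic in a Transformer derivation needs to account for the same rollover.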
- Tue Feb 03, 2009 12:21 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Reading error but job finished successfully
- Replies: 4
- Views: 2749
Hi, thanks for the reply. Actually there is a record in the dataset and I'm able to view it as well.......... Hi.. There was an issue with the DS 7.x versions where it doesn't abort the job whenever there is a warning for update statements. If a row is not present in the database and there is an update statement to ...
- Mon Feb 02, 2009 5:36 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Reading error but job finished successfully
- Replies: 4
- Views: 2749
Reading error but job finished successfully
Hi all, I'm getting an error while reading the source dataset and updating records in a table. The job shows Finished status, but no records show in the Designer and the table is not being updated either. In the Director log there are no warnings or fatal errors..... It shows some info like the below: Contents ...
- Mon Jan 05, 2009 5:02 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Message queue stage issue
- Replies: 3
- Views: 1892
Hi Ernie, thanks for the reply. Actually we are using Message Broker to generate segmented messages. So my question is: can DataStage recognize a segmented message, or differentiate these messages? Or is there any option so that DataStage processes only one segment per run (the second should not be ....) ...
- Tue Dec 30, 2008 6:07 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Message queue stage issue
- Replies: 3
- Views: 1892
Message queue stage issue
Hi all, we are using the Message Queue stage (parallel job, DataStage) for receiving messages from a Java application... Our jobs should run daily to process one file.... or one segmentation... This segmentation is split into multiple messages.... If, for example, it gets two segments... then in this scenario ...
- Mon Dec 15, 2008 5:46 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Splitting single column values
- Replies: 9
- Views: 3359
The 2nd column may contain a varying number of values, as follows.
Sample input:
100|toys,car,bus
200|hat,soap,kite,lux
300|jugs,mug
Sample output should be like this:
100,toys
100,car
100,bus
200,hat
200,soap
200,kite
200,lux
300,jugs
300,mug
So I think DataStage can't handle this, right?........... so I'm wo...
- Fri Dec 12, 2008 8:34 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Splitting single column values
- Replies: 9
- Views: 3359
Splitting single column values
Hi everybody... Please help me regarding this issue ASAP. We have a requirement ... the source file contains one column with an id (primary key) and another (2nd) column which has one or more values like (toys,car,bus). Here the comma is not a field delimiter; the field delimiter is '|'. For example 100...
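The one-to-many fan-out the poster wants (one output row per comma-separated value) is simple to prototype outside DataStage. A minimal Python sketch, assuming '|' as the field delimiter as stated in the post:

```python
# Sample rows from the post: '|' separates the fields,
# commas separate the values inside the second field.
rows = [
    "100|toys,car,bus",
    "200|hat,soap,kite,lux",
    "300|jugs,mug",
]

# Split on '|' first, then emit one output row per comma value,
# repeating the key for each.
output = []
for line in rows:
    key, values = line.split("|", 1)
    for value in values.split(","):
        output.append(f"{key},{value}")

print("\n".join(output))
```

The point of the sketch is that the logic is just a nested split with the key repeated, so it does not depend on knowing the number of values per row in advance, which is exactly the "dynamic" part the poster was worried about.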