Search found 37 matches

by ramakrishna459
Wed Apr 15, 2015 11:50 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to convert date in Datastage internal format in Parallel
Replies: 4
Views: 4865

As part of an ETL migration we tried to generate the server-job date format from a parallel job. But we couldn't accomplish it in the parallel job, so we kept using the server job.
by ramakrishna459
Fri Feb 13, 2015 7:42 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to convert date in Datastage internal format in Parallel
Replies: 4
Views: 4865

How to convert date in Datastage internal format in Parallel

Hi All,

In server jobs, if I give @DATE as a derivation, it converts to the DataStage internal format.

As below:

2015-02-13 --> 17211

How do I achieve the same in parallel jobs?

Thanks
Ramakrishna
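[In parallel jobs the transformer function DaysSinceFromDate(InputDate, "1967-12-31") produces the same number, since the internal format counts days from 1967-12-31 (day 0). A quick sanity check of that mapping, sketched in Python rather than DataStage:]

```python
from datetime import date

# DataStage's internal date format counts days from 1967-12-31 (day 0),
# so 2015-02-13 maps to 17211 as in the example above.
DS_EPOCH = date(1967, 12, 31)

def to_internal(d: date) -> int:
    """Days elapsed since the DataStage day-zero date."""
    return (d - DS_EPOCH).days

print(to_internal(date(2015, 2, 13)))  # 17211
```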
by ramakrishna459
Thu Apr 03, 2014 10:41 am
Forum: General
Topic: Subroutine failed to complete successfully (30107)
Replies: 7
Views: 4975

Windows update was not done. Other software was installed, like a PDF converter. The antivirus used is McAfee; both DataStage and the antivirus are on the same OS (operating system). Shall I remove the antivirus and try? I don't have any system restore points to restore back to. Please share your thoughts. Thanks Ramakris...
by ramakrishna459
Thu Apr 03, 2014 10:32 am
Forum: General
Topic: Subroutine failed to complete successfully (30107)
Replies: 7
Views: 4975

Getting the same error message even when trying to log in through the Administrator client.
by ramakrishna459
Mon Mar 24, 2014 3:26 am
Forum: General
Topic: Subroutine failed to complete successfully (30107)
Replies: 7
Views: 4975

But for a couple of months I hadn't faced any issue; I was able to log in to DataStage, design jobs, and run them. Now it is giving the above error message.
by ramakrishna459
Sun Mar 23, 2014 12:03 pm
Forum: General
Topic: Subroutine failed to complete successfully (30107)
Replies: 7
Views: 4975

Subroutine failed to complete successfully (30107)

I have DataStage installed on my Windows 7 PC. I get the below error message when I try to log in to DataStage Designer/Director: "Subroutine failed to complete successfully (30107)". I have followed the steps provided in the IBM support link http://www-01.ibm.com...
by ramakrishna459
Sun Mar 23, 2014 12:01 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Delimited file parallel extraction.
Replies: 2
Views: 2164

Delimited file parallel extraction.

Can we extract a delimited text file using the Sequential File stage in parallel mode?
I know we can extract fixed-width files. If we can't extract a delimited file in parallel, what constraints don't allow it to be extracted in parallel?

Thanks
Rk
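[The underlying constraint is that a fixed-width file has record boundaries at predictable byte offsets, so each parallel reader can seek straight to its own partition; in a delimited file a reader cannot know where a record starts without scanning for delimiters. An illustrative sketch of the fixed-width case in Python (not the Sequential File stage's actual implementation; the record length is an example value):]

```python
import os

RECORD_LEN = 10  # fixed record length in bytes (hypothetical example)

def partition_offsets(path: str, readers: int):
    """Byte offsets where each parallel reader could start.

    Only possible because every record is exactly RECORD_LEN bytes;
    with a delimited file there is no formula for these offsets.
    """
    total_records = os.path.getsize(path) // RECORD_LEN
    per_reader = total_records // readers
    return [i * per_reader * RECORD_LEN for i in range(readers)]
```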
by ramakrishna459
Thu Nov 21, 2013 12:59 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DB2 Z/OS Bulk load using datastage Connector stage
Replies: 9
Views: 8725

I have one question... In the Connector stage, for the bulk load operation, we have properties such as Discard dataset, Error dataset, Map dataset, Work1 dataset and Work2 dataset. I don't understand their purpose, so I have left them blank. Do they create the datasets on the mainframe and then load to the datab...
by ramakrishna459
Thu Nov 21, 2013 12:47 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DB2 Z/OS Bulk load using datastage Connector stage
Replies: 9
Views: 8725

I have given the FTP IP address of the DB2 database. Do we need to give the IP address of the mainframe or of the DB2 database? Could you please elaborate on the above comments so that I can describe this properly to the admin?
Thanks for your contribution.
by ramakrishna459
Thu Nov 21, 2013 10:55 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DB2 Z/OS Bulk load using datastage Connector stage
Replies: 9
Views: 8725

Below are the options we enabled:
Write mode: Bulk load
Bulk load to DB2 on z/OS: Yes
Load method: MVS dataset(s)
Transfer type: FTP
Transfer to: given the IP address of the mainframe DB2, user ID, password
For the below options we haven't given anything: Data file attribute, Input data files, filedi...
by ramakrishna459
Thu Nov 21, 2013 10:46 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DB2 Z/OS Bulk load using datastage Connector stage
Replies: 9
Views: 8725

Please find below the error message we are receiving:

Transfer to dataset JDBCBDW.IN00000 failed with error: 550-SVC99 RETURN CODE=4 S99INFO=0 S99ERROR=1224 HEX=04C8 S99ERSN code X'00000FD6'. 550 Unable to create data set JDBCBDW.IN00000 for STOR command. cat: 0652-054 Cannot write to output. There ...
by ramakrishna459
Thu Nov 21, 2013 8:46 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DB2 Z/OS Bulk load using datastage Connector stage
Replies: 9
Views: 8725

Re: DB2 Z/OS Bulk load using datastage Connector stage

The DB2 is on a z/OS system. We tried the Connector stage with bulk load, and we gave the FTP IP address, user ID and password, but the job aborted.
by ramakrishna459
Thu Nov 21, 2013 12:19 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DB2 Z/OS Bulk load using datastage Connector stage
Replies: 9
Views: 8725

DB2 Z/OS Bulk load using datastage Connector stage

Hi ALL,

How do we achieve a DB2 bulk load using the DB2 Connector stage? There are many options available, so which exactly should we select? Please share your thoughts.
by ramakrishna459
Wed Oct 16, 2013 5:22 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Need to Generate below output using transformer stage.
Replies: 2
Views: 1748

We accomplished it with the help of stage variables, looping, and sorted data.
by ramakrishna459
Sun Oct 13, 2013 7:25 am
Forum: General
Topic: Need to check if two files exist then only process?
Replies: 6
Views: 4094

We created a Unix shell script and it worked out. Thanks to all.
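[The poster's shell script isn't shown; a minimal equivalent of "process only if both files exist", sketched here in Python, with hypothetical placeholder paths:]

```python
import os

def both_exist(file_a: str, file_b: str) -> bool:
    """True only when both input files are present as regular files."""
    return os.path.isfile(file_a) and os.path.isfile(file_b)

# Placeholder paths for illustration only
if both_exist("/data/in/file1.txt", "/data/in/file2.txt"):
    print("both files present; start processing")
else:
    print("missing input file(s); skip processing")
```

[In a before-job subroutine or sequence, the same check could gate the downstream job run on the script's exit status.]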