Migration through COBOL code and DataStage PX
Moderators: chulett, rschirm, roy
Hi all,
Please let me know metrics comparing migration through COBOL code versus through DataStage Parallel Extender.
We are doing some POC work for a client; our requirement is an IMS to DB2 migration.
Regards,
Mohan
Mohan, the question is rather vague, so with regard to metrics I would say that COBOL is a 13 while DataStage is a 23.
Seriously, what are you looking for? Total effort required for each? Shortest project runtime? Least resources? Best looking code?
Is the DB2 target on the host or on a UNIX machine? Does the site have IMS and Cobol experts or DataStage people?
-
- Participant
- Posts: 3593
- Joined: Thu Jan 23, 2003 5:25 pm
- Location: Australia, Melbourne
- Contact:
I'm assuming you are decommissioning an IMS database and moving to DB2 on the same mainframe and you want to know how to handle the data conversion. DataStage is going to be a much faster development tool and DataStage developers are usually easier to find than Cobol programmers - though maybe not in your home town! Costing depends a lot on what architecture you can go with.
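Whichever tool does the work, the core of the data conversion is flattening hierarchical IMS segments into relational rows for DB2, with parent keys propagated down to child segments. A minimal Python sketch of that idea (the segment and field names here are invented for illustration, not from any real IMS schema):

```python
# Flatten a hierarchical (IMS-style) record tree into relational rows.
# Segment and field names are invented for illustration only.

def flatten(customer):
    """Emit one row per child ORDER segment, carrying the parent key down."""
    rows = []
    for order in customer.get("orders", []):       # child segments
        rows.append({
            "cust_id": customer["cust_id"],        # parent key propagated
            "order_id": order["order_id"],
            "amount": order["amount"],
        })
    return rows

root = {"cust_id": "C001",
        "orders": [{"order_id": "O1", "amount": 100},
                   {"order_id": "O2", "amount": 250}]}

rows = flatten(root)
for r in rows:
    print(r)
```

The same parent-key-propagation pattern applies however deep the hierarchy goes; each segment level becomes a table keyed back to its parent.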
DataStage MVS, for example, has a pretty big up-front license cost and ongoing MIPS-based licensing for all COBOL programs.
Information Server for zLinux, however, runs in a partition on the mainframe with much cheaper processor-based licensing, and since it runs on the mainframe you can keep your data there during the conversion.
Another factor is longevity. Do you want to run Information Server over a 6-12 month period to just cover the migration? If so you want to know if IBM can give you cheaper short term licensing. If you want to use your Information Server post migration for interfaces and reporting then this spreads the up front license costs out across more projects.
Most ROI calculations from ETL vendors will tell you a GUI ETL tool will be anywhere from twice as fast to ten times as fast as manual coding and the tens of thousands of ETL tool customers around the world back that up. This speeds up just the coding section - not the analysis and design or testing. You can add FastTrack and Information Analyzer to try and cut down on the costs of those efforts.
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn:Vincent McBurney LinkedIn
Not sure if this is what you are looking for or not, but here goes.
We had programs that were created by PRISM software. If anyone remembers this tool, it generated COBOL code that ran on Unix, Windows, Linux. We ran it on AIX using MicroFocus COBOL.
I am not a mainframer, but many of my coworkers are and they said the code is very similar to what they would use on the mainframe and ran in much the same fashion.
Jobs that used to take hours via COBOL were taking 15-20 minutes in DataStage. One of the biggest benefits is that it is no longer a serial process. Everything is streamed and runs in parallel.
Hope this helps.
Brad.
It is not that I am addicted to coffee, it's just that I need it to survive.
-
- Participant
- Posts: 3593
- Joined: Thu Jan 23, 2003 5:25 pm
- Location: Australia, Melbourne
- Contact:
Is that the tool that became known as DataStage MVS? It runs DataStage on a Unix/Windows server using a custom palette of stages and generates COBOL code, then transfers this to the mainframe and compiles it. It makes development a lot faster but has a bigger price tag than the average DataStage product.
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn:Vincent McBurney LinkedIn
I am not sure. I know that there is some historical connection between PRISM and Ascential. When we used it, PRISM created COBOL code to be run on Unix, but I suppose it could also have gone on the mainframe in which case who knows. Maybe it is the PRISM engine that generates the DataStage MVS COBOL code.
Inquiring minds want to know. Who is the resident historian on DSXChange? Ray?
Brad.
It is not that I am addicted to coffee, it's just that I need it to survive.
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Yes, PRISM came "into the fold" when VMARK and UniData merged to form Ardent, PRISM (more formally PRISM Warehouse Executive, sometimes PWE, sometimes pronounced "pee wee") having been acquired earlier than that by UniData.
Much of the PRISM architecture - especially the idea of JCL Templates - underpins how DataStage mainframe jobs (currently marketed as Enterprise MVS Edition) create the JCL and COBOL that is transferred to the mainframe. Even without tweaking the JCL Templates, the generated code is surprisingly good. And the mainframe can "parallelize" the processing if desired.
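The JCL Template idea can be pictured as parameterized text substitution: a skeleton job deck that the tool fills in per generated program. A hypothetical Python sketch of the concept follows; the template text and field names are invented for illustration and are not DataStage's actual template format:

```python
# Hypothetical illustration of the JCL Template concept: a parameterized
# skeleton that a code generator fills in per job. The template text and
# placeholder names are invented, not DataStage's real template format.
from string import Template

jcl_template = Template("""\
//${JOBNAME} JOB (ACCT),'${DESC}',CLASS=A,MSGCLASS=X
//STEP1    EXEC PGM=${PROGRAM}
//SYSOUT   DD SYSOUT=*
//INFILE   DD DSN=${INPUT_DSN},DISP=SHR
""")

jcl = jcl_template.substitute(
    JOBNAME="DSJOB01",
    DESC="IMS TO DB2 LOAD",
    PROGRAM="DSCOB01",
    INPUT_DSN="PROD.IMS.EXTRACT",
)
print(jcl)
```

Tweaking the template changes every generated job at once, which is why the template approach scales well for large migration projects.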
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Mohandl - the numbers are absolutely meaningless; you asked for metrics without detailing what you wanted, so I did a typical consultant type of thing and gave you metrics.
The "best looking code" part doesn't have much meaning, either.
First of all, you need to tell the forum if you are doing a Mainframe to Mainframe conversion or to another platform. Also what skill sets you have and also if you have already purchased software (i.e. does a DS license need to be acquired or is it already there, for "free").
The question you have asked is similar to "Is a Ferrari or Suburban better?".
-
- Participant
- Posts: 3593
- Joined: Thu Jan 23, 2003 5:25 pm
- Location: Australia, Melbourne
- Contact:
Mohan, the "metrics" that you keep asking for are what you are supposed to be putting together, not us. You know the coding standards, resources and methodologies of the customer; we have given you the architecture options. I suspect you are not the right person to document the ROI comparisons of each architecture option, but all we can do is give you what we know. Your local IBM software office might be able to give you some comparisons between DataStage and manual coding, but they are likely to be weighted in favour of DataStage.
A lot depends on what resources you have available, what level of executive support you can get for each approach, how your project is budgeted, what price discounting IBM can offer, what the strategic data integration strategy is, what the downstream interface requirements are, what other source and target databases may be involved, what existing vendor relationships the company has, what ETL tools are already in the company, and so on.
I suggest you find the right enterprise architect within the company who can answer some of these questions.
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn:Vincent McBurney LinkedIn
-
- Premium Member
- Posts: 783
- Joined: Mon Jan 16, 2006 10:17 pm
- Location: Sydney, Australia
Which platform on the host side? I'm sorry, but this forum may not be the best place for quick answers to global questions; you will need to do some work on your side of the question, and probably ask the client. That being said, we are ready to answer specific TECHNICAL questions about issues that we have encountered during our careers or are aware of from having read the DOCUMENTATION (most of which is freely downloadable).
DB2 target on the host side.