Migrating DataStage resources...a sort of poll
Doing some research on migration methodologies. When you are moving from dev to test to production, what are your experiences with design-only .dsx's versus .dsx's that contain all the compiled information? (Or what other methods are you employing to migrate your jobs and other artifacts?)
Do you only use one vs the other? Why or why not?
Thanks!
Ernie
Ernie Ostic
blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>
Ernie,
A bit of both. Using .dsx's with compiled code will work only if the two environments are absolutely identical, i.e. all relevant paths to executables, projects, libraries, system components, DB components, etc.
Since this is not always the case, I tend to prefer non-executable deployments with a complete compile using the multiple job compiler. If that complete project compile produces any errors, I know something has gone wrong.
Funny you should post this: I am in the middle of doing a bi-weekly deployment from ClearCase on 5 machines, and each deployment takes about 2 hours (the smallest system is a new AIX box with only 12 CPUs...).
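To automate the "any errors after a complete project compile" check described above, something like the sketch below could scan the compiler's report. The report format assumed here (one job per line with a "Failed" status word) is an assumption, not the documented output of the multiple job compiler; adjust the pattern to whatever your tool actually writes:

```python
import re

def count_compile_failures(report_text):
    """Count jobs flagged as failed in a compile report.

    Assumes one job per line carrying a status word such as
    'Compiled OK' or 'Failed' -- a hypothetical format; verify
    against the report your compiler tool actually produces.
    """
    failures = [line for line in report_text.splitlines()
                if re.search(r'\bFailed\b', line)]
    return len(failures)

# Demo against a made-up two-line report:
sample = "JobA  Compiled OK\nJobB  Failed"
print(count_compile_failures(sample))  # -> 1
```

A non-zero count would then fail the deployment step, matching the "more than 0 errors means something has gone wrong" rule.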
My choice: export minus executables, compile in new environment. We're fairly rigorous about recording and picking up dependencies, but the compile picks up any that are missed (for example there's no mechanism within the product to record dependent Transforms).
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Hey Ernie,
Currently using Version Control (VC) exclusively for migration from dev -> qa -> ua -> prod, and we tell VC to compile after promoting the code.
In preparation for 8.x and no VC (sigh), we are developing a workflow using CVS for source control, moving and managing only code, no executables. Code is compiled on the target host after the export is imported there.
Seldom, if ever, have I seen a client move the executables, or recommended that they do.
-Craig
I agree: move and then compile all. It tells you whether anything is wrong and sets the new set of parameter default values. If you modify your job parameter default values during the migration (e.g. via search and replace), the compile-all approach helps set those new default values. Not so much of an issue in version 8 with parameter sets.
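The search-and-replace step on parameter defaults could be scripted along these lines. The `Default "value"` line format is an assumption about how defaults appear in a .dsx export (the exact property name can differ between versions); verify it against a real export before relying on it:

```python
import re

def retarget_defaults(dsx_text, replacements):
    """Rewrite job-parameter default values in a .dsx export.

    Assumes defaults appear on lines like: Default "value"
    (a hypothetical format -- check your own export).
    `replacements` maps old default strings to new ones,
    e.g. dev paths to their test-environment equivalents.
    """
    def swap(match):
        value = match.group(1)
        return 'Default "%s"' % replacements.get(value, value)
    return re.sub(r'Default "([^"]*)"', swap, dsx_text)

dsx = 'Default "/data/dev/in"\nDefault "/data/dev/out"'
# Only the first default is remapped; unmatched values pass through.
print(retarget_defaults(dsx, {"/data/dev/in": "/data/test/in"}))
```

Compiling everything afterwards, as suggested, is what actually bakes the edited defaults into the jobs.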
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn:Vincent McBurney LinkedIn
These are the answers I was expecting....thank you for the confirmation everyone. The research is being done for ISD, which also has a model of design export or design + executables, and while the concept is a bit different (in terms of expectations for J2EE Application Server management), the variables and pitfalls are largely the same (for keeping things identical). It's a bit of an apples to oranges comparison, but your input illustrates real "practical" field experience.
Thanks!
Ernie
Ernie Ostic
blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>
Ernie
We have a DataStage admin team. We scripted a solution using DOS batch files which does almost all of what VC does. You create a list of jobs to migrate; these are stored in a folder with the same name as the project. You edit one batch file to set the command-line options for the from and to server names, user names, and project names. You run the script. It exports each job to a DSX. A Perl script strips out the dependent objects like shared containers. The new DSX is imported into To_Project. The job is compiled. A record is inserted into a SQL Server table as an audit trail. The process is repeated for all jobs in the list. The whole process is logged, so you can review the log file afterwards. All files are archived at the beginning of the next migration.
At the end, the DSX files are copied into a folder controlled by CVS, and all of them are checked into CVS.
BlueCross would probably share this script with IBM if you are interested. They will not let me post it on my web site, but they might if IBM asked them to. It works very well and has been in use for many months now.
Mamu Kim
hehe
Perl??? Did someone say Perl??
;-P
Bestest!
John Miceli
System Specialist, MCP, MCDBA
Berkley Technology Services
"Good Morning. This is God. I will be handling all your problems today. I will not need your help. So have a great day!"
Version 8.1 has just been released for Windows and it comes with the Information Server Manager - a deployment tool that can interface with common source control tools. It has a GUI but all functions can also be activated from the command line or from a script.
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn:Vincent McBurney LinkedIn
Are any of those scripting objects documented anywhere, Vincent? I have been having horrid performance using the v8 GUI. I even boosted my local machine to 4 GB RAM and installed the client locally to get better performance than running through Citrix like we normally do. I saw some improvement, but not where it counts most. For example, compiling under v7.5.1a would take about 2 seconds per job; under v8.0.1 it is around a minute per job. I have 1300 jobs for this project. I am currently trying to figure out a command-line script to run the compile with, to see if it makes a difference.
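For the timing comparison, a small wrapper like the sketch below could record per-job compile times from the command line. The actual compile command is left as a placeholder, since the batch-compile tool and its flags vary by client version; check your install for the real invocation:

```python
import subprocess
import sys
import time

def timed_run(cmd):
    """Run a command and return (elapsed_seconds, return_code).

    Substitute your real per-job compile command for `cmd`; the
    trivial command in the demo below is a stand-in so the
    harness itself can be exercised anywhere.
    """
    start = time.monotonic()
    rc = subprocess.call(cmd)
    return time.monotonic() - start, rc

# Demo with a do-nothing command instead of a real compile:
elapsed, rc = timed_run([sys.executable, "-c", "pass"])
print("rc=%d elapsed=%.2fs" % (rc, elapsed))
```

Logging one line per job this way over 1300 jobs would show quickly whether the command-line path beats the one-minute-per-job GUI compiles.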
Bestest!
John Miceli
System Specialist, MCP, MCDBA
Berkley Technology Services
"Good Morning. This is God. I will be handling all your problems today. I will not need your help. So have a great day!"
I saw some odd compile behaviour on version 8.0.1 on AIX. Sometimes compiles took a few seconds, sometimes a couple of minutes. Most other actions were fast; it only seemed to be the compiles that were dodgy. I think it was something to do with the compile engine and jobs with Transformers in them. I would open a case with IBM support, as your compiles shouldn't take that long.
Certus Solutions
Blog: Tooling Around in the InfoSphere
Twitter: @vmcburney
LinkedIn:Vincent McBurney LinkedIn