Search found 50 matches

by Thomas.B
Mon Oct 12, 2015 8:56 am
Forum: General
Topic: Deploy Only 1 Job From a Package
Replies: 2
Views: 2499

Re: Deploy Only 1 Job From a Package

The only option is "Replace Existing Items" but this does not help me in this scenario Actually that can help you: just delete the job you want to restore then uncheck that box when you import the full archive and the information server manager will ignore all your jobs except the one you...
by Thomas.B
Fri Oct 09, 2015 4:59 am
Forum: General
Topic: Does TortoiseSVN work with IBM InfoSphere DataStage?
Replies: 2
Views: 3107

by Thomas.B
Wed Oct 07, 2015 8:43 am
Forum: General
Topic: How to add or change short description of Multiple jobs.
Replies: 11
Views: 4951

The "Description" line must be added under the job name, on the DSRECORD token, not on the DSSUBRECORD.
by Thomas.B
Fri Oct 02, 2015 9:21 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading multiple files with same metadata from a list
Replies: 16
Views: 16696

You can do it that way: create a job that loads one sequential file to a table, and set the file property of the input stage to a job parameter. Create a text file that lists every file you have to load. Create a sequence job like that: Execute Command --► Start Loop Activity ------► Job Activity, looping back ...
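A minimal sketch of how that loop is usually wired up; the activity names, the list-file path and the job parameter name below are assumptions, not from the original post:

Execute_Command : command = cat /path/to/file_list.txt (one file name per line)
Start_Loop : list loop, delimited values = Execute_Command.$CommandOutput (the command output's field marks may need converting to the delimiter you pick)
Job_Activity : set the load job's file-name parameter to Start_Loop.$Counter
End_Loop : linked back to Start_Loop so the load job runs once per listed file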
by Thomas.B
Wed Sep 30, 2015 2:34 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Rejecting bad records from sequential file
Replies: 4
Views: 2663

If your job fails when a bad record is read by the Sequential File stage, the 'Reject mode' option is set to 'Fail'.
Just change it to 'Continue', or to 'Output' if you want to redirect the bad records to a reject link, and your job will process only the good records.
by Thomas.B
Mon Sep 28, 2015 4:30 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Split the Input Source into Multiple Datasets
Replies: 2
Views: 2643

For me, the first step is to create a job that puts all the records from your database into a Dataset. Then you can create a multiple-instance job to modify your schema like that: Input Dataset ---> Modify ---> Output Dataset. In the Modify stage, set the specification to "KEEP #PARAM1#", where PARAM1 is ...
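A minimal sketch of how that parameter could be filled for each invocation; the column names below are placeholders, not from the original post:

Invocation 1: PARAM1 = "cust_id, cust_name" -> the Modify stage applies KEEP cust_id, cust_name
Invocation 2: PARAM1 = "cust_id, cust_phone" -> the Modify stage applies KEEP cust_id, cust_phone

Each invocation then writes a different column subset to its own output Dataset.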
by Thomas.B
Wed Sep 09, 2015 2:20 am
Forum: General
Topic: Is there a way to access the purged datastage logs ?
Replies: 3
Views: 2550

Did you try using the Operations Console?
by Thomas.B
Mon Sep 07, 2015 4:05 am
Forum: General
Topic: Update all Parameter Sets to actual values
Replies: 10
Views: 4291

To avoid that mass update, you can set your global parameters to the value "$PROJDEF" and define their actual values in the DataStage Administrator client.
Updating the whole project will be much easier that way.
by Thomas.B
Tue Aug 18, 2015 6:44 am
Forum: General
Topic: ISX file clarification
Replies: 6
Views: 4407

Did you consider using the Information Server Manager source control?
by Thomas.B
Fri Aug 07, 2015 3:38 am
Forum: General
Topic: Sequence status is 99
Replies: 18
Views: 11215

Yes, passing the job name.invocation ID. That's strange, I just tried:
hJob = DSAttachJob('Jx_001_JobControl', DSJ.ERRFATAL)
Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
Here Status = 99. Then I tried:
hJob = DSAttachJob('Jx_001_JobControl.L_01', DSJ.ERRFATAL)
Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS) ...
by Thomas.B
Wed Aug 05, 2015 2:03 am
Forum: General
Topic: Sequence status is 99
Replies: 18
Views: 11215

To get the real status of the job, 'JobName' has to contain both the job name and the invocation ID.
by Thomas.B
Wed Jul 15, 2015 3:39 am
Forum: General
Topic: Null Handling Function on Environment Variable.
Replies: 7
Views: 3438

Job parameters can't be null; your job will abort if they are. To check whether the parameters are actually supplied by the main program, you can initialize them with a default value and, in the job, check for that value. Ex: Parameter: LPAR_TMP -> Default value: 'NULL'. In a Transformer: If LPAR_TMP = 'NULL'...
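A minimal sketch of one way that check can be used in a parallel Transformer derivation, assuming the output column is nullable and should end up null when the parameter was not supplied (this particular fallback is an assumption, not from the original post):

If LPAR_TMP = 'NULL' Then SetNull() Else LPAR_TMP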
by Thomas.B
Tue Jul 07, 2015 8:15 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Rounding Issue
Replies: 3
Views: 3303

DataStage will adapt the input data to the output format, so, from a decimal field to a smallint field, just mapping the field will do the required conversion.
by Thomas.B
Thu Jun 25, 2015 2:55 am
Forum: General
Topic: Datastage path and export clarification
Replies: 5
Views: 4118

Is >That< the link you want?
by Thomas.B
Mon Jun 22, 2015 1:48 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ^Z characters
Replies: 6
Views: 3699

You can compare the 'DSParams' files from your projects to see which parameter is different.
The 'DSParams' files are in your project folders, e.g.:
/opt/datastage/IBM/InformationServer/Server/Projects/ADMIN
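A minimal sketch of that comparison, assuming two projects named PROJECT_A and PROJECT_B under the same install path (the project names are placeholders):

diff /opt/datastage/IBM/InformationServer/Server/Projects/PROJECT_A/DSParams /opt/datastage/IBM/InformationServer/Server/Projects/PROJECT_B/DSParams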