Config file issue with orchadmin

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Post Reply
zulfi123786
Premium Member
Posts: 730
Joined: Tue Nov 04, 2008 10:14 am
Location: Bangalore

Config file issue with orchadmin

Post by zulfi123786 »

I am trying to dump a dataset using the orchadmin utility; the output is pasted below:


==> orchadmin dump -name /data/ds/hbus/dev/etl/datasets/copyAcctArrQc78.ds
##I TFCN 000001 04:56:21(000) <main_program>
Ascential DataStage(tm) Enterprise Edition 7.5.1A
Copyright (c) 2004, 1997-2004 Ascential Software Corporation.
All Rights Reserved


##I TFSC 000001 04:56:22(000) <main_program> APT configuration file: /tmp/aptoa27853769d0e243d
ARR_ID_ACCT:one ARR_ID_APP:two ACCT_SVCE_NUM:three MF_7_POS_FLD_1_A:four
##I TFOP 000059 04:56:22(000) <APT_RealFileExportOperator in APT_FileExportOperator,0> Export complete; 1 records exported successfully, 0 rejected.
##I TFSR 000010 04:56:23(000) <main_program> Step execution finished with status = OK.


The config file used, as per the output above, is /tmp/aptoa27853769d0e243d.

How do I point it to some other file? I have used the command below, but it doesn't seem to work.

export APT_CONFIG_FILE=/opt/tools/ds/Ascential/DataStage/Configurations/hbus-1Node.apt
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

That config file is just a temporary one, not the one stored in the dataset. Use "orchadmin describe {dataset}" or "orchadmin ll {dataset}" to get the configuration file information.
zulfi123786
Premium Member
Posts: 730
Joined: Tue Nov 04, 2008 10:14 am
Location: Bangalore

Post by zulfi123786 »

orchadmin describe also shows the same sort of temporary config file:


==> orchadmin describe /data/ds/hbus/dev/etl/datasets/copyAcctArrQc78.ds
##I TFCN 000001 05:57:43(000) <main_program>
Ascential DataStage(tm) Enterprise Edition 7.5.1A
Copyright (c) 2004, 1997-2004 Ascential Software Corporation.
All Rights Reserved


##I TFSC 000001 05:57:43(001) <main_program> APT configuration file: /tmp/aptoa1892592db6b77a0
Name: /data/ds/hbus/dev/etl/datasets/copyAcctArrQc78.ds
Version: ORCHESTRATE V7.5.1 DM Block Format 6.
Time of Creation: 02/27/2010 04:32:10
Number of Partitions: 1
Number of Segments: 1
Valid Segments: 1
Preserve Partitioning: false

##I TFOP 000059 05:57:44(000) <APT_RealFileExportOperator in APT_FileExportOperator,0> Export complete; 8 records exported successfully, 0 rejected.
##I TFSR 000010 05:57:44(000) <main_program> Step execution finished with status = OK.
ersunnys
Participant
Posts: 29
Joined: Wed Sep 13, 2006 1:39 pm
Location: Singapore

Post by ersunnys »

Hey,

A dataset in DataStage stores the config file information that was in effect when the dataset was written.

You do not use an external APT_CONFIG_FILE while reading a dataset; DataStage uses the config file information stored in the dataset at write time, so at read time it creates a temporary config file and uses that.

You cannot set the config file while reading a dataset. If you want to change the node configuration, you need to rewrite the dataset with the required config file.
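In other words, changing a dataset's node configuration amounts to rewriting it under the target config file. A minimal sketch of that, assuming the copy command of orchadmin behaves this way in your release (the target dataset name is hypothetical; the source path and config file are the ones from this thread):

```shell
# Point the environment at the desired configuration file first;
# this governs how the NEW dataset is written, not how the old one is read.
export APT_CONFIG_FILE=/opt/tools/ds/Ascential/DataStage/Configurations/hbus-1Node.apt

# Rewrite the dataset under that configuration (hypothetical target name).
orchadmin copy /data/ds/hbus/dev/etl/datasets/copyAcctArrQc78.ds \
               /data/ds/hbus/dev/etl/datasets/copyAcctArrQc78_1node.ds

# Confirm the configuration stored in the new dataset.
orchadmin describe /data/ds/hbus/dev/etl/datasets/copyAcctArrQc78_1node.ds
```

Equivalently, a trivial parallel job (Data Set stage to Data Set stage) run with $APT_CONFIG_FILE set to the target file achieves the same rewrite from within Designer.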

Sunny
Regards,
Sunny Sharma.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Sunny is mostly correct, but I do need to correct one fallacy. If a dataset is written with a 2-node configuration, there is nothing that prevents a job with a 4-node configuration from reading it; DataStage will automatically and implicitly repartition the dataset.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Review all the options of orchadmin, particularly the -x option.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Post Reply