Question on Configuration file

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

JPalatianos
Premium Member
Posts: 306
Joined: Wed Jun 21, 2006 11:41 am

Question on Configuration file

Post by JPalatianos »

Hi,
We are beginning our migration from version 8.0.1 to 8.7 and I just tried running a couple of jobs. The first job reads data from a DB2 database and writes to a DataSet:

ODBC Enterprise ===> Transformer ===> DataSet

In version 8.0.1 we had the DataSet writing to D:\rldm_psdw\rldm\WorkFiles\RLDM_DLY_UNIVL_CUST_DIM_INIT.ds
and the config file was defined as:

Code: Select all

{
	node "node0"
	{
		fastname "PAERSCBBLA0681"
		pools ""
		resource disk "D:/IBM/InformationServer/Server/Datasets/node0" {pools ""}
		resource scratchdisk "D:/IBM/InformationServer/Server/Scratch/node0" {pools ""}
	}
}

In 8.7 we created a second drive (E:) to separate the work areas from the engine, which is installed on the D: drive.

The job now writes the DataSet to E:\GP_rldm\rldm\WorkFiles\RLDM_DLY_UNIVL_CUST_DIM_INIT.ds and the config file is defined as:

Code: Select all

{
	node "node1"
	{
		fastname "NJROS1A2495"
		pools ""
		resource disk "E:\TMP\Datasets" {pools ""}
		resource scratchdisk "E:\TMP\Scratch" {pools ""}
	}
	
}
When I run the job on the 8.7 server I receive the error:
E:/GP_rldm/rldm/WorkFiles/RLDM_DLY_UNIVL_CUST_DIM_INIT.ds: Error when checking operator: No partitions of E:/GP_rldm/rldm/WorkFiles/RLDM_DLY_UNIVL_CUST_DIM_INIT.ds are accessible from the nodes in the configuration file.

I tested the job by changing the dataset to E:\TMP\Datasets\RLDM_DLY_UNIVL_CUST_DIM_INIT.ds using the same config file as the job that failed, and this job ran successfully.

I am a bit confused as to what I should be coding in our configuration file.

Thanks - - John
atul9806
Participant
Posts: 96
Joined: Tue Mar 06, 2012 6:12 am
Location: Pune

Post by atul9806 »

Hi JPalatianos,
I may be wrong, but can you please check the path written in the configuration file?

Code: Select all

{
   node "node1"
   {
      fastname "NJROS1A2495"
      pools ""
      resource disk "E:\TMP\Datasets" {pools ""}
      resource scratchdisk "E:\TMP\Scratch" {pools ""}
   }
   
}
The separator should be '/' not '\':

Code: Select all

{
   node "node1"
   {
      fastname "NJROS1A2495"
      pools ""
      resource disk "E:/TMP/Datasets" {pools ""}
      resource scratchdisk "E:/TMP/Scratch" {pools ""}
   }
   
}
~Atul Singh
DataGenX: http://www.datagenx.net | LinkedIn: https://www.linkedin.com/in/atulsinghds
JPalatianos
Premium Member
Posts: 306
Joined: Wed Jun 21, 2006 11:41 am

Post by JPalatianos »

Hi Atul,
I changed the backslashes to forward slashes in the config file but am still getting the same error for the job:

E:/GP_rldm/rldm/WorkFiles/RLDM_DLY_UNIVL_CUST_DIM_INIT.ds: Error when checking operator: No partitions of E:/GP_rldm/rldm/WorkFiles/RLDM_DLY_UNIVL_CUST_DIM_INIT.ds are accessible from the nodes in the configuration file.
Thanks - - John
PaulVL
Premium Member
Posts: 1315
Joined: Fri Dec 17, 2010 4:36 pm

Post by PaulVL »

Are you in a clustered environment?

Is that E drive present on NJROS1A2495?

Also, I'd think about putting your scratch disk in a scratch pool. Not a requirement, but it might be handy later on if you define more than one scratch disk.
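
A minimal sketch of that idea, assuming you call the pool "scratch" (the name is arbitrary and only matters if a job or stage asks for that pool by name):

Code: Select all

{
	node "node1"
	{
		fastname "NJROS1A2495"
		pools ""
		resource disk "E:/TMP/Datasets" {pools ""}
		resource scratchdisk "E:/TMP/Scratch" {pools "" "scratch"}
	}
}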

Is E:/GP_rldm/rldm/WorkFiles/ a valid path?

Is the user id you are running with allowed to read/write to that path?
JPalatianos
Premium Member
Posts: 306
Joined: Wed Jun 21, 2006 11:41 am

Post by JPalatianos »

Thank you all for your suggestions. After I started testing the developer's issue, it appears not to be an issue with the config file at all. I was able to test by copying the existing job and renaming the output DataSet. The developer copied all the (.ds) files from the old server to the new one, and I believe that is what was causing the failure.

I went into the DataSet Management utility and deleted the DataSet in question. My problem is that I am still getting the same error for the original DataSet:
E:/GP_rldm/rldm/WorkFiles/RLDM_DLY_UNIVL_CUST_DIM_INIT.ds: Error when checking operator: No partitions of E:/GP_rldm/rldm/WorkFiles/RLDM_DLY_UNIVL_CUST_DIM_INIT.ds are accessible from the nodes in the configuration file.

Is there another way to completely clean up the dataset?
Thanks - - John
PaulVL
Premium Member
Posts: 1315
Joined: Fri Dec 17, 2010 4:36 pm

Post by PaulVL »

Since this is a new install, is there any reason you can't re-create the DS from its source?

You could dump the old dataset to a sequential file in setup #1, then read the sequential file and make a dataset in setup #2.
JPalatianos
Premium Member
Posts: 306
Joined: Wed Jun 21, 2006 11:41 am

Post by JPalatianos »

My goal is to maintain the name of the Dataset since it is used in many of the downstream jobs. I can easily create a Dataset under a new name, but not the original one, which I need to delete first.
priyadarshikunal
Premium Member
Posts: 1735
Joined: Thu Mar 01, 2007 5:44 am
Location: Troy, MI

Post by priyadarshikunal »

The .ds descriptor file keeps a copy of the configuration file with which the dataset was created, so when you try to rewrite it, that stored configuration is used to locate the previous data files.

Since you copied the descriptor from a different path and placed it in a new one, it cannot access those data files, hence the issue.

You can directly delete the .ds descriptor and the corresponding data files from the resource disk directory, but it is suggested to use orchadmin to delete the dataset, as it doesn't leave leftovers.

If you are deleting the .ds directly, make sure you delete the data files as well; they start with the dataset name and should be present in the resource disk directory, or in whatever directory they were copied to.
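
As a rough illustration, assuming the PXEngine bin directory is on your PATH and the descriptor points at a configuration the engine can actually reach, the orchadmin route would look something like this:

Code: Select all

REM show which data files and nodes the descriptor references
orchadmin describe E:\GP_rldm\rldm\WorkFiles\RLDM_DLY_UNIVL_CUST_DIM_INIT.ds

REM remove the descriptor together with its data files
orchadmin delete E:\GP_rldm\rldm\WorkFiles\RLDM_DLY_UNIVL_CUST_DIM_INIT.ds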
Priyadarshi Kunal

Genius may have its limitations, but stupidity is not thus handicapped. :wink:
JPalatianos
Premium Member
Posts: 306
Joined: Wed Jun 21, 2006 11:41 am

Post by JPalatianos »

Would you know where the orchadmin executable resides on a Windows installation (version 8.7)?
JPalatianos
Premium Member
Posts: 306
Joined: Wed Jun 21, 2006 11:41 am

Post by JPalatianos »

Just found it: "D:\IBM\InformationServer\Server\PXEngine\bin". Search was not working on our server. I will give this a try and let you know how it goes.
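
For the record, a hypothetical command-prompt session; the default.apt location below is an assumption, and orchadmin needs APT_ORCHHOME and APT_CONFIG_FILE if they are not already set in the environment:

Code: Select all

REM adjust these paths to your install; the default.apt location is an assumption
set APT_ORCHHOME=D:\IBM\InformationServer\Server\PXEngine
set APT_CONFIG_FILE=D:\IBM\InformationServer\Server\Configurations\default.apt
set PATH=D:\IBM\InformationServer\Server\PXEngine\bin;%PATH%

REM then the delete suggested above
orchadmin delete E:\GP_rldm\rldm\WorkFiles\RLDM_DLY_UNIVL_CUST_DIM_INIT.ds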
mandyli
Premium Member
Posts: 898
Joined: Wed May 26, 2004 10:45 pm
Location: Chicago

Post by mandyli »

Hi,

First of all, these datasets were created with the 8.1 config setup, so you can't use the same files with 8.7, since there are differences between 8.1 and 8.7.

Also, is there an ETL DataStage admin in your office?

This is the kind of admin work needed to fix this.

Can you please tell us about your DataStage environment?


Thanks
Man
priyadarshikunal
Premium Member
Posts: 1735
Joined: Thu Mar 01, 2007 5:44 am
Location: Troy, MI

Post by priyadarshikunal »

He is looking to delete the datasets, so that should not be a problem.

Orchadmin will not work here, as it will try to delete using the configuration file stored in the descriptor, and if you override that, the data files will not be deleted.

You just need to select the file and press Shift+Delete; on this system they are not usable anyway, so why even send them to the Recycle Bin :wink:

If you need that data, use orchadmin dump on the old system to get it into a text file, then create the datasets from that file on the new system.
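
Roughly, the dump step on the old server would look like this (whether the output goes to stdout and which options are available can vary by version, so check the orchadmin usage on that install):

Code: Select all

REM dump the dataset's records to a flat text file on the old (8.0.1) server
orchadmin dump D:\rldm_psdw\rldm\WorkFiles\RLDM_DLY_UNIVL_CUST_DIM_INIT.ds > RLDM_DLY_UNIVL_CUST_DIM_INIT.txt

The text file can then be read with a Sequential File stage on the new server and written back out to a fresh dataset there.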
Priyadarshi Kunal

Genius may have its limitations, but stupidity is not thus handicapped. :wink:
priyadarshikunal
Premium Member
Posts: 1735
Joined: Thu Mar 01, 2007 5:44 am
Location: Troy, MI

Post by priyadarshikunal »

mandyli wrote:First of all, these datasets were created with the 8.1 config setup, so you can't use the same files with 8.7, since there are differences between 8.1 and 8.7.
Would you mind enlightening me on the difference?
Priyadarshi Kunal

Genius may have its limitations, but stupidity is not thus handicapped. :wink:
JPalatianos
Premium Member
Posts: 306
Joined: Wed Jun 21, 2006 11:41 am

Post by JPalatianos »

Hi,
I am the DataStage Admin here. We have just installed DS version 8.7 FP 2 on Windows 2008 R2.

I am trying to work around an error that this one application is getting:
E:/GP_rldm/rldm/WorkFiles/RLDM_DLY_UNIVL_CUST_DIM_INIT.ds: Error when checking operator: No partitions of E:/GP_rldm/rldm/WorkFiles/RLDM_DLY_UNIVL_CUST_DIM_INIT.ds are accessible from the nodes in the configuration file.

When I create a copy of this job and write to any other Dataset, e.g. RLDM_DLY_UNIVL_CUST_DIM_INITtst2.ds, it works fine. Not sure what is causing the error when running with the original DS name RLDM_DLY_UNIVL_CUST_DIM_INIT.ds.
Thanks - - John
JPalatianos
Premium Member
Posts: 306
Joined: Wed Jun 21, 2006 11:41 am

Post by JPalatianos »

Looks like the app area that was running this job ran it multiple times, hardcoding the DS path on a few occasions. I was able to find all the DataSets and delete them, and the job runs fine now.
Thanks - - John