Problem with datasets: one of the nodes is down.

Posted: Mon Sep 13, 2010 6:03 am
by suneyes
Hi,
A dataset was created yesterday on a 3-node configuration.
Currently, one of the 3 nodes is down.
We are now running a job on a 2-node configuration which uses this dataset. The job is failing with an error stating that one of the three partitions is not available.

We have the dataset partition file that was created on the node which is currently down. Can we in any way place this file on one of the two nodes currently running and point DataStage to pick up the third partition from there?

Posted: Mon Sep 13, 2010 6:20 am
by mhester
Unfortunately I do not believe that you can restore the partition to one of the other nodes and have it work. The locations of the partitions are recorded within the .ds header. You can certainly use a different configuration in your job than the one used to create the dataset, but I believe that all of the nodes that were used to create the dataset must be available when reading it.
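
For what it's worth, you can see what the descriptor records without editing it. Just a sketch, assuming a Unix engine tier and a descriptor called /data/project/mydata.ds (both names made up here):

    # The .ds descriptor is largely binary, but the node names and
    # resource-disk paths it records show up as plain strings.
    strings /data/project/mydata.ds | grep -i node
    strings /data/project/mydata.ds | grep '^/'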

I am researching this a bit more and will let you know.

Posted: Mon Sep 13, 2010 6:48 am
by suneyes
mhester wrote: Unfortunately I do not believe that you can restore the partition to one of the other nodes and have it work. The locations of the partitions are recorded within the .ds header. You can certainly use a different configuration in your job than the one used to create the dataset, but I believe that all of the nodes that were used to create the dataset must be available when reading it.

I am researching this a bit more and will let you know.

Thanks Mike,
Please enlighten me if you find some other way to make the dataset work.

Posted: Mon Sep 13, 2010 6:48 am
by ArndW
The path information for the components is contained in the .ds file, which is a binary file and shouldn't be edited with a text editor. Can you simulate a logical link to that path using a symbolic link?
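
Something along these lines, if the data files can be recovered. Just a sketch; /disk/node3/datasets and /disk/node1/datasets are made-up paths standing in for the failed and working nodes' resource disks:

    # copy the orphaned partition's data files from a backup of the
    # failed node onto a surviving node's disk
    mkdir -p /disk/node1/datasets
    cp /backup/node3/datasets/* /disk/node1/datasets/

    # recreate the path recorded in the .ds descriptor as a symbolic
    # link so the old location still resolves
    mkdir -p /disk/node3
    ln -s /disk/node1/datasets /disk/node3/datasets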

Posted: Mon Sep 13, 2010 6:59 am
by Sainath.Srinivasan
Two options:
1.) Restore the dataset partitions from the failed node onto a new node and arrange things so they can still be referenced under the old particulars (the paths recorded in the descriptor).
2.) Perform a node-by-node copy of the datasets on the available nodes and append the failed node's portion, forcing a new configuration file property on the newly created datasets.

Posted: Thu Sep 16, 2010 10:23 pm
by suneyes
Thanks for the replies, folks.

As we are running on an SMP configuration, the dataset files created by node2 are still accessible, since all the logical nodes share the same storage.

The IBM help desk provided a solution to the problem: they suggested changing the configuration file so that node2 points to a working node.

This worked fine for us.
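
For reference, the edit was only to the APT configuration file: the node2 entry now points at a working host and its disks. A sketch of what the changed file looks like, with placeholder hostnames and paths rather than our real ones:

    {
        node "node1"
        {
            fastname "serverA"
            pools ""
            resource disk "/data/ds/node1" {pools ""}
            resource scratchdisk "/scratch/node1" {pools ""}
        }
        node "node2"
        {
            fastname "serverA"
            pools ""
            resource disk "/data/ds/node2" {pools ""}
            resource scratchdisk "/scratch/node2" {pools ""}
        }
        node "node3"
        {
            fastname "serverA"
            pools ""
            resource disk "/data/ds/node3" {pools ""}
            resource scratchdisk "/scratch/node3" {pools ""}
        }
    }

The key change was the fastname (and disk paths, where needed) under node "node2", which previously referenced the server that is down.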

Posted: Fri Sep 17, 2010 6:10 am
by chulett
Can you share with us how they had you "change the configuration file"? I was under the impression that they were generally not editable. :?

Posted: Fri Sep 17, 2010 6:23 am
by ray.wurlod
This would be the configuration file (not the Data Set descriptor file), and I'd think the advice would be to change it so as not to reference the node that is down. The Data Set (or at least the available part of it) could still be read.

The -x option for the orchadmin command (with a new configuration file) can have the same effect.
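
Something like the following, roughly. A sketch only: the configuration file path and dataset name are placeholders, and the exact placement of the orchadmin options can differ between versions, so check the command reference:

    # point the parallel engine at the repaired configuration file
    export APT_CONFIG_FILE=/opt/IBM/InformationServer/Server/Configurations/twonode.apt

    # ask orchadmin to honour that configuration (-x) rather than the
    # one recorded in the descriptor, then try reading the dataset
    orchadmin dump -x /data/project/mydata.ds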

Posted: Fri Sep 17, 2010 6:42 am
by chulett
Ah... gotcha.

Posted: Tue Sep 21, 2010 7:02 am
by suneyes
chulett wrote:Can you share with us how they had you "change the configuration file"? I was under the impression that they were generally not editable. :?

Yes, in our case the changes were made to the configuration file.
But modifying the dataset descriptor file can also be done.

It can be edited in hexadecimal mode in a text editor like UltraEdit.

I just changed the references to the failed node to a working node, and the Data Set Management tool was able to read the contents of that dataset.