
expecting 1 segments, read 2 segments

Posted: Tue Aug 12, 2008 4:48 am
by snt_ds
Hi All,

While I am trying to append to the same dataset from different jobs, I am getting the following warning, although the job runs fine. Could anyone elaborate on this warning?

main_program: Dataset: /detld2/etl/GPA/WM/log//Exception_613.ds, expecting 1 segments, read 2 segments.

Your help will be greatly appreciated.
Thanks in Advance
Ramesh Venkata

Posted: Tue Aug 12, 2008 5:45 am
by balajisr
Are you using the same configuration file in both jobs?

Posted: Tue Aug 12, 2008 5:46 am
by mahadev.v
Are you using the same configuration file? And the same partitioning on the input to the Data Set, for both jobs?

Posted: Tue Aug 12, 2008 5:52 am
by ray.wurlod
Using the Data Set Management utility, determine how many segments (data files) exist on each partition.

I'd be concerned about the // in the pathname also. But this is unrelated to any message about segments.
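
If you prefer the command line, orchadmin can report the same information. Roughly (syntax from memory, so verify against your version, and run it with your DataStage environment sourced; the descriptor path is taken from your warning message):

orchadmin describe /detld2/etl/GPA/WM/log/Exception_613.ds

Compare the "Number of Segments" it reports with the number the job says it expected.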

Posted: Tue Aug 12, 2008 6:18 am
by snt_ds
Hi,

Actually, multiple jobs in a sequence are using the same (.ds) file and the file is being appended to; we are using the default partitioning method (Auto).

We are using Parallel execution mode in all the jobs. Will the problem be solved if we change the option to Sequential in all the jobs?

And when we view the dataset file in Unix, it shows two nodes.
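
(By "view in Unix" I mean looking at the descriptor (.ds) file directly, roughly along these lines; the exact command is illustrative only:

strings /detld2/etl/GPA/WM/log/Exception_613.ds | grep -i node

That is where the two node names show up.)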

Thanks for all your responses.

Posted: Tue Aug 12, 2008 6:55 am
by snt_ds
Hi,

We even tried hash partitioning, but the same thing is happening.

Could you please share a possible solution to fix this?

Posted: Tue Aug 12, 2008 7:01 am
by ArndW
Is the dataset always being opened in "append" mode?

Posted: Tue Aug 12, 2008 7:08 am
by snt_ds
Yes, it should be in Append mode.

Posted: Wed Aug 13, 2008 1:25 am
by snt_ds
Hi,

Will it be a problem if we use Append mode?

Could you please throw some suggestions on how to solve this?

Posted: Wed Aug 13, 2008 1:50 am
by ray.wurlod
Did you read my earlier post? If so, did you do what I asked? If so, what did you see?

Posted: Wed Aug 13, 2008 2:41 am
by snt_ds
Hi Ray,

We have gone through the Data Set Management utility and the number of segments it shows is '0'.

We tried with a single '/' in the pathname, but we are getting the same error.

Moreover, sometimes it works and sometimes it does not.

Could you please give some suggestions to solve this?

Posted: Wed Aug 13, 2008 3:06 am
by ray.wurlod
You have to select one of the partitions for the Segments grid to be populated, and it is then populated with the pathnames of the segment files that are used on that partition.

Posted: Wed Aug 13, 2008 3:45 am
by snt_ds
Ray,

We used Hash and Same partitioning earlier, but the same issue is happening.

Posted: Wed Aug 13, 2008 4:25 am
by ray.wurlod
What, that the Data Set has no segments? Or is it 2 segments? Or 1? I am getting thoroughly confused. What - precisely - is your issue?

And what happened (in the Segments grid) when you selected a partition in the Data Set Management utility?

Posted: Wed Aug 13, 2008 6:37 am
by snt_ds
Hi Ray,

We have been using the same job with different invocation IDs. A few of the invocations run successfully and a few of them throw warnings like "expecting 4 segments, read 6 segments."

When we view the dataset in the Data Set Management utility, it shows the following:

Exception_745

##I TFCN 000001 06:34:42(000) <main_program>
Ascential DataStage(tm) Enterprise Edition 7.5.1A
Copyright (c) 2004, 1997-2004 Ascential Software Corporation.
All Rights Reserved

##I TUTL 000031 06:34:42(001) <main_program> The open files limit is 1024; raising to 10000.
##I TFSC 000001 06:34:43(000) <main_program> APT configuration file: /tmp/aptoa3229983eb88
Name: //detld2/etl/GPA/WM/log/Exception_745.ds
Version: ORCHESTRATE V7.5.1 DM Block Format 6.
Time of Creation: 08/13/08 04:29:55
Number of Partitions: 2
Number of Segments: 4
Valid Segments: 4
Preserve Partitioning: false
Segment Creation Time:
0: 08/13/08 04:29:55
1: 08/13/08 04:30:53
2: 08/13/08 04:31:11
3: 08/13/08 04:31:31

Partition 0
node : node1
records: 0
blocks : 0
bytes : 0
files :
Segment 0 :
/detld2/etl/ascential/scratch/datasets/Exception_745.ds.p475639.dasomg03.0000.0000.0000.16ad.ca6be683.0000.7c0032cb 0 bytes

Segment 1 :
/detld2/etl/ascential/scratch/datasets/Exception_745.ds.p475639.dasomg03.0001.0000.0000.19c3.ca6be6bd.0000.0e68ce63 0 bytes

Segment 2 :
/detld2/etl/ascential/scratch/datasets/Exception_745.ds.p475639.dasomg03.0002.0000.0000.1a47.ca6be6cf.0000.0ec89603 0 bytes

Segment 3 :
/detld2/etl/ascential/scratch/datasets/Exception_745.ds.p475639.dasomg03.0003.0000.0000.1ad5.ca6be6e3.0000.1a982633 0 bytes

total : 0 bytes
Partition 1
node : node2
records: 0
blocks : 0
bytes : 0
files :
Segment 0 :
/detld2/etl/ascential/scratch/datasets/Exception_745.ds.p475639.dasomg03.0000.0001.0000.16ad.ca6be683.0001.5c422c40 0 bytes

Segment 1 :
/detld2/etl/ascential/scratch/datasets/Exception_745.ds.p475639.dasomg03.0001.0001.0000.19c3.ca6be6bd.0001.befe3f18 0 bytes

Segment 2 :
/detld2/etl/ascential/scratch/datasets/Exception_745.ds.p475639.dasomg03.0002.0001.0000.1a47.ca6be6cf.0001.2b0e79b8 0 bytes

Segment 3 :
/detld2/etl/ascential/scratch/datasets/Exception_745.ds.p475639.dasomg03.0003.0001.0000.1ad5.ca6be6e3.0001.7706b068 0 bytes

total : 0 bytes

Totals:
records : 0
blocks : 0
bytes : 0
filesize: 0
min part: 0
max part: 0
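
To cross-check the descriptor against the disk, we can count the physical data files for this dataset (directory taken from the segment paths above; with 4 segments and 2 partitions there should be one file per segment per partition):

ls /detld2/etl/ascential/scratch/datasets/Exception_745.ds.* | wc -l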

Please throw some suggestions on this.