Append to DataSet - expecting 3 segments, read 4 segments
Hi All,
Has anyone encountered a similar warning message like:
opt/bi/datasets/bloomberg/bb_bond_price_FromFile.ds, expecting 2 segments, read 3 segments.
I have 6 instances appending to the same dataset concurrently, and they finished OK apart from the above warnings. All the warning messages state the same content, but the counts differ, for example:
expecting 3 segments, read 4 segments
expecting 5 segments, read 6 segments... etc.
Any advice, other than a Message Handler, to avoid the warning is appreciated!
Thanks!
Pneuma Lin.
pneumalin@yahoo.com
The job design is very simple. It contains only 3 stages: a Source File, a Transformer, and a target DataSet. The job simply uses "append", since it is called by 6 instances, each reading a different source file and appending to the same dataset.
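For readers outside DataStage, the topology above can be sketched in plain Python: six concurrent "instances", each reading its own source file and appending to one shared target. This is only a hypothetical illustration of the pattern (file names and helper names are invented), not the DataStage DataSet mechanism itself.

```python
# Sketch of the job topology: 6 concurrent workers, each reading its own
# source file and appending to one shared target file.
import os
import tempfile
import threading

def append_instance(source_path, target_path):
    # Append mode: every write lands at the current end of the target.
    with open(source_path) as src, open(target_path, "a") as tgt:
        for line in src:
            tgt.write(line)

workdir = tempfile.mkdtemp()
target = os.path.join(workdir, "target.txt")
open(target, "w").close()  # start with an empty target

sources = []
for i in range(6):
    path = os.path.join(workdir, f"source_{i}.txt")
    with open(path, "w") as f:
        f.write(f"row from instance {i}\n")
    sources.append(path)

# Run all 6 appenders concurrently, then wait for them to finish.
threads = [threading.Thread(target=append_instance, args=(s, target))
           for s in sources]
for t in threads:
    t.start()
for t in threads:
    t.join()

with open(target) as f:
    rows = f.readlines()
print(len(rows))  # each instance contributed one row
```

A DataSet is not a flat file, of course; the point of the sketch is only that several independent writers are racing to extend one shared target, which is the situation producing the segment-count warnings.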
Pneuma Lin.
pneumalin@yahoo.com
Arnold,
I knew someone would ask this question; I forgot to mention it in my previous update.
I used 1 node only!
Thanks!
Pneuma Lin.
pneumalin@yahoo.com
Only one partition (partition #0) showed up, and the node name is node1.
We just use the default.apt that is configured to one node only.
Pneuma Lin.
pneumalin@yahoo.com
I am wondering how it allowed you to write to a dataset in parallel without giving all those errors that I got when I did the same.
Why can't you serialize the flow, unless you are otherwise required to write in parallel?
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
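Kumar's suggestion to serialize the writers can be sketched outside DataStage with an exclusive lock file, so only one appender touches the target at a time. This is a minimal Python illustration using POSIX `fcntl.flock`; the function and path names are invented for the sketch and are not part of any DataStage API.

```python
import fcntl

def append_serialized(source_path, target_path, lock_path):
    # Take an exclusive lock before appending; concurrent callers block
    # here until the current writer releases the lock, so appends never
    # overlap even when many processes arrive at once.
    with open(lock_path, "w") as lock:
        fcntl.flock(lock, fcntl.LOCK_EX)
        try:
            with open(source_path) as src, open(target_path, "a") as tgt:
                tgt.writelines(src)
        finally:
            fcntl.flock(lock, fcntl.LOCK_UN)
```

With a wrapper like this, all 6 instances can still be scheduled concurrently (waiting for their files), but the actual appends happen one at a time.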
Kumar,
We have to schedule the 6 instances to wait for the files' arrival within a certain period, and we don't know exactly when each file will arrive. That's why we have to run them in parallel.
Thanks for your advice.
Pneuma Lin.
pneumalin@yahoo.com