A developer at a client site has approached me with a strange problem which I thought I'd post here...
He's developed a very complex job with roughly 100 stages that is intermittently reporting the following error:
Jobname..PreSort_020.ToSort_020: DSD.SEQOpen Failed to open filename.txt in directory H:\Temp
STATUS = 13.
The error appears to be related to reading from a sequential file stage that is both written to and read from within the same job. However, when we look in the directory specified by the error message, the file does exist. The stage that follows the sequential file stage is a transformer with a before-stage subroutine that executes a DOS command-line sort, reading from the sequential file.
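One clue, for what it's worth: if the STATUS value reported by DSD.SEQOpen is the underlying operating-system errno (an assumption on my part), then 13 maps to EACCES, "Permission denied", which would suggest a sharing or locking conflict rather than a missing file. A quick lookup on a POSIX system:

```python
import errno
import os

# Assuming STATUS = 13 is the OS-level errno, look up its symbolic
# name and message. On POSIX systems, errno 13 is EACCES.
status = 13
print(errno.errorcode[status], "-", os.strerror(status))
```

That would fit the symptom of the file existing but being unopenable while another process (e.g. the DOS sort) still holds it.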
Any ideas why this might be happening?
DSD.SEQOpen Failed
This forum has a search facility as well. I did a search for you and came up with this LINK.
Hope that helps,
Thanks,
Naveen.
Anything that won't sell, I don't want to invent. Its sale is proof of utility, and utility is success.
Author: Thomas A. Edison 1847-1931, American Inventor, Entrepreneur, Founder of GE
Thanks for the response Naveen.
The link you mentioned suggests that two separate jobs be used rather than attempting to read from and write to the same file in the same job. However, I've seen many jobs at numerous sites read from and write to the same file in a single job without a problem, and it's interesting that this error is popping up in a job with a high level of complexity (i.e. many parallel streams writing concurrently to many files). Also, would the suggested change apply to the use of hash files? It's very common for me to see jobs in which a hash file is populated in the same job in which it is referenced.
I'm hesitant to rule out a useful function that is allowed by the tool until I better understand the root cause of the problem. More specifically, I'm curious to know if other users of the product are aware of any limitations or "best practices" governing reads and writes to the same file within a single job, as illustrated below:
... Transformer --> Sequential File --> Transformer ...
Thanks!
Peter
The other post is specific to the problem of attempting to read from and write to a sequential file at the same time, not merely in the same job, which isn't the same thing. Your design as posted is fine: the writer link will create and write to the sequential file, then finish and close it before the reader link opens the file and begins to read it. That assumes both links refer to the same filename, of course, which isn't a requirement.
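The sequencing described above can be sketched like this (a minimal Python illustration, not how DataStage implements it internally; the temp directory is hypothetical, and "filename.txt" is taken from the error message):

```python
import os
import tempfile

# Sketch of the safe sequencing: the writer link creates and writes
# the file, then closes it, and only afterwards does the reader link
# open it. The temp directory here is hypothetical.
path = os.path.join(tempfile.mkdtemp(), "filename.txt")

# Writer link: create, write, close (the `with` block closes the file).
with open(path, "w") as writer:
    writer.write("row1\nrow2\n")

# Reader link: opens only after the writer has fully closed the file.
with open(path) as reader:
    rows = reader.read().splitlines()

print(rows)  # ['row1', 'row2']
```

Problems arise only when something tries to open the file for reading while a writer still holds it open, which is what the other post was about.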
Your problem sounds more resource related to me.
-craig
"You can never have too many knives" -- Logan Nine Fingers
patonp wrote: Also, would the suggested change apply to the use of hash files? It's very common for me to see jobs in which a hash file is populated in the same job in which it is referenced.

Hashed files are not the same as sequential files in these regards. It's perfectly fine to read from and write to the same hashed file in the same job.
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle