DSD.SEQOpen Failed

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Post Reply
patonp
Premium Member
Posts: 110
Joined: Thu Mar 11, 2004 7:59 am
Location: Toronto, ON

DSD.SEQOpen Failed

Post by patonp »

A developer at a client site has approached me with a strange problem which I thought I'd post here...

He's developed a very complex job with roughly 100 stages that is intermittently reporting the following error:

Jobname..PreSort_020.ToSort_020: DSD.SEQOpen Failed to open filename.txt in directory H:\Temp
STATUS = 13.

The error appears to be related to a read from a Sequential File stage that is both written to and read from within the same job. However, when we look in the directory named in the error message, the file does exist. The stage that follows the Sequential File stage is a Transformer with a before-stage subroutine that executes a DOS command-line sort reading from the sequential file.

Any ideas why this might be happening?
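One guess on my part: if the STATUS value is simply the operating system's errno passed through (an assumption, not something the DataStage message confirms), then 13 would be EACCES, permission denied, which would suggest the file is locked or inaccessible at the moment of the open rather than missing. Any scripting language can decode such a number; a quick Python sketch:

```python
import errno
import os

# DataStage reports only the numeric STATUS; assuming it is a
# pass-through of the OS errno, the errno module can decode it.
status = 13
name = errno.errorcode[status]   # symbolic name, e.g. EACCES
text = os.strerror(status)       # human-readable message
print(name, "-", text)
```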
I_Server_Whale
Premium Member
Posts: 1255
Joined: Wed Feb 02, 2005 11:54 am
Location: United States of America

Post by I_Server_Whale »

This forum has a search facility as well. I did a search for you and came up with this LINK.

Hope that helps,

Thanks,
Naveen.
Anything that won't sell, I don't want to invent. Its sale is proof of utility, and utility is success.
Author: Thomas A. Edison 1847-1931, American Inventor, Entrepreneur, Founder of GE
patonp
Premium Member
Posts: 110
Joined: Thu Mar 11, 2004 7:59 am
Location: Toronto, ON

Post by patonp »

Thanks for the response, Naveen.

The link you mentioned suggests that two separate jobs be used rather than attempting to read from and write to the same file in the same job. However, I've seen many jobs at numerous sites read from and write to the same file in a single job without a problem, and it's interesting that this error is popping up in a job with a high level of complexity (i.e. many parallel streams writing concurrently to many files). Also, would the suggested change apply to the use of hash files? It's very common for me to see jobs in which a hash file is populated in the same job in which it is referenced.

I'm hesitant to rule out a useful function that is allowed by the tool until I better understand the root cause of the problem. More specifically, I'm curious to know if other users of the product are aware of any limitations or "best practices" governing reads and writes to the same file within a single job, as illustrated below:

... Transformer --> Sequential File --> Transformer ...

Thanks!

Peter
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

The other post is specific to the problem of attempting to read from and write to a sequential file at the same time, not simply in the same job - they aren't the same thing. Your design as posted is fine: the writer link will create and write to the sequential file, then finish and close it before the reader link opens the file and begins to read it. Assuming both links refer to the same filename, of course - which isn't a requirement. :wink:
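In sequence terms, the safe pattern is: write everything, close, only then read. A minimal sketch of that ordering (plain Python, with a hypothetical stand-in for the job's sequential file):

```python
import os
import tempfile

# Hypothetical stand-in for the sequential file the job writes and reads.
path = os.path.join(tempfile.mkdtemp(), "filename.txt")

# Writer link: create the file, write every row, and close it ...
with open(path, "w") as out:
    out.write("row1\nrow2\n")

# ... only after the writer has closed does the reader link open it.
with open(path) as inp:
    rows = inp.read().splitlines()

print(rows)  # ['row1', 'row2']
```

Reading while the writer still holds the file open is the case the other post warns about; the write-close-then-read ordering above is what a single job with one writer link and one reader link on the same file effectively does.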

Your problem sounds more resource related to me. :?
-craig

"You can never have too many knives" -- Logan Nine Fingers
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

patonp wrote:Also, would the suggested change apply to the use of hash files? It's very common for me to see jobs in which a hash file is populated in the same job in which it is referenced.
Hashed files are not the same as sequential files in these regards. It's perfectly fine to read and write to the same file in the same job.
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
Post Reply