Problem with accessing sequential file in a shared container

Post questions here relating to DataStage Server Edition, for areas such as Server job design, DS Basic, Routines, and Job Sequences.

Jdrost
Participant
Posts: 17
Joined: Wed Jun 28, 2006 3:32 am

Problem with accessing sequential file in a shared container

Post by Jdrost »

Hi,

I have developed a shared container that adds error messages to an error log (a sequential file). The file name is hard-coded. I use this container in several (up to 20) server jobs. When I run the sequence with all these server jobs executing in parallel, at some point a job fails with the following message:

'Inflow_DB_DBSTP_clean..AddErrorLog.ERROR_LOG.ERROR_LOG_OUT: DSD.SEQOpen Unable to create file C:\LPGP_D_CRISKM_PROJECT\ErrorLog.csv.'

I think this error occurs because one job tries to access the error file while it is locked for update by another job (is that right?). The only solution I can think of is to execute the jobs in sequence instead of in parallel.

Does anyone know a better solution for this problem?
Kind regards,

Johannes Drost
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

Two or more processes cannot write to the same sequential file simultaneously. Consider using a job parameter in the file name so that each job writes to its own file.
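
For example, the file-name property of the Sequential File stage inside the shared container could embed a job parameter. Here #JobName# is a hypothetical parameter that each calling job would set to its own name; the resulting per-job paths are illustrative:

Code:

C:\LPGP_D_CRISKM_PROJECT\ErrorLog_#JobName#.csv

Each job then appends only to its own log, and the files can be concatenated afterwards if a single consolidated log is needed.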
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
DeepakCorning
Premium Member
Posts: 503
Joined: Wed Jun 29, 2005 8:14 am

Post by DeepakCorning »

If you trigger two jobs containing the shared container at the same time, you will get the same error, because both jobs try to write to the sequential file simultaneously. Try using a Hashed File instead.
Jdrost
Participant
Posts: 17
Joined: Wed Jun 28, 2006 3:32 am

Post by Jdrost »

DeepakCorning wrote: If you trigger two jobs containing the shared container at the same time, you will get the same error, because both jobs try to write to the sequential file simultaneously. Try using a Hashed File instead.
Why use a hashed file? Is it possible for multiple jobs to read from and write to a hashed file simultaneously?
Kind regards,

Johannes Drost
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

Hashed files can handle multiple processes reading and writing, as long as they are not accessing the same row; if each job writes rows with keys that cannot collide (for example, keys that incorporate the job name), there is no contention. Avoid using the locking mechanisms, as that is a bad technique.
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
DeepakCorning
Premium Member
Posts: 503
Joined: Wed Jun 29, 2005 8:14 am

Post by DeepakCorning »

Yes, a Hashed File can be written to by many jobs at once, and hence is a good replacement for a sequential file in a shared container.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

A Hashed File is simply another mechanism for implementing a database table. So imagine that you're inserting new rows into a database table.

Create the hashed file with a UV stage but edit the DDL so that you get auto-generated key values.

Code:

CREATE TABLE ErrorTable (
   SurrKey INTEGER NOT NULL DEFAULT NEXT AVAILABLE,
   ErrCode VARCHAR(20) NOT NULL DEFAULT '',
   ErrText VARCHAR(254) NOT NULL DEFAULT '',
   JobName VARCHAR(254) NOT NULL DEFAULT '',
   JobStartTimeStamp CHAR(19)
);
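
As a minimal usage sketch, assuming the table above has been created (the column values here are invented for illustration), each job would insert its error rows like this, omitting SurrKey so that UniVerse assigns it via NEXT AVAILABLE; that is why concurrently running jobs never collide on a key:

Code:

INSERT INTO ErrorTable (ErrCode, ErrText, JobName, JobStartTimeStamp)
VALUES ('E042', 'Reject: invalid customer key', 'Inflow_DB_DBSTP_clean', '2006-06-28 03:32:00');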
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.