
BeforeJob does not appear in job log

Posted: Wed Aug 29, 2012 7:08 am
by jinm
BeforeJob routine called DeleteHashFiles.

Same setup is used in multiple jobs, but in one job, the log does not show that the routine has been called.

It should look like this:
Occurred: 00:34:10 Event: Starting Job PSRP5011010%rel40.0.0. (...)
Occurred: 00:34:10 Event: Environment variable settings: (...)
Occurred: 00:34:10 Event: PSRP5011010%rel40.0.0: Set NLS locale (..)
Occurred: 00:34:10 Event: PSRP5011010%rel40.0.0..BeforeJob (DeleteHashFile): PSRP5011010 (FOUND)
Occurred: 00:34:10 Event: PSRP5011010%rel40.0.0..lkp_02_Dim_Category.IDENT1: DSD.StageRun Active stage starting, tracemode = 0.

etc.

However in one job I only get this:

Occurred: 04:52:16 Event: Starting Job PSRP5102039%rel40.3.0. (...)
Occurred: 04:52:16 Event: Environment variable settings: (...)
Occurred: 04:52:16 Event: PSRP5102039%rel40.3.0: Set NLS locale to US-ENGLISH,US-ENGLISH,US-ENGLISH,US-ENGLISH,US-ENGLISH
Occurred: 04:52:16 Event: PSRP5102039%rel40.3.0..lkp_110_Hash_Date.IDENT4: DSD.StageRun Active stage starting, tracemode = 0.

completely skipping the BeforeJob routine.

The jobs are set up identically, etc.


Has anybody seen anything like that, and how can I correct it?

Posted: Wed Aug 29, 2012 7:38 am
by ArndW
I'd be surprised if the BeforeJob isn't called. Can you put "CALL DSLogWarn('Testing...','')" as the first line in the routine and see if that gets printed?

Posted: Thu Aug 30, 2012 12:09 am
by ray.wurlod
Can you check that released version 40.3.0 actually HAS a before-job subroutine?

Posted: Thu Aug 30, 2012 12:31 am
by jinm
It does - but I actually had the same thought - so no offence taken.
I looked into whether the log entry was only visible if a hash file was found, but I get lines with (FOUND) as well as (NOT FOUND).

It HAS worked earlier in the same release version of the job, so for some obscure reason it is skipped now.

Wondering if:
- too large a log file
- too large a DS project

could have an impact.
The job in release 40.3.0 is from July 13th, 2012,
and I have a log from Aug 15th (latest) where the before-job routine is visible in the log file.

@ArndW
Working on it, but since this is production for a pharma company, changing things is sometimes - hmmm - difficult at best.

Posted: Thu Aug 30, 2012 12:57 am
by ArndW
If you cannot change the job then look at the source code to see if there are any CALLs or other steps between the job beginning and the DSLogInfo() call that might trigger an abort of the before-job routine.

Posted: Fri Sep 07, 2012 7:45 am
by jinm
I have extracted the DSX file for the job and searched (and searched again) and compared it to another rather similar job to find any differences.
First, a job that actually works:

Code: Select all

BEGIN DSJOB
   Identifier "PSRP5083080"
   DateModified "2011-11-10"
   TimeModified "14.24.24"
   BEGIN DSRECORD
      Identifier "ROOT"
      DateModified "1899-12-30"
      TimeModified "00.00.01"
      OLEType "CJobDefn"
      Readonly "0"
      Name "PSRP5083080"
      Description "Building Master Fact for RightFirstTime Data."
      NextID "175"
      Container "V0"
      FullDescription =+=+=+=
Building Master Fact for RightFirstTime Data (PRODRFT).

Pivot many columns in one row to many rows.
=+=+=+=
      JobVersion "40.2.0"
      BeforeSubr "DSU.DeleteHashFiles\\PSRP5083030"
      ControlAfterSubr "0"
      Parameters "CParameters"
Now for the job that does NOT work, the entries are the same:

Code: Select all

=+=+=+=
      JobVersion "40.4.0"
      BeforeSubr "DSU.DeleteHashFiles\\PSRP5102039"
      ControlAfterSubr "0"
      Parameters "CParameters"
However, these entries are some 350 lines further down in the DSX file, since initially a lot of definition for

Code: Select all

BEGIN DSJOB
   Identifier "PSRP5102039"
   DateModified "2012-07-12"
   TimeModified "10.27.00"
   BEGIN DSRECORD
      Identifier "C165"
      DateModified "1899-12-30"
      TimeModified "00.00.01"
[b]      OLEType "CContainerStage"[/b]
takes place first.
It looks like the shared container stages could perhaps be a problem?


Does it make any sense?

Posted: Fri Sep 07, 2012 9:44 am
by ArndW
The before-job gets called before any stages in the job, so container usage is unlikely to affect this problem.

I meant to look at the source code of your "DeleteHashFiles" routine - what happens between the start and the first DSLogInfo() call that might terminate it?

Posted: Wed Sep 12, 2012 1:11 am
by jinm
Sorry if it looks a bit confusing (I have not done the coding myself).
However: the job name is PSRP5102039, and InputArg is the same, which means that we should be in the section where
"* Check the Arg String and if the String is equal to Job Name the Hash Files are
* in Default format. <jobname>_HASHFile_<lbnr>
JobName = substrings(DSGetJobInfo(DSJ.ME,DSJ.JOBNAME),1,len(InputArg))"
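
To illustrate that line with concrete values (my own worked example, assuming InputArg = "PSRP5102039", which is 11 characters):

Code: Select all

* Worked example of the prefix check (assumed values, not from the routine):
* DSGetJobInfo(DSJ.ME, DSJ.JOBNAME) returns "PSRP5102039"
* Len(InputArg) = 11
JobName = Substrings("PSRP5102039", 1, 11)   ;* yields "PSRP5102039"
* So InputArg = JobName holds, and TypeOfHashName becomes 'DEFAULT'.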

ALSO, I can find files PSRP51020_HashFile_1 (2, 3 etc.)
I believe there is something in the UPPER section of the variables, where the file name does not correspond with the actual name
(the routine expects HASHFile but finds HashFile???)

Code: Select all

Deffun DSRMessage(A1, A2, A3) Calling "*DataStage*DSR_MESSAGE"
*Common /HashLookup/ FileHandles(100), FilesOpened
Common /HashLookup/ FileHandles
FileOpenStatus = 0

* If InputArg is empty, log a Warning-type message and return 
* ErrorCode = 1 (job will abort if called as Before routine).
  InputArg = Trim(InputArg)
  If InputArg = "" Then
     Message = DSRMessage("DSTAGE_TRX_E_0011", "No command to execute.", "")
     GoTo ErrorExit
  End

* Else, just try to execute UniVerse command DELETE.FILE with string as parameter 
* TCL command, capturing all results.

* Check the Arg String and if the String is equal to Job Name the Hash Files are
* in Default format. <jobname>_HASHFile_<lbnr>
JobName = substrings(DSGetJobInfo(DSJ.ME,DSJ.JOBNAME),1,len(InputArg))


  If InputArg = JobName
  Then
     HashFileName = InputArg:'_HashFile_'
     HashFileNameUpper = InputArg:'_HASHFILE_'
     TypeOfHashName = 'DEFAULT'
  End Else
     HashFileName = Field(InputArg,',',1)
     TypeOfHashName = 'NOT DEFAULT'
  End

* Delete Hash Files.
* If default Hash File naming, the loop stops when the routine can't find the Hash file.
* If a Hash File list is given in the argument, all listed files are attempted deleted.
  if TypeOfHashName = 'DEFAULT'
  Then
     for i=1 to 40 step 1
           HashFileNameLbnr = HashFileName:i
           HashFileNameLbnrUpper = HashFileNameUpper:i
           Open HashFileNameLbnr To FileHandles 
           Then
              Message = HashFileNameLbnr:' (FOUND)'
              Call DSLogInfo(Message , RoutineName)
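
If the HASHFile / HashFile case mismatch is indeed the issue, one sketch (my own assumption, reusing the routine's HashFileNameUpper / HashFileNameLbnrUpper variables, which the excerpt builds but never opens) would be to fall back to the upper-case name when the mixed-case one is not found:

Code: Select all

* Sketch only: try the mixed-case name first, then the upper-case
* variant, so both <job>_HashFile_<n> and <job>_HASHFILE_<n> are found.
Open HashFileNameLbnr To FileHandles Then
   Call DSLogInfo(HashFileNameLbnr:' (FOUND)', RoutineName)
End Else
   Open HashFileNameLbnrUpper To FileHandles Then
      Call DSLogInfo(HashFileNameLbnrUpper:' (FOUND, upper-case name)', RoutineName)
   End Else
      Call DSLogInfo(HashFileNameLbnr:' (NOT FOUND)', RoutineName)
   End
End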

Posted: Wed Sep 12, 2012 2:40 am
by ArndW
If you change your code to read

Code: Select all

Deffun DSRMessage(A1, A2, A3) Calling "*DataStage*DSR_MESSAGE"
*Common /HashLookup/ FileHandles(100), FilesOpened
Common /HashLookup/ FileHandles
FileOpenStatus = 0

CALL DSLogWarn('--> At beginning of BeforeJob routine <--','')

* If InputArg is empty, log a Warning-type message and return 
* ErrorCode = 1 (job will abort if called as Before routine).
  InputArg = Trim(InputArg)
  If InputArg = "" Then
     Message = DSRMessage("DSTAGE_TRX_E_0011", Message = "No command to execute.", "")
     GoTo ErrorExit
  End
...
then compile it and run the erroneous job, you will be able to see whether the Before-Job routine is being called, since you will get a log warning entry.

p.s. I can't recall if the line "$INCLUDE dsinclude JOBCONTROL.H" is required here. If the compiler gives an error about DSLogWarn not being declared, then add this include file.

Posted: Wed Sep 12, 2012 3:58 am
by jinm
will try and get back
THX