After Stage Subroutine
Hi,
We are in the process of migrating from DataStage 5.1 to 7.5, with the 5.1 jobs on one server and the 7.5 jobs on another. When I process the data, the job should create a .out file on the server. However, the file is erased from the server when there is no data in it. The erase is done by an after-stage subroutine, based on one of the common variable values: if the value is zero, it deletes the file, otherwise it keeps it.
When I process the same set of data with both versions of the jobs, the .out file is created by the 5.1 jobs but not by the 7.5 jobs. I can see the data passing to the .out file (written to a sequential file), but it is deleted by the end of the job run.
Any ideas?? Appreciate your help.
Thanks
dsx
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
OK. The job has its stages in the following sequence:
ODBC stage (O1) --> Transf Stage (T1) --> Seq.File Stage (S1) --> Transf Stage (T2) --> ODBC Stage (O2)
In the constraints section of stage T1, I included a routine "CountRowOut" to count the number of rows passing to the sequential file S1. Following is the code of this routine. It has two input parameters: Command and OutputIndicator.
In Transformer stage T2, I have a routine "CleanupFileOut", which is meant to delete the file if it is empty; its code is further below. That routine checks the common variable "FileOutRows" against zero. I assume this comparison is failing, as the job initially writes the data to the file but deletes it by the end of the run. First, the counting routine:

COMMON /CountFileOut/ Status, Reason, FileOutRows
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!
! COUNT FILE OUT
! - PERFORM FILE OUT ROW COUNT TO DETERMINE WHETHER TO KEEP FILE OUT FILE
!
! Command ==> 1 ==> set common and return status
! 2 ==> set common and return reason
! 3 ==> return status
! 4 ==> return reason
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
! COMMANDS 1 AND 2 WILL PERFORM THE REQUIRED VALIDATIONS AND SET THE VALUE OF THE COMMON
! AREA VARIABLES - REASON (CONTAINS ERROR MSGS) AND STATUS ('TRUE' = ERRORS WERE FOUND).
! COMMANDS 3 AND 4 WILL ONLY ACCESS THE VALUES ALREADY SET IN THE COMMON AREA VARIABLES
! AND RETURN THE REQUESTED VALUE.
If Command = 1 Or Command = 2 Then
   Reason = ""
   Status = @TRUE
   ! COUNT OUT FILE ROWS
   If IsNull(OutputIndicator) Then
      Status = @FALSE
   End
   Reason = Trim(Reason)
   If Status Then
      FileOutRows += 1
   End
End
Begin Case
   Case Command = 1
      Ans = Status
   Case Command = 2
      Ans = Reason
   Case Command = 3
      Ans = Status
   Case Command = 4
      Ans = Reason
End Case
This routine works well in DS 5.1 but not in 7.5. Appreciate your help. The code of CleanupFileOut is below.

COMMON /CountFileOut/ Status, Reason, FileOutRows
! CLEAN UP FILE OUT FILE
! IF NO RECORDS POSTED TO FILE, DELETE 'FILE OUT' FILE
ErrorCode = 0 ; * set this to non-zero to stop the stage/job
DirName = Field(InputArg,"~",1)
FileName = Field(InputArg,"~",2)
dttm = Field(InputArg,"~",3)
If FileOutRows = 0 Then
   Command = 'Erase "':DirName:'\':FileName:'.':dttm:'.fileout.csv"'
End
Call DSExecute("NT", Command, Output, SystemReturnCode)
Thanks
dsx
I cannot see in CountRowOut where the row count is calculated, nor where Reason is set to anything but "". On that basis, FileOutRows seems never to be set, and would therefore always be 0 when accessed from CleanupFileOut, so the Erase command would always be executed.
Check that your routine code for both routines is exactly the same on the version 5 system and the version 7 system.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Wrapping your code in Code tags (to preserve the indenting) would have made that easier to see, though I admit fault for missing it. How about modifying the after-stage subroutine with some diagnostic statements so that you can see exactly what's happening?
What is the value of Command if FileOutRows is non-zero? You appear to invoke DSExecute outside the scope of the If test.
$DEFINE DEBUGGING
If FileOutRows = 0 Then
   Command = 'Erase "':DirName:'\':FileName:'.':dttm:'.fileout.csv"'
$IFDEF DEBUGGING
   Msg = "FileOutRows is zero, Command is " : Command
   Call DSLogInfo(Msg, "Debugging")
$ENDIF
End Else
$IFDEF DEBUGGING
   Msg = "FileOutRows is " : FileOutRows : ", Command is "
   If UnAssigned(Command)
   Then Msg := "unassigned."
   Else If IsNull(Command) Then Msg := "null." Else Msg := Command
   Call DSLogInfo(Msg, "Debugging")
$ENDIF
End
Call DSExecute("NT", Command, Output, SystemReturnCode)
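The scoping hazard being probed here can be restated as a minimal sketch (Python, with hypothetical names, not DataStage code): Command is only assigned when the row count is zero, yet the execute call runs unconditionally, so a non-zero count means executing a variable that was never assigned. A defensive version builds the command only when it exists and makes the caller check:

```python
def build_erase_command(file_out_rows: int, path: str):
    """Build the erase command only when no rows were written; else None."""
    if file_out_rows == 0:
        # Mirrors Command = 'Erase "..."' in the original routine.
        return f'Erase "{path}"'
    return None

# Unlike the original routine, which calls DSExecute regardless of whether
# Command was ever assigned, the caller here must check before executing.
cmd = build_erase_command(30, r"d:\smdpi\outbox\fileout.csv")
if cmd is not None:
    print(cmd)  # not reached: 30 rows were written, so cmd is None
```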
When I implemented the above logic in the CleanupFileOut routine, the message logged was "(Debugging): FileOutRows is zero, Command is Erase "d:\smdpi\outbox\73434.20080112_154657.fileout.csv"". However, when I slightly modified the code of the CountFileOut routine to include a log message, as below, the message logged was "(Debugging): FileOutRows is 1". Similarly, 29 more messages were logged, with FileOutRows values from 2 to 30. I assume the CleanupFileOut routine is unable to access the value of the FileOutRows common variable.
! COUNT OUT FILE ROWS
If IsNull(OutputIndicator) Then
   Status = @FALSE
End
Reason = Trim(Reason)
If Status Then
   FileOutRows += 1
$DEFINE DEBUGGING
$IFDEF DEBUGGING
   Msg = "FileOutRows is " : FileOutRows
   Call DSLogInfo(Msg, "Debugging")
$ENDIF
End
dsx
If CleanupFileOut is an after stage subroutine with a compatible COMMON declaration, then it's in the same process and should be able to access the variables declared to be in that COMMON area.
If CleanupFileOut is an after job subroutine then it's in a different process and would not be able to access the variables in the stage's COMMON area.
According to your original post the first scenario applies, so I am at a loss to explain what's going on here.
It's tedious, I know, but can you put in some debugging statements to show us what's happening with the Reason and Status variables?
Using a substitution conversion is a cute way of handling the possibility of NULL.
$IFDEF DEBUGGING
   Message = "Intermediate variables"
   Message<-1> = "Reason = " : Oconv(Reason, "S*;*;'NULL'")
   Message<-1> = "Status = " : Oconv(Status, "S*;*;'NULL'")
   Call DSLogInfo(Message, "Debugging")
$ENDIF
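For readers unfamiliar with the substitution conversion: as used here, it passes a non-null value through unchanged but substitutes the literal text NULL when the value is the SQL null, so the log line is always printable. A rough Python analogue (illustrative only, not DataStage code):

```python
def subst_null(value, placeholder="NULL"):
    """Rough analogue of Oconv(value, "S*;*;'NULL'"):
    pass the value through, but show a placeholder when it is null."""
    return placeholder if value is None else value

# Safe to concatenate into a log line even when the variable is null:
line = "Status = " + str(subst_null(None))  # "Status = NULL"
```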
When I included the given debugging statements, I found 30 rows logged with:

Intermediate variables
Reason =
Status =

CountFileOut is a Transform Function and is called in the constraints section of the Transformer stage, with the input parameter "Command" set to 1 and the OutputIndicator value coming from the previous ODBC stage. CleanupFileOut is an After-Stage routine. From the log, I can very clearly see "(Debugging): FileOutRows is zero, Command is Erase d:\smdpi\outbox\73435.20080115_102657.fileout.csv".
Also, FYI, the COMMON declaration is identical in both the function and the after-stage routine.
dsx
If Status is empty then the row count will not be incremented, according to your logic. Add a debugging statement in the after-stage subroutine to report the values of the three COMMON variables immediately on entry.
Wait a second - you have two Transformers and are attempting to share the same COMMON area between them? And there's a passive stage between the two? I don't think this works any more after 5.1, because of the introduction of row-buffering and inter-process options.
I believe that the two Transformers are now totally separate processes, but you need to verify this. The easiest way is to go to Job Properties and look at the Performance tab. Disable "Use Project Defaults", as the project default may be to use inter-process buffering, which defeats COMMONs. Make sure nothing is checked and try your job again.
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
I understood that everything was in T2, which would discount that argument. If my assumption is wrong, then I'm with Ken: COMMON is unique to one process, and with an intermediate passive stage or with inter-process row buffering you force the active stages to run in separate processes, so the COMMON area of one is not accessible to the other.
Yes, we have two Transformer stages connected through a passive stage. The stage order looks as below:
ODBC stage (O1) --> Transf Stage (T1) --> Seq.File Stage (S1) --> Transf Stage (T2) --> ODBC Stage (O2)
In T1's constraints section I am calling the Transform Function "CountFileOut", passing 1 and a value from the ODBC stage. In T2, the "CleanupFileOut" routine is called as an After-Stage subroutine.
"Use Project Defaults" was checked; I unchecked it, and no other options on that tab are checked. Still the file is erased at the end of the job. I can see the file written to the server during the job run, so I think it is an issue with accessing the COMMON variable.
dsx
Yeah, I suspected as much. You're going to have to use a file to transport data between your two Transformers if you want a low impact change. Have your first routine write out a file with the row count and have the other read the file.
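The file-based handoff suggested above can be sketched outside DataStage. This is an illustrative Python version (file names and paths are hypothetical): the counting step persists the row count to a small file, and the cleanup step reads it back and deletes the output only when the count is zero.

```python
import os

def write_row_count(count_file: str, rows: int) -> None:
    """Counting step: persist the number of rows written."""
    with open(count_file, "w") as f:
        f.write(str(rows))

def cleanup_if_empty(count_file: str, out_file: str) -> bool:
    """Cleanup step: read the persisted count and delete out_file if zero.

    Returns True if the output file was deleted."""
    with open(count_file) as f:
        rows = int(f.read().strip() or "0")
    if rows == 0 and os.path.exists(out_file):
        os.remove(out_file)
        return True
    return False
```

Because both steps go through the filesystem, it no longer matters that the two routines run in separate processes; the count survives the process boundary the way the COMMON area did within a single process.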
Kenneth Bland
With the intermediate Sequential File stage, that shouldn't have worked in 5.1 either, unless (for whatever reason) DataStage allocated the same process to execute T2 as it had used to execute T1. No way to check, alas; I no longer have a 5.1 installation available.