
Viewing But Not Reading

Posted: Wed Oct 25, 2006 11:16 am
by asitagrawal
Hi. In my job, when I click 'View Data...' on the source stage, some 19-20 rows of data are displayed.
But when I run the job, 0 (ZERO) rows are read!!

Please explain.

Posted: Wed Oct 25, 2006 11:32 am
by ray.wurlod
Not enough information. What other stages are in your job? Are you using a filter command? Does the job execute using the same user ID as you use when you view data? (Have you checked that?)
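
A generic way to check the user ID point (not something posted in the thread) is to query DB2's USER special register from both connections, the one Designer uses for View Data and the one the job uses at run time, and compare the answers:

SELECT USER          -- authorization ID this connection runs under
FROM SYSIBM.SYSDUMMY1;

If the two IDs differ, differing permissions between them could explain rows showing up in View Data but not in the run.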

Posted: Wed Oct 25, 2006 11:36 am
by asitagrawal
This is the design of my job:

DB2 Stage
|
|
V
Transformer ------> SeqFile
|
|
V
Shared Container

The DB2 stage has a join across 3 tables, which returns some 19 rows. I verified this by running the query from the DB2 Command Editor and also by clicking 'View Data'.

Now when I run the job, 0 rows are read and 0 rows are written to the SeqFile.
There are no constraints used in the job.

Posted: Wed Oct 25, 2006 11:40 am
by ray.wurlod
No WHERE clause in the DB2 stage? Please check the generated SQL (in the job log).

Posted: Wed Oct 25, 2006 11:43 am
by asitagrawal
ray.wurlod wrote:No WHERE clause in the DB2 stage? Please check the generated SQL (in the job log). ...

This is the FROM clause:

EPM1PA.PS_JRNL_HEADER H, EPM1PA.PS_JRNL_LN LN, EPM1PA.PS_MB_ASCNTLDTTM DT, EPM1PA.PS_MB_JRNL_LN_SIS SIS


This is the WHERE clause....

DT.MB_ASCNTLJOBNAME = '#JOBNAME#'
AND H.JRNL_HDR_STATUS = 'P'
AND H.POSTED_DATE >= DATE(DT.MB_JOBDTTM)
AND H.POSTED_DATE <= '#INPUTDATE#'
AND H.BUSINESS_UNIT = LN.BUSINESS_UNIT
AND LN.BUSINESS_UNIT = SIS.BUSINESS_UNIT
AND H.JOURNAL_ID = LN.JOURNAL_ID
AND LN.JOURNAL_ID = SIS.JOURNAL_ID
AND H.JOURNAL_DATE = LN.JOURNAL_DATE
AND LN.JOURNAL_DATE = SIS.JOURNAL_DATE
AND H.UNPOST_SEQ = LN.UNPOST_SEQ
AND LN.UNPOST_SEQ = SIS.UNPOST_SEQ
AND LN.JOURNAL_LINE = SIS.JOURNAL_LINE
AND LN.LEDGER = SIS.LEDGER;

Posted: Wed Oct 25, 2006 11:44 am
by DeepakCorning
Also, copy and paste the SQL into the SQL tool you have, run it, and see whether it returns any rows.
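
A note on the two #...# tokens in the WHERE clause above: they are DataStage job parameters, and View Data typically resolves them with the job's default values, while a run uses whatever values are supplied at run time. So when pasting the statement into a DB2 tool, it is worth substituting the exact run-time values. A minimal sketch, with hypothetical parameter values, of a row-count check built from the posted FROM and WHERE clauses:

SELECT COUNT(*)
FROM EPM1PA.PS_JRNL_HEADER H, EPM1PA.PS_JRNL_LN LN,
     EPM1PA.PS_MB_ASCNTLDTTM DT, EPM1PA.PS_MB_JRNL_LN_SIS SIS
WHERE DT.MB_ASCNTLJOBNAME = 'MY_LOAD_JOB'   -- hypothetical run-time value of #JOBNAME#
  AND H.JRNL_HDR_STATUS = 'P'
  AND H.POSTED_DATE >= DATE(DT.MB_JOBDTTM)
  AND H.POSTED_DATE <= '2006-10-25'         -- hypothetical run-time value of #INPUTDATE#
  AND H.BUSINESS_UNIT = LN.BUSINESS_UNIT
  AND LN.BUSINESS_UNIT = SIS.BUSINESS_UNIT
  AND H.JOURNAL_ID = LN.JOURNAL_ID
  AND LN.JOURNAL_ID = SIS.JOURNAL_ID
  AND H.JOURNAL_DATE = LN.JOURNAL_DATE
  AND LN.JOURNAL_DATE = SIS.JOURNAL_DATE
  AND H.UNPOST_SEQ = LN.UNPOST_SEQ
  AND LN.UNPOST_SEQ = SIS.UNPOST_SEQ
  AND LN.JOURNAL_LINE = SIS.JOURNAL_LINE
  AND LN.LEDGER = SIS.LEDGER;

If this returns 0 with the run-time values but 19 with the View Data defaults, the parameter values (for example the date format passed in #INPUTDATE#) are the place to look.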

Posted: Wed Oct 25, 2006 11:49 am
by asitagrawal
DeepakCorning wrote:Also, copy and paste the SQL into the SQL tool you have, run it, and see whether it returns any rows.
When I run the SQL, I get the desired result.
Also, when I do 'View Data' from the stage, I can see the desired rows being returned....

BUT
When I run the job,
0 rows read from PS_JRNL_HEADER_Link (DB2 to Transformer Link)
0 rows written to GL201_EPM_Intfc (Trans to SeqFile Link)
0 rows written to UpdateDttm (Trans to Shared Container Link).

Posted: Wed Oct 25, 2006 12:25 pm
by DeepakCorning
PS_JRNL_HEADER - sounds like PeopleSoft functionality. Does your job have an input link and an output link from the table (DRS) to a shared container? Are you viewing data on the output link from the DRS? As far as I know, the shared container you are talking about should get only one row (Max Updated Time), but you said the query you are running gives you 19 rows?

Posted: Wed Oct 25, 2006 12:44 pm
by narasimha
Do you have any warning/errors messages in the job logs?

Posted: Wed Oct 25, 2006 12:46 pm
by asitagrawal
DeepakCorning wrote:PS_JRNL_HEADER - sounds like PeopleSoft functionality. Does your job have an input link and an output link from the table (DRS) to a shared container? Are you viewing data on the output link from the DRS? As far as I know, the shared container you are talking about should get only one row (Max Updated Time), but you said the query you are running gives you 19 rows?
Don't worry about the shared container part... I am worried about the
DB2 -----> Trans -----> SeqFile links. Why is no data flowing on them?

No warnings and no errors!

Posted: Wed Oct 25, 2006 1:40 pm
by DeepakCorning
So when you look at the job log, is the query that runs when you trigger the job the same one you are running in the database?

Also (I know it sounds silly), but check whether you are connecting to the correct database while running the job (the same database where you are running the query and getting results).
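
One generic way to verify that (not from the thread itself) is to run the same one-row query from the SQL tool and through the job's DB2 connection and compare the results; CURRENT SERVER reports the database the connection is actually attached to:

SELECT CURRENT SERVER   -- name of the database this connection is attached to
FROM SYSIBM.SYSDUMMY1;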