Strange Problem

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

rasi
Participant
Posts: 464
Joined: Fri Oct 25, 2002 1:33 am
Location: Australia, Sydney

Strange Problem

Post by rasi »

Hi All,

I found a strange problem while running a DataStage job in version 6.0.1.5. The job run for Jan 2004 processes Dec 2003 data, and it picks up the measurement period from the header record. The total was 1.6 million records; the job ran fine and produced the target table with Dec 2003 data.

But we faced the problem when we re-ran the same job again without any modification. The source file was the same and nothing in the job had changed, yet the output date was Sep 2003, supposedly derived from the header record of the source file. I have no idea how DataStage picked up a Sep 2003 date from a header record that says Dec 2003. It also processed only 1.5 million records and reported that it had finished, while a record count of the file gave 1.6 million.

Even in last month's run we had the same problem; I re-compiled the job and then it was fine. I did the same this month: I re-compiled, ran the job, and it was fine.

Has anyone come across this problem?

Cheers
Rasi
raju_chvr
Premium Member
Posts: 165
Joined: Sat Sep 27, 2003 9:19 am
Location: USA

Re: Strange Problem

Post by raju_chvr »

Do you have any hash files in the job? Are you clearing the hash file before writing into it?
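
If a stale hashed file is suspected, one quick shell-level check (a sketch, assuming a dynamic, type 30 hashed file created as a directory; the path and name are illustrative) is whether its data portion actually changed on the last run:

Code: Select all

  # Dynamic (type 30) hashed files are directories holding DATA.30 and OVER.30;
  # timestamps that predate the last run suggest the file was never cleared or rewritten.
  ls -l /projects/MyProject/TargetHash/DATA.30 /projects/MyProject/TargetHash/OVER.30
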
peterbaun
Premium Member
Posts: 93
Joined: Thu Jul 10, 2003 5:27 am
Location: Denmark

Post by peterbaun »

Hi -

It sounds to me as though you aren't actually reading the file you expect; I would double-check that. If it turns out not to be the correct file, try executing a "head 1 <possible-filenames>" (forgive me if the syntax is not correct) to find out which file you are actually reading from.
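
As a sketch of that check (path and filename purely illustrative), compare the header record and the record count of the file the job should be reading against what the job reports:

Code: Select all

  # Confirm the header record really carries the expected measurement period...
  head -1 /data/landing/source_file.dat
  # ...and that the record count matches what the job claims to have processed.
  wc -l /data/landing/source_file.dat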

Regards
Peter
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Computers are dumb.
They do exactly what they're told.
Whatever it is you've told the computer to do (via the program generated by compiling your DataStage components) has been done.
You now need to investigate the components in very fine detail, to determine exactly what the program has asked the computer to do.
Troubleshooting techniques that may help include:
  • extra output columns (to view the results of function calls)
  • debugger (to view them at human speed)
  • processing only a sample of rows ("stop after N rows"; see the shell sketch after this list)
  • stage tracing (same as debugger, but capturing output to file)
  • extra, possibly conditionally compiled, statements in routines to report what's happening
  • test grid for routines (don't forget to double click on the Results cell to see whether anything was logged)
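
For the sample-of-rows technique, an alternative at the shell is to cut a small test file from the real source, keeping the header record plus the first N data rows. A minimal sketch; the filenames are illustrative:

Code: Select all

  # Keep the header record plus the first 1000 data rows for a fast test run.
  head -n 1001 /data/landing/source_file.dat > /data/test/sample_file.dat
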
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Teej
Participant
Posts: 677
Joined: Fri Aug 08, 2003 9:26 am
Location: USA

Re: Strange Problem

Post by Teej »

rasi wrote: But we faced the problem when we re-ran the same job again without any modification.
Are you saying that you are using the exact same filename for your source each and every time you run this program? Or are you actually using a Job Parameter to define the name and/or path of the file?
rasi wrote: I did the same this month: I re-compiled, ran the job, and it was fine.
Are you able to reproduce the same problem on your test platform? Are you able to reproduce it with a small amount of data?
peterbaun wrote: "head 1 <possible-filenames>"
Actually it's head -[number of rows] [file], and even that form is obsolescent:

Code: Select all

  head [-c bytes] [-n lines] [file...]

  Obsolescent Syntax

  head [-lines] [file...]
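
So the portable way to peek at the header record would be, for example (filename illustrative):

Code: Select all

  head -n 1 /data/landing/source_file.dat
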
-T.J.
Developer of DataStage Parallel Engine (Orchestrate).
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

Hi,
As Ray said, computers are dumb and do what they were "told", nothing more and nothing less.
So you need to check where you planted some code that might cause this to happen.
Most things were mentioned already; I would also check for before/after routines that might be hiding in jobs and/or Transformer stages, if any exist (I find it helpful to visually mark these with Annotations).
Check what the job does with processed files: does it move/rename them somewhere?
If you're running several jobs, check that they don't, by mistake, use the same file names as source/target for some data.

By the way, it would help to clean up the directories you use before you debug the process, making sure you start a fresh, clean run with no previous run's implications except what you need for this job, and to examine the file(s) it uses before running it.
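
As a sketch of that pre-run check (paths and names illustrative), look for same-named or leftover copies that another job, or a move/rename step, might have left behind, and fingerprint the file you are about to read:

Code: Select all

  # Hunt for duplicate or leftover copies of the source file anywhere under the data tree.
  find /data -name 'source_file*' -ls
  # Fingerprint the input so a later mismatch is provable.
  cksum /data/landing/source_file.dat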
IHTH
Roy R.
Time is money but when you don't have money time is all you can afford.

Search before posting:)

Join the DataStagers team effort at:
http://www.worldcommunitygrid.org
rasi
Participant
Posts: 464
Joined: Fri Oct 25, 2002 1:33 am
Location: Australia, Sydney

Post by rasi »

Hi All,

Sorry for getting back to this forum late.

In reply to all the posters.

Raju_Chvr: We do have hash files, but they are created as targets, not used as sources.

peterbaun: I checked the file I was reading; it is the same file I want to read. I even did the head check and viewed it.

ray.wurlod: I checked this by testing a sample of 1000 rows, but it still gives me the old date. Yet when I re-compile the same job and run it, it gives me the correct result. I don't really understand this strange behaviour.

Teej: I am using a job parameter for the path name. The funny thing is that when I view the file I can see the exact file I want, with the right values. But when I run the job it gives me the wrong date.

Thanks guys for replying, but I still haven't found out why it happens like this.

Rasi
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

Just a thought: for the parameter that paths the file, are you using the default value saved in the job design? There are three ways to NOT have the right value if you assume the default (see the sketch after this list):

1. Running the job from Director. Director has the ability to store an overriding default without changing the job design information or the compiled information.
2. Running the job from Designer, where you have to supply the default. From Designer you get the saved default, not the compiled default. The debugger always gives you the saved default, but you can override it at runtime, and it "remembers" any changes as long as you have the job open in Designer.
3. Running the job from dsjob, where you have to supply the default if you call out the parameter. From dsjob you get the compiled default, no matter what is saved in the job design or what is in Director.
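
Whatever the defaulting rules turn out to be, the unambiguous route is to pass the value explicitly on the command line. A minimal sketch; the project, job, and parameter names are illustrative:

Code: Select all

  # Supply the path explicitly so no saved, compiled, or Director default is trusted.
  dsjob -run -mode NORMAL -param SourcePath=/data/landing/source_file.dat -wait MyProject LoadMeasurements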
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Pretty sure you are wrong on #3, Ken. From 'dsjob' it would take whatever overriding default (if any) you've established via the Director (assuming you don't supply a value) and so should behave similarly to #1.

Not in a position to double-check right now, however - a big snow blew through here yesterday and, between it and some other things, I won't make it into work until around lunchtime... unless someone else can verify before then?
-craig

"You can never have too many knives" -- Logan Nine Fingers
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL

Post by kcbland »

chulett wrote: Pretty sure you are wrong on #3, Ken. From 'dsjob' it would take whatever overriding default (if any) you've established via the Director (assuming you don't supply a value) and so should behave similarly to #1.
Could be. I NEVER use defaults, especially Director-set ones. There's no record of someone setting them, and importing the job anew, compiling, or doing other things has the potential to lose the value. I consider this a development aid, never a production practice. If you're right, please let me know, but I stated my understanding. I'm known for being wrong sometimes. :oops:
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

kcbland wrote:If you're right, please let me know, but I stated my understanding. I'm known for being wrong sometimes.
As hard as this may be to fathom, Ken, you are wrong on this one. :wink: The 'dsjob' command will use any default overrides you've established via the Director.

Not saying it's necessarily a good practice 'cuz (as you mentioned) importing or compiling the job will reset these back to their defaults and you could get caught with your parameters down around your ankles if you are not careful.
-craig

"You can never have too many knives" -- Logan Nine Fingers
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

Aaarrrghhh :oops:
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle