Job Freeze..

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

swades
Premium Member
Posts: 323
Joined: Mon Dec 04, 2006 11:52 pm

Job Freeze..

Post by swades »

Hi All,
I am running a server job and it freezes at a certain point. Can somebody tell me why this happens? I got this message in the log file:

CopyOfJ_NTWX_FACT_ORDER_DETAILS..LC_MERGE_ROWS: DSD.StageRun Active stage starting, tracemode = 0.

Thanks.
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

What do you mean by freezing? Is the Designer client freezing, or is the job hanging? What additional messages do you get in the log file? Reset the job and see if you get any messages starting with "From previous run ...".
We need more info on the design as well.
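If you want to script the reset, the DataStage BASIC job-control API can do it too. A minimal sketch, assuming a job-control routine; the job name is taken from your log message above, and the rest is illustrative:

    * Sketch: reset the job and report its status via the BASIC API.
    hJob = DSAttachJob("CopyOfJ_NTWX_FACT_ORDER_DETAILS", DSJ.ERRFATAL)
    ErrCode = DSRunJob(hJob, DSJ.RUNRESET)   ;* reset instead of a normal run
    ErrCode = DSWaitForJob(hJob)
    Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
    Print "Status after reset: " : Status
    ErrCode = DSDetachJob(hJob)

Then check the Director log for any "From previous run ..." entries.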
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
swades
Premium Member
Posts: 323
Joined: Mon Dec 04, 2006 11:52 pm

Post by swades »

I reset and ran this job 3 times.
I did not get a single warning or fatal error, but my job still stops at this point.
I don't have any other information.
Thanks.
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Then how do you expect us to help you? There's no info at all on the design, what you're trying to do in the job, the data volume, etc. :?
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
swades
Premium Member
Posts: 323
Joined: Mon Dec 04, 2006 11:52 pm

Post by swades »

In our job, 18 hashed files are merged into a single hashed file, so I think the problem is slow hashed file reads, or that the Link Collector stage is slow. We collect using the "Round Robin" algorithm.
Please help. Thanks.
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

What happens when you write to the same hashed file from these 18 hashed files without the Link Collector?
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
swades
Premium Member
Posts: 323
Joined: Mon Dec 04, 2006 11:52 pm

Post by swades »

Hi,
I did not try that. Actually, this job was designed by another person and I can't change it, so here is the design:

18 hashed files ----> Link Collector ----> a single hashed file.

Every time, my job freezes at this point: it shows a blue link and there is not a single warning or fatal error.

Please reply. Thanks.
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Don't change it. Just build another job that does the same thing, but without the Link Collector. This is just for debugging purposes.
I have had issues before with Link Collectors and hashed files connected together, and therefore don't like them next to each other.
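In diagram form (a sketch only, using the same notation as the diagram in the post above), the debug copy would simply be:

    18 hashed files ----> a single hashed file (one Hashed File stage, 18 input links)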
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
swades
Premium Member
Posts: 323
Joined: Mon Dec 04, 2006 11:52 pm

Post by swades »

DSguru2B wrote: What happens when you write to the same hashed file from these 18 hashed files without the Link Collector?
How could I do that?
swades
Premium Member
Posts: 323
Joined: Mon Dec 04, 2006 11:52 pm

Post by swades »

swades wrote:
DSguru2B wrote: What happens when you write to the same hashed file from these 18 hashed files without the Link Collector?
How could I do that?
I mean, without the Collector stage, how can I merge the 18 files?
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Multiple writes to a hashed file are possible. Let your source hashed files go to either
- a single Hashed File stage with the same hashed file name for each input, or
- 18 Hashed File stages with the same name.
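Under the covers a hashed file is keyed, so each input link simply upserts records by key; no collector is needed to serialize them. A minimal DataStage BASIC sketch of what the input links are effectively doing (the pathname and keys are hypothetical):

    * Sketch: independent writers upserting into one hashed file by key.
    OpenPath "/project/hashed/MERGED_HF" To FileVar Else Stop "Cannot open hashed file"

    * What input link 1 effectively does:
    Write "order data 1" To FileVar, "KEY1"

    * What input link 2 effectively does (same file, its own keys):
    Write "order data 2" To FileVar, "KEY2"

    Close FileVar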
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
swades
Premium Member
Posts: 323
Joined: Mon Dec 04, 2006 11:52 pm

Post by swades »

DSguru,
But does the single hashed file read "Round Robin" (i.e., sequentially), or does it just append all the hashed files? If it appends them all, it should take more time, because it waits for every file to be read.
Please answer anyway; my problem is actually solved now.
Thanks.
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

No, my friend, it will do simultaneous reads from all of your source hashed files and load them into a single hashed file.
How did you fix your problem? What was the issue?
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
kumar_s
Charter Member
Posts: 5245
Joined: Thu Jun 16, 2005 11:00 pm

Post by kumar_s »

I hope your design is many-to-one, i.e., many hashed files to a single hashed file. I also assume the output file is different from the source files.
In that case, if any one source has a large amount of data to be transferred, the job takes that long.
And by the way, what was your solution?
You can also mark the topic as resolved.
Impossible doesn't mean 'it is not possible' actually means... 'NOBODY HAS DONE IT SO FAR'
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

A hashed file is a way to implement a database table. You can have multiple streams inserting rows into it.

There is no need at all for a Link Collector stage.
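To illustrate (a minimal DataStage BASIC sketch; the pathname is hypothetical): writing the same key twice behaves like an update to a table row, not an append.

    OpenPath "/project/hashed/TARGET_HF" To HF Else Stop "Cannot open hashed file"
    Write "first version" To HF, "K1"
    Write "second version" To HF, "K1"    ;* same key: destructive overwrite, one record remains
    Read Rec From HF, "K1" Then Print Rec Else Print "Not found"   ;* prints "second version"
    Close HF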
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.