
Job Freeze

Posted: Mon Mar 12, 2007 12:43 pm
by swades
Hi All,
I am running a server job and it freezes at a certain point.
Can somebody tell me why this happens?
This is the message I got in the log file:

CopyOfJ_NTWX_FACT_ORDER_DETAILS..LC_MERGE_ROWS: DSD.StageRun Active stage starting, tracemode = 0.

Thanks.

Posted: Mon Mar 12, 2007 12:45 pm
by DSguru2B
What do you mean by freezing? Is the Designer client freezing, or is the job hanging? What additional messages do you get in the log file? Reset the job and see if you get any messages "From previous run ...".
We need more info on the design as well.

Posted: Mon Mar 12, 2007 12:49 pm
by swades
I have reset and run this job 3 times.
I did not get any warnings or fatal errors,
but the job still stops at this point.
I don't have any other information.
Thanks.

Posted: Mon Mar 12, 2007 12:56 pm
by DSguru2B
Then how do you expect us to help you? There is no information on the design, what you're trying to do in the job, the data volume, etc. :?

Posted: Mon Mar 12, 2007 1:06 pm
by swades
In our job, 18 hashed files are merged into a single hashed file, so I think the problem is either with reading the hashed files or the Link Collector stage being slow.
We collect using the "Round Robin" type.
Please help. Thanks.

Posted: Mon Mar 12, 2007 1:09 pm
by DSguru2B
What happens when you write to the same hashed file from these 18 hashed files, without the Link Collector?

Posted: Mon Mar 12, 2007 1:14 pm
by swades
Hi,
I have not tried that.
Actually, this job was designed by another person and I cannot change it, so here is the design:

18 hashed files ----> Link Collector ----> a single hashed file

My job freezes at this point every time; it shows a blue link, and there is not a single warning or fatal error.

Please reply. Thanks.

Posted: Mon Mar 12, 2007 1:17 pm
by DSguru2B
Don't change it. Just build another job that does the same thing, except without the Link Collector. This is just for debugging purposes.
I have had issues with Link Collectors and hashed files (connected together) before, and therefore don't like them next to each other.

Posted: Mon Mar 12, 2007 1:48 pm
by swades
DSguru2B wrote: What happens when you write to the same hashed file from these 18 hashed files, without the Link Collector?
How could I do this?

Posted: Mon Mar 12, 2007 1:49 pm
by swades
swades wrote: How could I do this?
I mean, how can I merge the 18 files without the collector stage?

Posted: Mon Mar 12, 2007 1:52 pm
by DSguru2B
Multiple writes to a hashed file are possible. Let your source hashed files go to either
- a single Hashed File stage with the same hashed file name on each input link, or
- 18 Hashed File stages that all reference the same hashed file name.
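
So, roughly (just a sketch of the debug job, not your original design):

18 hashed files ----> one Hashed File stage (all 18 input links writing the same hashed file)

or

18 hashed files ----> 18 Hashed File stages (each pointing at the same hashed file name)

Either way there is no Link Collector in the middle.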

Posted: Mon Mar 12, 2007 4:05 pm
by swades
DSguru,
But does the single hashed file read "Round Robin" (sequentially), or does it just append all the hashed files? If it appends them all, then it should take more time, because it has to wait for all the files to be read.
Please answer. Actually, my problem is solved.
Thanks.

Posted: Mon Mar 12, 2007 4:53 pm
by DSguru2B
No my friend, it will do simultaneous reads from all of your source hashed files and load them into a single hashed file.
How did you fix your problem? What was the issue?

Posted: Mon Mar 12, 2007 10:09 pm
by kumar_s
I hope your design is many to one, that is, many hashed files to a single hashed file. I also assume the output file is different from the source files.
In that case, if any one source has a large amount of data to be transferred, the job will take that long.
By the way, what was your solution?
You can also mark the topic as resolved.

Posted: Tue Mar 13, 2007 6:25 am
by ray.wurlod
A hashed file is a way to implement a database table. You can have multiple streams inserting rows into it.

There is no need at all for a Link Collector stage.
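
In other words, the job can be reduced to something like this (a sketch of what that implies, in the same notation as earlier):

18 hashed files ----> a single hashed file (all 18 links inserting into it directly)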