Search found 82 matches

by bapajju
Fri Jul 16, 2004 12:35 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Locking of Table
Replies: 13
Views: 4956

Re: Locking of Table

Hi Srini,
We are also getting the same lock issue. Kindly let us know if you have found a solution to this by now.

Thanks
by bapajju
Fri Jul 16, 2004 12:32 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Locking of Table
Replies: 13
Views: 4956

Ray,
Kindly let us know if there is any workaround to handle Update and Insert in a single job.


Thanks
by bapajju
Fri Jul 16, 2004 12:15 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Not able to connect from client
Replies: 4
Views: 1381

Re: Not able to connect from client

Try removing the installation fully and re-installing it. In the previous installation you have probably ignored some warning messages. Also, please check the system date of your machine; it might conflict with your license date ranges too. :(
by bapajju
Fri Jul 16, 2004 12:12 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Job Gets into a HANG State
Replies: 4
Views: 1066

Thanks for the instruction, Rasi. But can't we achieve this in a single job? When we do it in two jobs it works perfectly, but in a single job it hangs. Could you please let us know why this happens, and whether this can be achieved in a single job.
by bapajju
Thu Jul 15, 2004 11:04 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Job Restart from the abort point
Replies: 4
Views: 3121

kduke wrote: You could use a constraint to start at @INPUTROW > StartRow, where StartRow is a parameter. I am not sure if this works in PX.
Thanks, Duke. We're trying the same.
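
For what it's worth, the constraint trick simply skips the rows that were already loaded before the abort. A rough Python sketch of that logic (an illustration only, not DataStage syntax; the names here are made up):

    def rows_to_load(rows, start_row):
        # Yield only rows past the restart point, the same idea as a
        # Transformer constraint of @INPUTROW > StartRow.
        for row_number, row in enumerate(rows, start=1):
            if row_number > start_row:
                yield row

    # Reloading after an abort at row 250,000,000 would be, e.g.:
    # for row in rows_to_load(source_rows, start_row=250_000_000): load(row)

Note that this still reads and discards the skipped rows, so it saves load time rather than extraction time.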
by bapajju
Thu Jul 15, 2004 11:01 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Job Gets into a HANG State
Replies: 4
Views: 1066

Job Gets into a HANG State

Hi All, we are developing a job that will take care of SCD Type-2. First let me explain how we are implementing it. 1. Data is extracted from the source using the Teradata Access plug-in. 2. We then look up the target table and, based upon the constraint and the lookup, we update or insert into the target table in t...
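
The update-or-insert decision in step 2 boils down to: if the business key already exists with changed attributes, expire the current version and insert a new one; if the key is new, just insert. A small Python sketch of that Type-2 logic (illustrative only; the dictionary stands in for the target-table lookup and the field names are invented):

    def scd2_apply(history, row):
        # history: {business_key: [versions, ...]}; the last version is current.
        versions = history.setdefault(row["key"], [])
        if versions and not changed(versions[-1], row):
            return                                 # no change, nothing to do
        if versions:
            versions[-1]["current"] = False        # expire old version (the Update)
        versions.append({**row, "current": True})  # add new version (the Insert)

    def changed(old, new):
        # Any non-key attribute differing counts as a change.
        return any(old.get(k) != v for k, v in new.items() if k != "key")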
by bapajju
Sun Jul 11, 2004 6:54 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Purging Files and Custom Log generation
Replies: 1
Views: 1655

Purging Files and Custom Log generation

Hi all,
Can anybody let me know how to generate custom log reports and perform file purge processing through Parallel Extender? We need to purge the files that have been processed without any error; if a file is not processed properly, it should not be moved from the input directory.
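
To pin the requirement down, the purge rule amounts to: move a file out of the input directory only when it processed cleanly, otherwise leave it there for reprocessing. A minimal Python sketch (directory names are placeholders, and how "processed without any error" gets detected is up to the job):

    import shutil
    from pathlib import Path

    def purge_processed(input_dir, archive_dir, succeeded):
        # succeeded: set of file names that loaded without any error.
        # Failed files stay in input_dir so they can be reprocessed.
        for f in Path(input_dir).iterdir():
            if f.is_file() and f.name in succeeded:
                shutil.move(str(f), str(Path(archive_dir) / f.name))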
by bapajju
Sun Jul 11, 2004 6:48 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Summary Report Generation for successful job run
Replies: 1
Views: 1753

Summary Report Generation for successful job run

Hi All, I am loading data from one DB2 table to another DB2 table. I want to generate a report that will give me details of the load, like: 1. Number of rows processed. 2. Start time and end time, and the input number of rows. 3. Aggregated values of the numeric fields processed, e.g. Total Sales in Dollars for all th...
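
A rough Python sketch of what such a summary would collect per run (illustrative only; load() and the field names are placeholders):

    import time

    def load_with_summary(rows, load, numeric_fields):
        # Wrap the load step and collect the counts, timings and totals.
        start = time.time()
        count = 0
        totals = {f: 0.0 for f in numeric_fields}
        for row in rows:
            load(row)
            count += 1
            for f in numeric_fields:
                totals[f] += float(row.get(f) or 0)
        return {"rows_processed": count,
                "start_time": start,
                "end_time": time.time(),
                "totals": totals}   # e.g. totals["sales_dollars"]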
by bapajju
Sun Jul 11, 2004 6:34 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Job Restart from the abort point
Replies: 4
Views: 3121

Job Restart from the abort point

Hi All, I am using Parallel Extender 7. I am extracting data from DB2, and the number of records is 550,000,000 (550M). I am putting this data into DB2. Now my job aborts at the 250,000,000th (250M) record. When doing a reload I want to start it from the 250,000,001st record. Kindly let me know how to ach...
by bapajju
Thu Feb 05, 2004 6:25 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Counting the number of rows in a sequential file
Replies: 4
Views: 1562

Counting the number of rows in a sequential file

I have a sequential file as source and another sequential file as target. I want to capture the number of records of the source file in the target. Please let me know how to achieve this.

Thanks
Bapajju
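
Outside of the DataStage stages themselves, the counting logic is tiny. A minimal Python sketch (the file paths are placeholders; it assumes one record per line):

    def count_rows(source_path, target_path):
        # Count records in the source file and write the count to the target.
        with open(source_path) as src:
            n = sum(1 for _ in src)
        with open(target_path, "w") as tgt:
            tgt.write(f"{n}\n")
        return n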
by bapajju
Mon Jan 19, 2004 3:40 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Lookup Performance
Replies: 3
Views: 1495

Thanks, Bland :). I got a fair idea. A one-row hash file should beat a one-row table every time for a reference lookup. A hash file is a local reference object on the server; a table suffers network overhead plus query preparation plus results fetch plus spool. A hash file can preload to memory, making ev...
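
The preload point is essentially in-memory dictionary access versus a round trip per row. A toy Python illustration (the sleep is a stand-in for network, query preparation and fetch overhead; the numbers are invented):

    import time

    # Preloading a hash file to memory is like building a dict once:
    reference = {key: f"desc_{key}" for key in range(5000)}

    def hash_lookup(key):
        return reference.get(key)   # pure memory access per row

    def table_lookup(key):
        time.sleep(0.001)           # stand-in for network + query + fetch
        return f"desc_{key}"

At 5,000 rows the per-row overhead barely registers; at 4,000,000 rows it dominates, which matches the timings reported in this topic.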
by bapajju
Tue Jan 13, 2004 4:36 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Lookup Performance
Replies: 3
Views: 1495

Lookup Performance

Hi, I tried a lookup using a hashed file stage for 5,000 records. I tried the same lookup through the OraOCI stage, and both of them took effectively the same time. But when I tried the same sort of experiment with 4,000,000 records, I found that the hashed file stage is far, far faster than OraOCI. Just wanted to know if ...
by bapajju
Thu Dec 11, 2003 2:53 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Treatment of NULLS in DS
Replies: 1
Views: 998

Re: Treatment of NULLS in DS

I simulated the same case in a Windows environment and, to my utter surprise :o, I too got the same result, i.e. instead of a NULL value I got 0.00. I am trying some roundabout ways to resolve the issue and will keep you posted in case I find a solution.
by bapajju
Wed Dec 10, 2003 6:15 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Aggregator Stage Termination
Replies: 3
Views: 1220

Thanks a ton, Ray. It worked after sorting. It was a memory problem only. Appreciate your help.
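
For anyone hitting the same thing: sorting fixes the memory problem because an aggregator reading sorted input can emit each group as soon as the key changes, instead of holding every group in memory at once. A small Python illustration of sort-based aggregation (field names are placeholders):

    from itertools import groupby
    from operator import itemgetter

    def aggregate_sorted(rows, key_field, value_field):
        # rows must already be sorted on key_field; each group is emitted
        # and released immediately, so memory use stays flat.
        for key, group in groupby(rows, key=itemgetter(key_field)):
            yield key, sum(r[value_field] for r in group)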