Search found 46 matches

by nishadkapadia
Wed Dec 16, 2009 6:57 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Lookup being used on table which gets update in the same job
Replies: 9
Views: 5071

As suggested earlier by the experts, it is best advised to split; however, the following could achieve the purpose. You could first identify the Inserts/Updates by doing a look-up, hence for your example: EMPNO ENAME 7368 DEV (Insert), 7369 SMITH (Insert), 7369 SMITH (Insert). Subsequently, within a transformer yo...
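Purely for illustration (the staging and target table names below are made up, not from the original job), the same insert/update identification can be sketched in SQL: a left outer join against the target flags unmatched keys as Inserts and matched keys as Updates.

    -- hypothetical tables: stg_emp (incoming rows), tgt_emp (already loaded)
    SELECT s.EMPNO,
           s.ENAME,
           CASE WHEN t.EMPNO IS NULL THEN 'Insert' ELSE 'Update' END AS row_action
    FROM   stg_emp s
    LEFT OUTER JOIN tgt_emp t
           ON t.EMPNO = s.EMPNO;

In the DataStage job the lookup stage plays the role of this join (lookup failure = Insert, lookup success = Update), and the transformer then splits the stream on that flag.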
by nishadkapadia
Wed Feb 04, 2009 9:13 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Looping Construct in datastage
Replies: 3
Views: 2391

Re: Looping Construct in datastage

There could be simpler & smarter solutions suggested further on. One of the naive possibilities involves 2-3 jobs. Counter initialization based on the keys could be one DS job, or could be merged. Subsequently, have a single DS job which does this validation, sorted on ascending counter for the keys....
by nishadkapadia
Tue Jan 27, 2009 1:04 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ORCH_WORK_<hexvalue>
Replies: 2
Views: 2204

Thanks, keshav, for the confirmation.
by nishadkapadia
Sun Jan 25, 2009 7:27 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ORCH_WORK_<hexvalue>
Replies: 2
Views: 2204

ORCH_WORK_<hexvalue>

At times there are ORCH_WORK_<hexvalue> tables left behind in the database. Is this <hexvalue> stored somewhere which could be used to identify the tables? Information currently available: possibly through the Director; ORCH_WORK_<hexvalue> tables get auto-dropped on re-execution of the DS jobs. Alterna...
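As a rough sketch of how such leftovers could at least be listed (assuming SELECT access to the DBC views; the database name is a placeholder), the Teradata dictionary can be queried for work tables matching the ORCH_WORK_ pattern:

    -- 'MY_LOAD_DB' is a placeholder for the database the DS jobs load into
    SELECT TableName, CreateTimeStamp
    FROM   DBC.Tables
    WHERE  DataBaseName = 'MY_LOAD_DB'
    AND    TableName LIKE 'ORCH_WORK_%'
    ORDER  BY CreateTimeStamp;

This only identifies the tables; whether the <hexvalue> itself is recorded anywhere by DataStage is the open question above.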
by nishadkapadia
Sun Oct 19, 2008 12:09 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Does Teradata API Stage Support Datatype - DATE
Replies: 4
Views: 3079

Re: Does Teradata API Stage Support Datatype - DATE

We have faced a similar problem: whilst reading from the TD API stage, the DATE datatype works erratically, hence Char(10) is the safest option.
by nishadkapadia
Sat Sep 27, 2008 1:05 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata API Stage Warning
Replies: 4
Views: 3989

Re: Teradata API Stage Warning

This has been posted in the forum quite a few times - the Teradata API stage supports ANSI SQL. This error is also seen to occur if there are any characters such as char(10), char(13), etc. in the data.
by nishadkapadia
Wed Sep 10, 2008 11:53 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to execute a Teradata Macro in a DS job?
Replies: 5
Views: 3941

What specific problem are you facing? If you are using the Teradata API stage, the normal method is mentioning USING <<column_name>>, followed by EXEC <<macro_name>> ( :column_name ).

Do please post any specific message / error you are facing.
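For reference, a minimal sketch of what that call looks like as user-defined SQL (the database, macro, and parameter names are placeholders, not from this thread):

    USING (in_empno INTEGER)                  /* declare the host parameter */
    EXEC my_db.load_emp_macro (:in_empno);    /* invoke the macro with it   */

The incoming column value is bound to the :parameter marker when the statement is executed.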
by nishadkapadia
Sat Aug 16, 2008 12:51 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata multiset tables and duplicates
Replies: 6
Views: 7519

Currently, no options are provided to change 'any' settings on the orch_xxx tables within DS. The alternative is to create a seq file and then invoke an external mload script to insert into the TD table, which will be faster, even after considering the I/O of creating a seq file for huge volumes of data. As ...
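As a skeleton only (the log table, logon string, file path, layout and table names below are all placeholders), such an external MultiLoad script would look roughly like this:

    .LOGTABLE mydb.emp_mload_log;
    .LOGON    tdpid/loaduser,password;
    .BEGIN MLOAD TABLES mydb.emp_multiset;

    .LAYOUT emp_layout;
    .FIELD  EMPNO * VARCHAR(10);
    .FIELD  ENAME * VARCHAR(30);

    .DML LABEL ins_emp;
    INSERT INTO mydb.emp_multiset (EMPNO, ENAME)
    VALUES (:EMPNO, :ENAME);

    .IMPORT INFILE /data/emp_seq.txt
            FORMAT VARTEXT '|'
            LAYOUT emp_layout
            APPLY ins_emp;

    .END MLOAD;
    .LOGOFF;

Unlike FastLoad (which the orch_xxx route relies on and which silently drops duplicate rows), MultiLoad will happily insert duplicate rows into a multiset table.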
by nishadkapadia
Sat Aug 16, 2008 12:08 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata PX
Replies: 14
Views: 5256

The following could be some related pointers:
check on the index and its skewness whilst loading into the table [can get in touch with the DBA if required]
check whether any full-row duplicate records exist
check whether any index-related duplicate records exist
by nishadkapadia
Sun Mar 02, 2008 10:01 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata Enterprise Stage
Replies: 7
Views: 5296

A few options which could be tried {if not already}: you could try breaking the job and having the target as a sequential file, with the combinable operators being disabled. Alternatively, you could check on the o/s whether there is another job already running when the various options were tried. Or as suggested c...
by nishadkapadia
Sun Dec 09, 2007 12:49 am
Forum: General
Topic: Session Error
Replies: 2
Views: 2527

Re: Session Error

The following entry is mentioned in your log
---Teradata_Enterprise_0,0: Cannot write record.

It looks like checking the metadata in Teradata and in the source dataset
would be a good place to start.
by nishadkapadia
Fri Oct 19, 2007 4:34 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Where is the Reject Data of Teradata Enterprise Stage?
Replies: 12
Views: 8890

Duplicate rows will never be captured in any database table,
since the Teradata Enterprise stage uses the native Teradata FastLoad utility,
which discards duplicate rows by design.
At most, field conversion errors and duplicate index errors will be captured in the error tables, and those counts are zero as per your log.
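If you do want to inspect rejects on a run where the counts are non-zero, the two error tables can be queried directly (the table names below are placeholders for whatever the load created):

    -- error table 1: field conversion / constraint errors
    SELECT ErrorCode, ErrorFieldName, COUNT(*)
    FROM   mydb.target_err1
    GROUP  BY 1, 2;

    -- error table 2: unique primary index (duplicate index) violations
    SELECT COUNT(*)
    FROM   mydb.target_err2;

Duplicate rows themselves, as noted above, are simply discarded by FastLoad and appear in neither table.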
by nishadkapadia
Tue Jun 12, 2007 9:21 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Teradata Write permission error
Replies: 4
Views: 2292

I believe you are not using named pipes as an option on the utilities.
It is preferable that you also check the 'Delete data file after load' option
on the utilities.

That way both developers would be able to run their jobs one after another.