Search found 48 matches

by mansoor_nb
Wed Jun 11, 2014 3:48 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Limit to multiple instances of sequences and DS parallel job
Replies: 5
Views: 3096

Hi All, thanks for your invaluable responses. Based on the inputs provided and the information in the links shared, I have checked the uvconfig file on our PROD box. The values defined for T30FILE & RLTABSZ are 4096 & 480 respectively. These values will allow 900 jobs to be executed ...
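For reference, a quick way to confirm these tunables is to grep the uvconfig file directly. The sketch below runs against a sample fragment containing the values quoted above; on a real system the file lives under $DSHOME (an assumed default path; adjust for your install):

```shell
# Sample uvconfig fragment with the tunables from the post, so the
# commands below are runnable without a live DSEngine install.
cat > /tmp/uvconfig.sample <<'EOF'
T30FILE 4096
RLTABSZ 480
EOF

# On a real server you would point this at "$DSHOME/uvconfig" instead.
# Extract the two tunables that govern concurrent-job capacity:
grep -E '^(T30FILE|RLTABSZ)' /tmp/uvconfig.sample
```

Note that after editing uvconfig on a real system, the engine must be regenerated and restarted for new values to take effect.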
by mansoor_nb
Mon Jun 09, 2014 11:11 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Limit to multiple instances of sequences and DS parallel job
Replies: 5
Views: 3096

Limit to multiple instances of sequences and DS parallel job

Hi, DS V9.1, AIX V7.0. We have designed common, generic, RCP-driven DS parallel jobs to read multiple types of source files, such as EBCDIC & ASCII formats with multiple file layouts, using a set of Filter, SF, and Generic stages to load the data into a table in a Teradata database. I wanted to know: is there ...
by mansoor_nb
Fri Mar 01, 2013 8:25 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Compilation Issue
Replies: 7
Views: 5067

The stage variable logic was implemented and the job was split into 3 jobs to get them to compile. After this change, the jobs compiled successfully.
Thank you very much to all for helping.
by mansoor_nb
Wed Feb 27, 2013 11:42 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Compilation Issue
Replies: 7
Views: 5067

Thanks for shedding light on this issue. I will try out the options you have all provided and let you know the results.
by mansoor_nb
Tue Feb 26, 2013 10:46 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Compilation Issue
Replies: 7
Views: 5067

Hi Ray,

The system is configured properly. This is the first time we are getting this issue, as we have jobs where data validations are done for more than 100 fields.

Thanks
by mansoor_nb
Tue Feb 26, 2013 6:16 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Compilation Issue
Replies: 7
Views: 5067

Compilation Issue

Hi, I am getting a compilation error. The job reads data from a sequential flat file and performs data validation such as data type checks (Date, Decimal, Integer) and Null/Empty checks on the incoming fields in the source file. There are approximately 150+ fields on which data validation is performed. On...
by mansoor_nb
Wed Apr 18, 2012 3:55 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ANSI to UTF-8 Conversion
Replies: 0
Views: 2255

ANSI to UTF-8 Conversion

Hi All, I want to convert a sequential file from ANSI to UTF-8 format. I have tried setting the NLS map to UTF-8 at the project level, and the NLS map at the stage level is also set to UTF-8 just to make sure. The record delimiter is set to UNIX newline. Still, the file is not getting created in the ...
by mansoor_nb
Fri Feb 25, 2011 2:38 am
Forum: General
Topic: Problem with Email Notification Stage
Replies: 5
Views: 2975

Hi Divya, you will get the "Do Not Check Point Run" checkbox only if you check the "Add Checkpoints so sequence is restartable on failure" checkbox in the sequence properties. By doing this you can control whether the notification activity has to be triggered every time or no...
by mansoor_nb
Fri Feb 25, 2011 1:59 am
Forum: General
Topic: Problem with Email Notification Stage
Replies: 5
Views: 2975

Hi, in the sequence properties you need to check the checkbox which says "Add Checkpoints so sequence is restartable on failure". Then in the notification activity, you need to check the checkbox "Do Not Check Point Run". I think by doing this your issue will get resolved. Th...
by mansoor_nb
Fri Feb 25, 2011 1:48 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Passing Dataset value to Oracle Database stage
Replies: 4
Views: 2509

Hi Murali,

Perform a lookup or use a join stage between the Dataset and the database table.
by mansoor_nb
Thu Feb 17, 2011 4:56 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Load same teradata table 2 times in a single job
Replies: 21
Views: 10737

Hi Vidyut, I knew that the first option would result in a deadlock; that's why I had asked you to handle it :wink: Another option is: load all the unique records using the FASTLOAD method and write all the duplicate records to a file. Then write a BTEQ script to load the duplicate records into the sam...
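A minimal sketch of that BTEQ approach. Every name here (server, credentials, table, columns, file paths) is a hypothetical placeholder; the real script must match your table's DDL, and the snippet ends before the post's truncation, so it only illustrates the shape:

```shell
# Generate a BTEQ script that inserts the handful of duplicate rows
# after the FastLoad of the unique rows has finished. VARTEXT import
# requires all USING fields to be VARCHAR. All identifiers are
# hypothetical placeholders.
cat > /tmp/load_duplicates.bteq <<'EOF'
.LOGON tdserver/dsuser,password;
.IMPORT VARTEXT ',' FILE = /tmp/duplicate_records.csv;
.REPEAT *
USING (col1 VARCHAR(50), col2 VARCHAR(50))
INSERT INTO target_db.target_table (col1, col2)
VALUES (:col1, :col2);
.LOGOFF;
.QUIT;
EOF

# Run it with the Teradata BTEQ client (commented out; needs a live system):
# bteq < /tmp/load_duplicates.bteq
```

Because BTEQ uses plain row-at-a-time SQL inserts rather than FastLoad's block loading, it does not take the table-level locks that caused the deadlock in the first option.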
by mansoor_nb
Thu Feb 17, 2011 3:25 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Load last 2 Entries
Replies: 14
Views: 6137

It is not necessary to execute the transformer in sequential mode. You just have to maintain the same partitioning as the Sort stage, and it will work.
If the volume of data is high, executing the transformer in sequential mode can be a costly affair.

Thanks
by mansoor_nb
Thu Feb 17, 2011 2:33 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Load same teradata table 2 times in a single job
Replies: 21
Views: 10737

As you said, the duplicate records are very minimal (in the hundreds), so identify these duplicate records and separate them from the main stream. Using link ordering, execute the link which has only the duplicate records and then the other one. Something like below. Source ------> Identify Unique/Duplicates ...
by mansoor_nb
Tue Feb 15, 2011 10:26 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to get the last record in a sur key state file
Replies: 2
Views: 2202

You can also use the Tail stage in sequential mode by setting Number of Rows (Per Partition) = 1. This retains only the last row for downstream processing.

SKey ---> Tail --> ...

Thanks
by mansoor_nb
Fri Feb 11, 2011 3:20 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: CSV file issue
Replies: 4
Views: 3422

If you are sure that the values in the file will be separated by a comma inside the Company Name, then read the entire record as a single column and then split it based on the field length. Remember that while splitting, the starting position of the second column will be n+1, as there is a comma separating the first...
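A minimal sketch of that split, assuming the first column has a known fixed length n (n=10 here is a made-up value). awk's substr() is 1-based, so after an n-character first field the comma occupies position n+1 and the remainder of the record starts one character after it:

```shell
# Hypothetical record: a 10-character company name, then two more fields.
record='ACME CORP ,12345,NY'
n=10

# First column: characters 1..n of the raw record.
first=$(echo "$record" | awk -v n="$n" '{ print substr($0, 1, n) }')
# Rest of the record: skip the n characters plus the comma after them.
rest=$(echo "$record" | awk -v n="$n" '{ print substr($0, n + 2) }')

echo "first=[$first] rest=[$rest]"
# prints: first=[ACME CORP ] rest=[12345,NY]
```

In a DataStage Transformer the equivalent would be substring derivations on the single input column, but the positional arithmetic is the same.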