Search found 21 matches

by tsamui
Tue May 08, 2012 4:27 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Decimal field validation
Replies: 2
Views: 2006

Re: Decimal field validation

Have you tried the Num() function in the Transformer?
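For example, a sketch assuming the input link is lnk_in and the column is amount: Num() returns true when the string is a valid number, so it can be used directly as a constraint.

Valid rows constraint:  Num(lnk_in.amount)
Reject rows constraint: Not(Num(lnk_in.amount))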
by tsamui
Fri Apr 27, 2012 5:54 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Please help with the job design
Replies: 6
Views: 3397

Re: Please help with the job design

Use the function below. It will remove '-', '(', ')', double quotes and commas.

Convert('(),-"','', col)

I have tested it with the value "(-123,45)" and it returns 12345.

Let me know if it satisfies your requirement.
by tsamui
Fri Apr 27, 2012 5:24 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Please help with the job design
Replies: 6
Views: 3397

Re: Please help with the job design

Use the Trim function with option 'A':
Trim(col, '-', 'A'). Repeat the same for ')' and '('.
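A sketch of the nested calls, assuming the input column is named col:

Trim(Trim(Trim(col, '-', 'A'), '(', 'A'), ')', 'A')

Option 'A' removes all occurrences of the given character, so the three calls strip '-', '(' and ')' in turn.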
by tsamui
Thu Nov 12, 2009 12:29 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Surrogate Key State File Update problem
Replies: 1
Views: 2173

Surrogate Key State File Update problem

Sometimes, the State File may be corrupted or deleted. So I need to create a set of jobs which update the state file with the proper value. To achieve this I have created 3 jobs. 1-> Delete the existing State File -- this is working fine. 2-> Create the State file -- this is also working fine. 3-> U...
by tsamui
Mon Nov 09, 2009 5:58 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Error in surrogate key generator stage
Replies: 11
Views: 5676

Update Surrogate Key State File problem

Hi, I am getting the same error, Unable to write to state file /home/dsadm/SG/error_log_skey.sf: Invalid argument. We have a surrogate key state file update utility job. The job reads the surrogate key values from a dimension Oracle table in ascending order and then updates the state file....
by tsamui
Tue Apr 07, 2009 3:14 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Timestamp Conversion
Replies: 9
Views: 7496

Change the POLICY_EFF_DT data type to Timestamp. I think the OCI stage will then be able to read the date properly.

Actually, if you import the table metadata via the 'Import Table Definition' functionality, the column's data type will be Timestamp.
by tsamui
Sat Dec 15, 2007 5:51 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SQL Loader problem
Replies: 5
Views: 5581

You are right, chullet. The DBA said the same thing. Actually, there were 18 million records in the source and we are also using a complex join. When we decrease the number of records, the job runs fine. The environment is a test environment and I think this environment is not capable enough to han...
by tsamui
Sat Dec 15, 2007 5:50 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SQL Loader problem
Replies: 5
Views: 5581

You are right, chullet. The DBA said the same thing. Actually, there were 18 million records in the source and we are also using a complex join. When we decrease the number of records, the job runs fine. The environment is a test environment and I think this environment is not capable enough to han...
by tsamui
Tue Dec 11, 2007 10:55 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SQL Loader problem
Replies: 5
Views: 5581

SQL Loader problem

Hi All, In my job I am using DB2 as the source and Oracle as the target. The job truncates the target table first and then loads the data. When I run this job I get an sqlldr error. Below are some log entries from the DataStage log. Message: stg_CUSTOMERS: When checking operator: The -index rebuild o...
by tsamui
Wed Aug 01, 2007 8:28 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Failed job showing job status OK
Replies: 1
Views: 9460

Failed job showing job status OK

Hi all, In my parallel job my target is an Oracle table which has indexes and one primary key. I am using the Oracle Enterprise stage and have set the index mode option to Rebuild. When any duplicate record comes from the input, it loads the duplicate data into the target and shows job status OK, finished with wa...
by tsamui
Mon Jul 16, 2007 1:15 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Single big CSV file to many small CSV files
Replies: 6
Views: 3650

Thanks for the reply.

The number of lines is not known, and the maximum could be anything.

Is there any way I can divide one CSV file into four output CSV files?
If this is possible, that would be a good solution for me.
by tsamui
Sun Jul 15, 2007 7:49 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Single big CSV file to many small CSV files
Replies: 6
Views: 3650

Single big CSV file to many small CSV files

Hi All, I have big CSV files and I have to divide them into many small CSV files depending on the size of the input file. In every output file the first row contains the column names. As an example, I have one input file sample.CSV which has 250 rows. ETL will process the file and make 3 outfiles sample1.CSV, sampl...
by tsamui
Sat Jul 14, 2007 2:08 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Automatically handle activities that fail
Replies: 3
Views: 2226

In detail, I have five parallel jobs. Each job copies data from the data warehouse to another database. I created a sequence with five job activities using these parallel jobs, and there is no connection (dependency) between them. If I run this sequence and one job activity aborts, ...
by tsamui
Wed Jul 11, 2007 10:08 am
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Sequence succeeds when job fails
Replies: 2
Views: 1432

Sequence succeeds when job fails

Hi all, I am using a Job Activity in a Sequence. The sequence always succeeds whether or not the actual parallel job succeeds. I am using IBM Tivoli Workload Scheduler for scheduling, and I schedule the sequence (not the job) through Tivoli Workload Scheduler. I always get the success status ...