Search found 19 matches

by mailravi.dw@gmail.com
Wed Mar 02, 2011 11:59 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Load 0.5 million records into DB2 stage
Replies: 3
Views: 2659

Thanks for the clarification.

If I enable the execution mode as Parallel for the DB2 API stage, is there any performance improvement? I can see the default execution mode is Sequential.
by mailravi.dw@gmail.com
Wed Mar 02, 2011 11:06 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Load 0.5 million records into DB2 stage
Replies: 3
Views: 2659

Load 0.5 million records into DB2 stage

Hi Experts,

I have 0.5 million records. I am planning the job design as follows.

Dataset --> Transformer --> DB2 Stage

My target table is not partitioned. Please suggest whether the DB2 API stage or the DB2 Enterprise stage would give better performance.
by mailravi.dw@gmail.com
Thu Feb 17, 2011 2:42 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Load last 2 Entries
Replies: 14
Views: 6146

It solved my problem.
by mailravi.dw@gmail.com
Wed Feb 16, 2011 10:59 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Load last 2 Entries
Replies: 14
Views: 6146

Yes, my main worry is how to pick either the first or last 2 records from each "CNO" group. Could anyone shed some light on it?
by mailravi.dw@gmail.com
Wed Feb 16, 2011 10:05 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Load last 2 Entries
Replies: 14
Views: 6146

Thanks for your input. As per your suggestion, I will sort on CNO and ENTRY_TIMESTAMP in reverse (descending) order. How do I pick the first "X" records for each CNO?
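
For illustration only, here is a minimal Python sketch of that logic (sort with ENTRY_TIMESTAMP descending within CNO, then keep the first N rows per CNO), using the sample data from this thread. It is not DataStage code, just the equivalent standalone calculation.

from itertools import groupby, islice
from operator import itemgetter

# Sample rows from the thread: (CNO, CNAME, RISKRATINGS, ENTRY_TIMESTAMP)
rows = [
    ("100", "ABC", "2", "2011-02-03 10:15"),
    ("100", "ABC", "3", "2011-02-02 10:10"),
    ("100", "ABC", "1", "2011-02-04 12:15"),
    ("101", "XYZ", "2", "2011-02-03 10:12"),
    ("101", "XYZ", "3", "2011-02-04 16:13"),
]

def first_n_per_cno(records, n=2):
    # Descending sort on ENTRY_TIMESTAMP, then a stable ascending sort on CNO,
    # gives "sorted by CNO, latest entries first" within each group.
    by_time_desc = sorted(records, key=itemgetter(3), reverse=True)
    ordered = sorted(by_time_desc, key=itemgetter(0))
    # Keep only the first n rows of each CNO group (the n most recent entries).
    return [row for _, grp in groupby(ordered, key=itemgetter(0))
            for row in islice(grp, n)]

for row in first_n_per_cno(rows):
    print(row)
# ('100', 'ABC', '1', '2011-02-04 12:15')
# ('100', 'ABC', '2', '2011-02-03 10:15')
# ('101', 'XYZ', '3', '2011-02-04 16:13')
# ('101', 'XYZ', '2', '2011-02-03 10:12')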
by mailravi.dw@gmail.com
Wed Feb 16, 2011 9:54 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Load last 2 Entries
Replies: 14
Views: 6146

Load last 2 Entries

Hi, my source and target are CSV files. They contain the data as follows:
CNO CNAME RISKRATINGS ENTRY_TIMESTAMP
100 , ABC , 2 , 2011-02-03 10:15
100 , ABC , 3 , 2011-02-02 10:10
100 , ABC , 1 , 2011-02-04 12:15
101 , XYZ , 2 , 2011-02-03 10:12
101 , XYZ , 3 , 2011-02-04 16:13
For each CNO I want the last 2...
by mailravi.dw@gmail.com
Sat Nov 20, 2010 12:49 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Need Days from two Dates
Replies: 9
Views: 4555

I am assuming the Date1 and Date2 attributes are varchar, and have provided the derivation below.

Derivation: DaysSinceFromDate(StringToDate(Date1,"%mm/%dd/%yyyy"), StringToDate(Date2,"%mm/%dd/%yyyy"))

This function returns an integer (int32) value.
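
To sanity-check the result, the same calculation outside DataStage looks like the Python sketch below (the %mm/%dd/%yyyy mask corresponds to %m/%d/%Y here; the sign convention of DaysSinceFromDate may differ depending on argument order, so treat the sign as an assumption).

from datetime import datetime

def days_between(date1: str, date2: str) -> int:
    # Parse the varchar dates and return the whole-day difference as an int.
    d1 = datetime.strptime(date1, "%m/%d/%Y")
    d2 = datetime.strptime(date2, "%m/%d/%Y")
    return (d2 - d1).days

print(days_between("11/01/2010", "11/20/2010"))  # 19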
by mailravi.dw@gmail.com
Fri Nov 19, 2010 8:37 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to Convert Uint64 to Int64
Replies: 7
Views: 12878

Hi ArndW,

Thanks for your inputs and time.

Thanks
Ravi K
by mailravi.dw@gmail.com
Fri Nov 19, 2010 7:42 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to Convert Uint64 to Int64
Replies: 7
Views: 12878

Hi Andrew,

The conversion is taking place implicitly.

How can we explicitly convert an unsigned int32 to a signed int64? Is there any function available? If so, kindly confirm the function.

Thanks
Ravi K
by mailravi.dw@gmail.com
Fri Nov 19, 2010 3:43 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to Convert Uint64 to Int64
Replies: 7
Views: 12878

I managed to resolve the warning. On the target side I also defined it as uint64 (data type BIGINT with Unsigned enabled in the Extended option of the Columns tab) rather than converting it to int64 (BIGINT). Is there really any conversion function available to convert uint64 to int64...
by mailravi.dw@gmail.com
Fri Nov 19, 2010 3:06 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to Convert Uint64 to Int64
Replies: 7
Views: 12878

You are absolutely right. It is automatically converting uint64 to int64 and the result is also fine, but I am getting a warning as below. When checking operator: When binding input interface field "ROWNUMBER_COL" to field "ROWNUMBER_COL": Implicit conversion from source type "ui...
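
For what it is worth, the warning exists because uint64 and int64 do not cover the same range: values above 2^63 - 1 cannot be represented as a signed 64-bit integer. A small Python sketch of the reinterpretation, purely illustrative and not DataStage behaviour:

I64_MAX = 2**63 - 1
U64_MAX = 2**64 - 1

def reinterpret_as_int64(value: int) -> int:
    # Two's-complement reinterpretation of an unsigned 64-bit value.
    return value - 2**64 if value > I64_MAX else value

print(reinterpret_as_int64(45000))    # 45000 -- ordinary row numbers are safe
print(reinterpret_as_int64(U64_MAX))  # -1    -- only huge values would wrap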
by mailravi.dw@gmail.com
Thu Nov 18, 2010 7:12 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to Convert Uint64 to Int64
Replies: 7
Views: 12878

How to Convert Uint64 to Int64

Actually, I am using the "Row Number Column" property of the Sequential File stage to generate a sequence number. The "Row Number Column" produces an unsigned value such as uint32 or uint64. I need to forcefully convert uint64 to int64 because I could not locate the Extended option at the Sequential File c...
by mailravi.dw@gmail.com
Tue Sep 14, 2010 12:27 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Dynamic Parameters
Replies: 10
Views: 4037

Use a variable activity and apply functions like Iconv and Oconv; that should get you through it easily. If you need full information on Iconv and Oconv, refer to the documentation, where they are clearly explained.

Thanks
Ravi
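
As a rough illustration of what such a conversion typically achieves for a dynamic parameter (this is Python, not the Iconv/Oconv BASIC functions themselves, and the date formats below are only assumed examples): turn the current date into whatever string format the downstream job parameter expects.

from datetime import date

def date_parameter(fmt: str = "%Y-%m-%d") -> str:
    # Format today's date for use as a dynamic job parameter value.
    return date.today().strftime(fmt)

print(date_parameter())            # e.g. 2010-09-14
print(date_parameter("%d/%m/%Y"))  # e.g. 14/09/2010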
by mailravi.dw@gmail.com
Tue Sep 14, 2010 12:22 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: how to implement in datastage?
Replies: 5
Views: 3342

Use the Change Capture stage and identify which is the before dataset and which is the after dataset. Based on the change codes provided by the Change Capture stage, direct the data to the respective insert and update links.

Thanks
Ravi
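
For illustration only, the comparison the Change Capture stage performs can be sketched in Python as below, using a hypothetical single-column key; in the real stage, the change code carried on each output row is what drives the split onto insert and update links.

# Hypothetical before/after rows keyed on a single key column.
before = {"K1": ("ABC", 2), "K2": ("XYZ", 3)}
after  = {"K1": ("ABC", 5), "K3": ("PQR", 1)}

# Keys present only in "after" are inserts; keys present in both but with
# different values are updates (edits).
inserts = {k: v for k, v in after.items() if k not in before}
updates = {k: v for k, v in after.items() if k in before and v != before[k]}

print("insert link:", inserts)  # {'K3': ('PQR', 1)}
print("update link:", updates)  # {'K1': ('ABC', 5)}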
by mailravi.dw@gmail.com
Thu Sep 09, 2010 7:20 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reloading Data into DB2 API Stage
Replies: 1
Views: 1317

Reloading Data into DB2 API Stage

Hi all, I am trying to load data from a sequential file into a DB2 database using the DB2 API stage, with the update action "Insert rows without clearing". My file has 50,000 records. When the load was at record 45,000, the job aborted for some reason. Then we are reloading from scratch. (We p...