Search found 51 matches

by ds_teg
Thu Sep 23, 2010 1:54 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Copying teradata table into another table
Replies: 10
Views: 6519

The DBAs are skeptical about using that ALTER statement, as it has corrupted data in the V2R5 version of Teradata. This is the reason why I am exploring different options.
by ds_teg
Thu Sep 23, 2010 1:04 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Copying teradata table into another table
Replies: 10
Views: 6519

I have fired the below query to check the space occupied by the table:

select databasename, tablename, sum(currentperm), sum(peakperm)
from dbc.tablesize
where databasename = 'dbnae'
and tablename = 'table1'
group by databasename, tablename;

Total size: 312231815168.00 :shock:
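For readability, a minimal variant of the same query can report the sizes in gigabytes (the database and table names below are the placeholders from the query above); the total shown works out to roughly 290 GB:

select databasename, tablename,
       sum(currentperm) / (1024*1024*1024) as currentperm_gb,
       sum(peakperm) / (1024*1024*1024) as peakperm_gb
from dbc.tablesize
where databasename = 'dbnae'
and tablename = 'table1'
group by databasename, tablename;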
by ds_teg
Thu Sep 23, 2010 12:52 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Copying teradata table into another table
Replies: 10
Views: 6519

That explains it... Here I have nearly 4 billion rows with 300 columns, so it's way too big. I am wondering if there is any other approach to load the data.
by ds_teg
Thu Sep 23, 2010 12:33 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Copying teradata table into another table
Replies: 10
Views: 6519

Anbu,

Which method have you used? How many columns are there?
by ds_teg
Thu Sep 23, 2010 12:15 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Copying teradata table into another table
Replies: 10
Views: 6519

Andy, thanks for your quick response. Any idea what the performance will be like when copying 4 billion records with 350 columns through an INSERT INTO ... SELECT * statement? Please let me know what considerations I need to take into account before doing this. I believe I cannot use an MLOAD or FastLoad to populate...
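For reference, a minimal sketch of the INSERT INTO ... SELECT route under discussion, with hypothetical names (old_table, new_table) and only two of the several hundred columns shown; the point is that the copy runs entirely inside Teradata, so no data crosses the network the way it would with a DataStage MLOAD or FastLoad job:

-- new_table repeats old_table's DDL, with the column widened to varchar(40)
create table new_table (
    cust_id   integer not null,
    cust_name varchar(40)          -- widened from varchar(20)
) unique primary index (cust_id);

-- one long-running statement; insert-select into an empty table is
-- typically the fast path in Teradata, but spool and journal space
-- still have to be sized for 4 billion rows
insert into new_table
select * from old_table;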
by ds_teg
Thu Sep 23, 2010 11:50 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Copying teradata table into another table
Replies: 10
Views: 6519

Copying teradata table into another table

I am working with DataStage and Teradata. I have a table with a column whose datatype is VARCHAR(20), and nearly 4 billion records are present in the table. There is a requirement to change the datatype from VARCHAR(20) to VARCHAR(40). We can use ALTER TABLE even though there is data in the table, but for some un...
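For context, a hedged sketch of the ALTER route mentioned here, reusing the table1 placeholder from the later post and a hypothetical column name col1; in Teradata, widening a VARCHAR is expressed by re-ADDing the existing column with the new length:

-- changes col1 in place from varchar(20) to varchar(40); generally a
-- dictionary-only change, though per the other posts in this topic the
-- DBAs distrust it on V2R5 after a data-corruption incident
alter table table1 add col1 varchar(40);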
by ds_teg
Thu Sep 23, 2010 11:43 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Loading Huge Data
Replies: 5
Views: 1939

OK, thanks Craig for the response.
by ds_teg
Wed Sep 22, 2010 4:57 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Loading Huge Data
Replies: 5
Views: 1939

Thanks Vincent for your suggestion. I am planning to use a file pattern to read the files in parallel. I believe multiple readers per node won't be available if we are using the file pattern option. Also, I would like to know how restartability works with the two options that I have specified in the post. ...
by ds_teg
Wed Sep 22, 2010 8:09 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Loading Huge Data
Replies: 5
Views: 1939

Any idea on this post? :roll:
by ds_teg
Mon Sep 20, 2010 4:41 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Loading Huge Data
Replies: 5
Views: 1939

Loading Huge Data

I have 10 files of the same format, each holding 25 GB of data, so I need to load this 250 GB of data into a Teradata table. The table is not a multiset table and has one unique primary index. I need to do some quality checks, like whether a date of birth is a valid date or not. These quality checks can ...
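As an illustration of the kind of validity check described (whether the checks end up in DataStage or get pushed down to SQL), here is a minimal sketch with a hypothetical staging table stg_customer and a VARCHAR date_of_birth column; the LIKE pattern is only a shape test for 'YYYY-MM-DD', not a full calendar check:

-- route rows whose date_of_birth does not even look like a date to a
-- reject table; hypothetical table and column names throughout
insert into stg_customer_rejects
select *
from stg_customer
where date_of_birth is null
   or date_of_birth not like '____-__-__';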
by ds_teg
Mon Jun 14, 2010 8:22 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Performance Improvement Needed
Replies: 7
Views: 3004

Matching in the database will be a better approach than extracting to DataStage. The main reasons are that it 1) uses the unique index to prevent a full table scan, 2) avoids unwanted network traffic to pull data into the DS server, and 3) avoids any sorting prior to the join. Did you gather the statistics recently for your wareh...
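A minimal sketch of what matching in the database looks like, with hypothetical tables src_feed and warehouse_master joined on an indexed key; the join runs where the index and statistics live, instead of pulling the warehouse table across the network into DataStage:

-- the unique index on warehouse_master(cust_id) lets the optimizer
-- avoid a full table scan; hypothetical names throughout
select s.*
from src_feed s
join warehouse_master w
  on w.cust_id = s.cust_id;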
by ds_teg
Mon May 31, 2010 10:50 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Lookup Error
Replies: 20
Views: 10640

I am also facing the same problem.

Any resolution for this?
by ds_teg
Fri May 07, 2010 10:48 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Updates Running Very Slow
Replies: 11
Views: 3391

Sainath.Srinivasan wrote: I will suggest loading into a temporary table and then merging with your target. Do you have any duplicates in the data?

Hi Sai,

Could you please explain more about the need for the temp table?

Thanks
by ds_teg
Fri May 07, 2010 10:47 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Updates Running Very Slow
Replies: 11
Views: 3391

Hi Craig, I understand that we should not commit too often, but here I am thinking that during the time I mentioned in the post, DataStage is not doing anything and is just waiting for something to happen. I would like to know what that is. As you mentioned, the transaction size ...
by ds_teg
Fri May 07, 2010 9:22 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Updates Running Very Slow
Replies: 11
Views: 3391

There is an index on the column which I have specified in the WHERE clause of the update query.

Commit row interval = 1000

Commit time interval = 15

Could you please let me know how to find the array size and transaction size so that I can provide those details?