Search found 284 matches

by abhilashnair
Mon Apr 04, 2011 5:24 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: odbc problem with DB2 cannot upsert due to Index error
Replies: 4
Views: 2024

Assuming this to be true, do you mean both duplicates tried to insert at the same time? How did hash partitioning resolve this, then?
by abhilashnair
Sun Apr 03, 2011 11:23 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: odbc problem with DB2 cannot upsert due to Index error
Replies: 4
Views: 2024

When I changed the partitioning to Hash, with the hash keys the same as the update keys used in the query, the error went away.

This is strange. The partitioning was Auto previously and the job used to work fine. Can this be explained?
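One plausible explanation (a sketch of the general idea, not DataStage's actual internals): with Auto partitioning, rows carrying the same update key can land on different nodes, so two nodes may try to upsert the same key concurrently; hash partitioning on the update keys guarantees all rows for a given key go to one node. A minimal Python illustration:

```python
# Minimal illustration of key-based hash partitioning (assumption: this
# mirrors what DataStage does when you hash on the update keys).
# Rows sharing a key always map to the same partition, so work on the
# same key is serialized within one node instead of colliding across nodes.

def hash_partition(rows, key, nodes):
    """Assign each row to a partition based on a hash of its key column."""
    parts = [[] for _ in range(nodes)]
    for row in rows:
        parts[hash(row[key]) % nodes].append(row)
    return parts

rows = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}, {"id": 1, "v": "c"}]
parts = hash_partition(rows, "id", 4)
# Both rows with id=1 end up in the same partition.
same = [p for p in parts if any(r["id"] == 1 for r in p)]
assert len(same) == 1 and len(same[0]) == 2
```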
by abhilashnair
Sun Apr 03, 2011 2:39 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: odbc problem with DB2 cannot upsert due to Index error
Replies: 4
Views: 2024

odbc problem with DB2 cannot upsert due to Index error

I am trying to load a DB2 table using the Upsert method. Updates happen first and then inserts. I am getting the error INDEX1 RESTRICTS COLUMNS WITH SAME VALUES. TABLE NAME But if it is Update then Insert, why is this issue happening? Even if there are duplicates in my source, the problem should h...
by abhilashnair
Fri Apr 01, 2011 10:38 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SIGSEGV and Partitioning
Replies: 11
Views: 5249

Looks like I need to go with trial and error here. Not sure; is there any other way to calculate exactly what I should specify for the Insert Array Size?
by abhilashnair
Fri Apr 01, 2011 4:39 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SIGSEGV and Partitioning
Replies: 11
Views: 5249

Dropped it to 1 and it worked... too slow, though. Is this the only way?
by abhilashnair
Tue Mar 29, 2011 3:44 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SIGSEGV and Partitioning
Replies: 11
Views: 5249

ray.wurlod wrote:How big (in bytes) are your rows?

Is the product of array size and row size too large to fit in memory? ...
The Array Size specified in the target is 2000. As for the row size, since this is a delete operation, the metadata of the target only has the key column. The mode is Upsert/Delete Only.
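For a back-of-envelope check of Ray's question (using a hypothetical row width, since the post doesn't state one): the stage buffers roughly array size × row size bytes per batch, which for a single key column is tiny.

```python
# Rough memory estimate for the batched operation (assumed row width;
# the actual width of the key column isn't given in the post).
array_size = 2000        # Array Size from the target stage
row_size_bytes = 20      # hypothetical width of the single key column
buffer_bytes = array_size * row_size_bytes
print(buffer_bytes)      # 40000 bytes, i.e. ~39 KB per batch
```

At that scale the batch buffer is nowhere near a memory limit, which would suggest the SIGSEGV has another cause (e.g. the driver) rather than the product of array size and row size.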
by abhilashnair
Mon Mar 28, 2011 11:31 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SIGSEGV and Partitioning
Replies: 11
Views: 5249

chulett wrote:Try hash partitioning on the key field(s) used in the upsert. ...
Same result even after hash partitioning: SIGSEGV error on the target ODBC stage.
I have disabled operator combination.
by abhilashnair
Mon Mar 28, 2011 6:52 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SIGSEGV and Partitioning
Replies: 11
Views: 5249

priyadharsini wrote:what is the partition defined on ODBC stage?
Auto
by abhilashnair
Mon Mar 28, 2011 4:38 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SIGSEGV and Partitioning
Replies: 11
Views: 5249

SIGSEGV and Partitioning

When a SIGSEGV error goes away after changing the configuration file from multiple nodes to a single node, does this mean that partitioning is the culprit in this case? I have a job which fetches from ODBC and inserts into an ODBC stage. The mode is Upsert (Update then Insert). The job fails with SIGSE...
by abhilashnair
Sat Mar 26, 2011 7:53 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Leading & trailing zero's in column export stage
Replies: 5
Views: 4485

If you have a sequential file storing your rejects, DataStage will always add leading zeroes to decimal values. You may need a Transformer before the Column Export stage to convert the decimal input into a string and pass the output on as the Varchar data type. The funct...
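The conversion described above can be sketched outside DataStage as well. This is a hedged Python equivalent of stripping the padded leading zeroes from a fixed-point decimal rendered as a string (the function name and string-based approach are illustrative, not the DataStage implementation):

```python
# Illustrative equivalent of converting a padded decimal to a Varchar
# without leading zeroes (assumption: values arrive as fixed-point
# strings like "000012.50", the way a padded DECIMAL field is rendered).

def decimal_to_varchar(dec_str: str) -> str:
    """Strip padded leading zeroes from a fixed-point decimal string."""
    sign = "-" if dec_str.startswith("-") else ""
    s = dec_str.lstrip("-+").lstrip("0")
    if s == "" or s.startswith("."):
        s = "0" + s                 # keep a zero before the decimal point
    return sign + s

print(decimal_to_varchar("000012.50"))   # -> 12.50
print(decimal_to_varchar("-0000.75"))    # -> -0.75
```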
by abhilashnair
Fri Mar 25, 2011 11:46 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Converting Non-Ascii characters
Replies: 3
Views: 3117

There's no such thing as junk characters. They are your client's data. If they're in the source database, they're valid, and you have to move them. First step is to find out what they actually are. ... The source database is different from the target. Is that the reason? The source is SQL Server, the target is DB2.
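Following the advice quoted above, the first step of finding out what the characters actually are can be done with a quick script. A minimal sketch, assuming you can pull a sample of the offending Varchar values out as text:

```python
# Report each non-ASCII character in a sample value, with its position
# and Unicode code point, so you know exactly what you're dealing with.

def show_non_ascii(text: str):
    """Return (index, char, code point) for every non-ASCII character."""
    return [(i, ch, f"U+{ord(ch):04X}")
            for i, ch in enumerate(text) if ord(ch) > 127]

print(show_non_ascii("caf\u00e9 \u2013 menu"))
# [(3, 'é', 'U+00E9'), (5, '–', 'U+2013')]
```

Once you know the actual code points, you can decide whether it is an encoding (NLS map) mismatch between the SQL Server source and the DB2 target or genuinely valid data.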
by abhilashnair
Fri Mar 25, 2011 3:46 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Converting Non-Ascii characters
Replies: 3
Views: 3117

Converting Non-Ascii characters

I have a job where data is being fetched from a SQL Server DB and populated into a DB2 target. The target is truncated and written every time. I am using the ODBC stage in the job (source as well as target). There are two columns in the source table, each Varchar(8000). These columns are also present in ta...
by abhilashnair
Wed Mar 23, 2011 8:28 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Not able to reject rows from transformer
Replies: 8
Views: 8143

What's the structure of your job?
by abhilashnair
Wed Mar 23, 2011 3:44 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Good ol' sequential file stage warning with a twist!!!!
Replies: 11
Views: 4792

Changed the partitioning in the target ODBC stage to Same; it was Hash before, and the warning went away. Still not sure what caused this in the first place. Anyway, the issue is resolved now.
by abhilashnair
Wed Mar 23, 2011 3:03 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: DB2 UDB look up Error
Replies: 4
Views: 2905

tanaya.deshpande@tieto.co wrote:I am getting this error for any query type...
Check the query inside the stage. The WHERE condition seems to be incomplete. Try running the query directly on the database and verify that rows are being returned, then run the DS job.