Search found 46 matches

by jinm
Fri Apr 11, 2008 4:30 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Timestamp issue inserting into Oracle table
Replies: 18
Views: 9088

source field [1,19]
But then you need to be sure your source data corresponds to the mask you provided.
Works fine for us, also when transporting timestamps from, say, SQL Server to Oracle.
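
A minimal Oracle-side sketch of what the [1,19] substring and the mask have to agree on, assuming an ISO-style yyyy-mm-dd hh:mm:ss layout; the table and column names here are only examples, not from the original post:

-- verify that the first 19 characters of the source value actually fit the mask
SELECT TO_TIMESTAMP(SUBSTR(src_col, 1, 19), 'YYYY-MM-DD HH24:MI:SS') AS ts_value
FROM staging_table;

If the string deviates from the mask, Oracle rejects it with a conversion error instead of loading a wrong value.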
by jinm
Tue Apr 08, 2008 10:25 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Warning in parallel job causes loss of data
Replies: 2
Views: 1889

Hi Sud. Well, it may be that I use too little data, but that does not eliminate the potential of data being transferred in cases where I want everything or nothing. The time span I see is up to 4 seconds between the warning and the completion. My concern is that the second the warning ...
by jinm
Tue Apr 08, 2008 7:16 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Warning in parallel job causes loss of data
Replies: 2
Views: 1889

Warning in parallel job causes loss of data

Hi guys. We are looking into parallel jobs but have met the following scenario: Warning Limit = 1. When a warning pops up in one of the threads in the job, this message is seen: "Issuing abort after 1 warnings logged." But the remaining threads just keep loading data. At the end the job finishes with f...
by jinm
Mon Mar 10, 2008 1:48 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Error on import Oracle Table definitions
Replies: 2
Views: 1074

ArndW wrote: I think that this is a known V8 bug. ...
Another one for the collection :( :(

Well, the workaround (for those who may find it useful) is so far to use Microsoft ODBC for Oracle or IBM ODBC for Oracle when importing metadata, and use the plug-in in the job.

Thanks for the reply
by jinm
Fri Mar 07, 2008 5:30 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Error on import Oracle Table definitions
Replies: 2
Views: 1074

Error on import Oracle Table definitions

Hi. When trying to import a table definition using Import -> Table Definitions -> Plug-in Meta Data Definitions -> OraOCI and connecting to an Oracle 10gR2 database, the column lengths are not imported. On other Oracle versions, the column length is imported. OS: Win2K3, DS: 8.0.1, Ora: Local client has been both ...
by jinm
Wed May 03, 2006 2:02 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: ORA-00942: table or view does not exist
Replies: 10
Views: 4899

What you need to ensure

A: The user ID is granted access to the table.
B: If no synonym exists, you need to prefix the table name with the schema owner in the select statement.

"select column_name from owner.tablename;"
by jinm
Wed May 03, 2006 1:58 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: NLS setting for Hashed file??
Replies: 2
Views: 2775

characters have already been mapped at the boundaries between external data and DataStage. Hi Ray, and thanks for the reply. Well, apparently all characters have not been mapped. The job worked somewhat OK on the "NON-NLS-Enabled" server, mapping all "funny" characters to ?. What need...
by jinm
Wed Apr 26, 2006 7:28 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Write Failed for hash File
Replies: 8
Views: 2547

[quote]Just a slight correction, it's not the 880th record; the message "Write failed for record id ' 880'" means that the primary key value (the record id) is invalid for some reason.[/quote] Correction: something somewhere in the record with ID = '880' is invalid. It need not be in the key column...
by jinm
Tue Nov 15, 2005 7:59 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: NLS export existing map for edit
Replies: 14
Views: 3373

Routine changes not used in released jobs !!

Quote: "If you subsequently change the routine then the jobs will be affected." This is not true, I'm afraid. After completing the NLS issue I tried this one. Simple routine: Ans = 1. One job uses this routine. Fine. Release the job, and change the routine to Ans = 2. (Yes, I compiled the routine.) When running th...
by jinm
Tue Nov 01, 2005 7:47 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: NLS export existing map for edit
Replies: 14
Views: 3373

Hey ArndW. The greatest reply so far. I suspected exactly something like that. This should imply that you would store the result of the "IMPORT NLS Map" job as a hashed file (target directory = Datastage\Engine\nls\maps\MAP.TABLE), load and register it in the DS Administrator, and you are ru...
by jinm
Tue Nov 01, 2005 2:18 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: NLS export existing map for edit
Replies: 14
Views: 3373

Thanks for the replies. The use of the VOC files is still somewhat of a mystery, but we are getting there :) Not to be ungrateful, but the original topic was to edit a code map in an external editor, and then import the code map into the DS Server. And on that topic I would really appreciate if ther...
by jinm
Fri Oct 28, 2005 1:53 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: NLS export existing map for edit
Replies: 14
Views: 3373

Hi Arn. Sorry to disagree. Quote: "all that they contain are links to the routine". Unquote. Yes, provided the routine is present in the given DS project. To illustrate: PROJECT_D (devl) is located on server AA100. We release the DS job, which embeds the code of the routines used within the j...
by jinm
Thu Oct 27, 2005 11:59 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: NLS export existing map for edit
Replies: 14
Views: 3373

Thanks for all the replies. Perhaps I wasn't clear enough. As to recompiling jobs: in environments other than development we only have released jobs. These jobs take in the routine code as it was at the time of release. Consequently, a changed routine will not be used in our validation and produc...
by jinm
Thu Oct 27, 2005 10:05 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: NLS export existing map for edit
Replies: 14
Views: 3373

Hi ArndW. Thank you for a good reply - not the one I hoped to see, but still :o) Well, it is more like we are loading into a "non-unicode" environment. We don't want to get mapping errors, and a lot of the characters must be converted to pre-defined characters. The nice thing would be to take...