Search found 320 matches

by mydsworld
Wed Jan 28, 2009 8:02 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Importing metadata of file with string delimiter
Replies: 2
Views: 847

The Delimiter String option is there in the Seq File stage. But in the 'Define Sequential Meta Data' window (while importing the file's metadata), the only options are Tab, Space, Comma, and Other Delimiter; there is no string delimiter option.
by mydsworld
Tue Jan 27, 2009 9:23 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Importing metadata of file with string delimiter
Replies: 2
Views: 847

Importing metadata of file with string delimiter

I am trying to import the metadata of a file with the delimiter ~@
Please let me know how to specify this delimiter (what should be entered in 'Other Delimiter'). Specifying ~@ doesn't work.

Thanks.
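Outside DataStage, splitting on a two-character string delimiter like ~@ is straightforward, which is what the metadata importer would need to emulate. A minimal sketch (the sample record and field values are hypothetical):

```python
# Hypothetical sample record using the two-character delimiter "~@".
line = "John~@Smith~@New York"

# str.split handles multi-character delimiters directly; a single-character
# "Other Delimiter" option cannot represent "~@" as one delimiter.
fields = line.split("~@")
print(fields)  # ['John', 'Smith', 'New York']
```

This is only an illustration of what a string delimiter means; it does not change what the 'Define Sequential Meta Data' window accepts.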
by mydsworld
Thu Jan 22, 2009 4:40 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Handling accented characters
Replies: 5
Views: 1719

Actually, I don't want to convert these accented characters into something else before loading; I would rather load them as is.
Thanks for your response.
by mydsworld
Thu Jan 22, 2009 4:21 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Capturing Reject records
Replies: 15
Views: 4466

I am using the ODBC Ent stage, which has the write options 'Upsert' and 'Write'. I don't find any option called 'Load'.
by mydsworld
Thu Jan 22, 2009 4:12 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Capturing Reject records
Replies: 15
Views: 4466

The requirement is to capture the reject records (those failing to get inserted into the database table) in a file that can be shared with another application or mailed. That's why I need to use a file and not Peek.
by mydsworld
Thu Jan 22, 2009 4:09 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Handling accented characters
Replies: 5
Views: 1719

Has anyone encountered this problem before and can advise me?

Thanks.
by mydsworld
Thu Jan 22, 2009 3:48 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Capturing Reject records
Replies: 15
Views: 4466

Ray, believe me, I am taking the Reject link directly from the target ODBC Ent to the Seq File (with no Transformer in between), and it does give that error. This was my original design. Now, if I replace the Seq File (at the other end of the Reject link) with Peek, it works. So, any idea why I am not able to use the Seq f...
by mydsworld
Thu Jan 22, 2009 2:47 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Handling accented characters
Replies: 5
Views: 1719

My job design is as follows: Seq File -> Transformer -> ODBC Ent (with Reject link). I have set 'UTF-8' only in the ODBC Ent stage. The error due to the accented character is shown below: Character data, right truncation occurred; for example, an update or insert value is a string that is too long for the [o...
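One common cause of this "right truncation" error with accented characters is that an accented character occupies more than one byte in UTF-8, so a value can fit the column's character length yet exceed its byte length. A minimal sketch of the effect (the sample value is hypothetical):

```python
name = "José"  # hypothetical value containing one accented character

# Character count vs. UTF-8 byte count: 'é' encodes to 2 bytes,
# so a 4-character value needs 5 bytes of column storage.
print(len(name))                  # 4 characters
print(len(name.encode("utf-8")))  # 5 bytes
```

If the target column was sized in bytes for single-byte data, widening it (or defining it in character semantics, where the database supports that) is the usual remedy.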
by mydsworld
Thu Jan 22, 2009 1:59 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Handling accented characters
Replies: 5
Views: 1719

Handling accented characters

I am trying to insert accented characters into a DB2 table using the ODBC Ent stage. The records containing accented characters are getting rejected. Is there any way we can insert them (without using the DataStage 'convert')?

Thanks.
by mydsworld
Thu Jan 22, 2009 9:12 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Capturing Reject records
Replies: 15
Views: 4466

That's the problem: the 'sqlcode' field appears in grey (read-only), so it cannot be removed from the Seq File or the Transformer. Even if I remove it from the right side of the Transformer, it gives the same error.
by mydsworld
Wed Jan 21, 2009 7:27 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Capturing Reject records
Replies: 15
Views: 4466

I created the Reject link from the ODBC Ent to the Seq File (directly, without any Transformer, etc.). It is giving the error: Sequential_File_20: Error when checking operator: Could not find input field "sqlcode". [api/interface_rep.C:2168] When I place a Peek stage instead of the Seq File, the job works....
by mydsworld
Wed Jan 21, 2009 4:06 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Capturing Reject records
Replies: 15
Views: 4466

I introduced a Transformer in the Reject link between the ODBC Ent and the Seq File (to drop 'sqlcode'). Now it is giving the same error on that Transformer. So there is some setting in the ODBC Ent stage that adds the 'sqlcode' field to the Reject link.

Please advise.
by mydsworld
Wed Jan 21, 2009 11:12 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Capturing Reject records
Replies: 15
Views: 4466

Capturing Reject records

I have a job design like this: Seq File -> Transformer -> DB2 API. While loading the DB2 table, some records are getting dropped in the DB2 API stage. I need to capture those reject records in a file (not from Director). Now, the DB2 API stage doesn't allow a Reject link. When I use the ODBC Ent stage instea...
by mydsworld
Mon Jan 19, 2009 2:43 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Parallel Routines
Replies: 1
Views: 814

Parallel Routines

For writing Parallel Routines in C, how and where do I get the list of DataStage APIs and the names of the header files to be included?

Thanks.
by mydsworld
Sun Jan 18, 2009 11:26 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reject Link processing
Replies: 1
Views: 938

Reject Link processing

I am capturing reject records while reading a sequential file. Now I want to apply some string functions to the rejected fields in the same job. But since the rejected fields are VarBinary, I cannot apply the functions to them, nor can I find any function to convert the raw data type into a string.

Please advise.
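For what it's worth, outside DataStage the raw-to-string step the poster is after is just a decode of the byte buffer using the file's actual encoding. A minimal sketch (the sample bytes are hypothetical, and latin-1 is assumed here only because it maps every byte to a character):

```python
# Hypothetical reject record captured as raw bytes (VarBinary-like data).
raw = b"bad,record,data\n"

# Decode with the source file's encoding (latin-1 assumed for illustration),
# after which ordinary string functions apply.
text = raw.decode("latin-1")
cleaned = text.strip().upper()
print(cleaned)  # BAD,RECORD,DATA
```

The point is that string functions only become available once the bytes are interpreted in a known encoding; the same idea applies whatever tool performs the conversion.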