Search found 39 matches

by rohit_mca2003
Thu Sep 07, 2017 9:41 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: How to handle double quote in Column Name (Teradata)
Replies: 1
Views: 3048

How to handle double quote in Column Name (Teradata)

Hi, I have a requirement to handle a double-quoted column name in Teradata. Since Teradata does not accept column names that match reserved words, the column was created as "TITLE". I have an RCP-enabled job where both the source and target tables have this kind of column ("TITLE"). While...
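A minimal sketch of the quoting idea, outside DataStage (Python; the reserved-word set here is a tiny illustrative subset, not Teradata's actual list):

```python
# Sketch: wrap reserved-word column names in double quotes when building
# SQL text for Teradata. RESERVED is an illustrative subset only.
RESERVED = {"TITLE", "TYPE", "VALUE"}

def quote_column(name: str) -> str:
    """Return the column name, double-quoted if it is a reserved word."""
    return f'"{name}"' if name.upper() in RESERVED else name

def build_select(columns, table):
    return f"SELECT {', '.join(quote_column(c) for c in columns)} FROM {table}"

print(build_select(["ID", "TITLE"], "MY_TABLE"))
# SELECT ID, "TITLE" FROM MY_TABLE
```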
by rohit_mca2003
Mon Jul 31, 2017 9:01 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: HASH Partition not working for Checksum values
Replies: 5
Views: 3240

When I say the join is not happening properly, I mean that if I run the join in Sequential mode or with Entire partitioning, it works fine.
With HASH partitioning, however, the partitioning does not seem to work correctly: records from both sides with the same key end up on different partitions.
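Under hash partitioning, byte-identical keys must always land on the same partition, so when matching keys end up apart, the key bytes usually differ (trailing spaces, padding, or case). A sketch of the principle, assuming a simple hash-mod scheme (DataStage's internal hash function differs, but the invariant is the same):

```python
import hashlib

def partition_of(key: str, num_partitions: int) -> int:
    """Deterministic partition number for a key: same bytes, same partition."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_partitions

# Byte-identical keys always land together...
assert partition_of("abc123", 4) == partition_of("abc123", 4)
# ...but a trailing space changes the bytes, and usually the partition too.
print(partition_of("abc123", 4), partition_of("abc123 ", 4))
```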
by rohit_mca2003
Mon Jul 31, 2017 4:25 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: HASH Partition not working for Checksum values
Replies: 5
Views: 3240

HASH Partition not working for Checksum values

Hi, I need to join columns (using the Join stage) that hold MD5 hash values (generated with the Checksum stage). I have the same data in source and target, so I expected all records to match, but the join is not happening properly. I am doing a HASH partition before the join. When I analysed the output of the HASH partition t...
by rohit_mca2003
Thu Jul 27, 2017 2:38 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Join the columns having HASH values
Replies: 3
Views: 3758

Hi,

I checked the checksum values and they are the same as what is in the target. Also, if DataStage generated different checksums for the same value, it should not be used.

Thanks.
by rohit_mca2003
Thu Jul 27, 2017 1:36 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Join the columns having HASH values
Replies: 3
Views: 3758

Join the columns having HASH values

Hi, we have a requirement to join columns holding hash values (computed by the Checksum stage). First I tried HASH partitioning and sorting on this column (which holds the hash value) on both sides of the join, but not all records match. I expected all records to match...
by rohit_mca2003
Tue Jul 18, 2017 1:36 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Error when generating checksum on vector sub record column
Replies: 0
Views: 2166

Error when generating checksum on vector sub record column

Hi, I have a requirement to generate a checksum for a vector subrecord (records get combined based on a key column), but while doing so I am getting the error below: "Traceback: Could not obtain stack trace; check that 'dbx' and 'sed' are installed and on your PATH". I checked that these are a...
by rohit_mca2003
Wed Feb 22, 2017 8:32 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Algorithm used (by default) for Checksum Stage
Replies: 2
Views: 2192

Thanks Chulett,
I can see the output generated by CHECKSUM is a 32-character alphanumeric string.
As you suggested, I assume this is an MD5 hash.

Thanks.
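The 32-character width is consistent with MD5: a 128-bit digest rendered in hex is always 32 characters. A quick check of that arithmetic in Python:

```python
import hashlib

# An MD5 digest is 128 bits; rendered as hexadecimal it is always
# 32 characters, matching the Checksum stage output described above.
value = "some row contents"
digest = hashlib.md5(value.encode("utf-8")).hexdigest()
print(digest, len(digest))
assert len(digest) == 32
```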
by rohit_mca2003
Wed Feb 22, 2017 8:28 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Algorithm used (by default) for Checksum Stage
Replies: 2
Views: 2192

Algorithm used (by default) for Checksum Stage

Hi,
Could you please advise which algorithm is applied when we use the CHECKSUM stage in DataStage 9.1? Is it MD5 or something else?

Thanks,
by rohit_mca2003
Wed Feb 22, 2017 8:23 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Generate multiple Checksum/SK in a generic job
Replies: 5
Views: 2274

Thanks for the replies. We are following a similar approach: putting in a maximum number of checksum columns, with the column(s) for each checksum supplied via parameters. The job is RCP-enabled. At the end we drop the unnecessary checksum columns (again controlled by parameters for the particular instance/value file). Thanks fo...
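A sketch of that "maximum number of checksums, drop the unused ones" pattern outside DataStage (Python; the CHKSUM column names and the pipe delimiter are illustrative choices, not the job's actual conventions):

```python
import hashlib

def add_checksums(row: dict, checksum_cols: list) -> dict:
    """Add CHKSUM1..CHKSUMn columns, each an MD5 over a parameterized
    column list. Empty lists (unused slots) are skipped, mimicking
    'define the maximum, drop what this instance does not need'."""
    out = dict(row)
    for i, cols in enumerate(checksum_cols, start=1):
        if not cols:
            continue  # this instance/value file does not use slot i
        concat = "|".join(str(row[c]) for c in cols)
        out[f"CHKSUM{i}"] = hashlib.md5(concat.encode("utf-8")).hexdigest()
    return out

row = {"Col1": "A", "Col2": "B", "Col3": "C"}
# Instance parameters: slot 1 hashes Col1+Col2, slot 2 is unused.
print(add_checksums(row, [["Col1", "Col2"], []]))
```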
by rohit_mca2003
Sun Feb 19, 2017 10:28 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Generate multiple Checksum/SK in a generic job
Replies: 5
Views: 2274

Generate multiple Checksum/SK in a generic job

Hi, I have a requirement to use a generic job to read a source file and load the data into a table. Each time I have a new source file, the corresponding target may have a different number of Checksum/Surrogate Key columns. Example (Scenario 1): --------------------------- Source --> File 1 (Col1, Col2, Col3,...
by rohit_mca2003
Thu Jun 28, 2012 12:42 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading RAW Data from DB2
Replies: 6
Views: 6265

Reading RAW Data from DB2

Hi everyone, thanks for your valuable comments. This issue is resolved now, though we had to take a different route to resolve it. We were not able to use either function, RAWTOHEX or CAST; maybe this is because we are (forcibly) using the ODBC EE stage. The job was aborting. To resolve this, I have stored...
by rohit_mca2003
Wed Jun 27, 2012 12:50 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading RAW Data from DB2
Replies: 6
Views: 6265

Reading RAW Data from DB2

Thanks Ray. Upon checking, there is a function available in DB2, RAWTOHEX, but when I used this function in the SQL I got an error and the job aborted: "Process meta data not available in database. Parallel job reports failure (code 139)". When I checked the DB2 table definition for the RAW column, it is like below:...
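For reference, what a raw-to-hex function does conceptually: render raw bytes as a hex string so they can travel through a character-typed column or file and be converted back losslessly. A Python equivalent:

```python
import binascii

# Conceptual equivalent of a raw-to-hex database function: raw bytes
# become a readable hex string, and the round trip is lossless.
raw = bytes([0x01, 0xAB, 0xFF])
hex_text = binascii.hexlify(raw).decode("ascii").upper()
print(hex_text)  # 01ABFF
assert binascii.unhexlify(hex_text) == raw  # lossless round trip
```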
by rohit_mca2003
Tue Jun 26, 2012 11:33 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Reading RAW Data from DB2
Replies: 6
Views: 6265

Reading RAW Data from DB2

Thanks for the replies. When I say non-readable, I mean that the data in the sequential file should be in the ASCII character set and meaningful. Ultimately this data has to go to a Data Warehouse, and we cannot store it in this format. Actually, the DB2 source tables have a few fields defined as RAW; ot...
by rohit_mca2003
Wed Nov 24, 2010 5:16 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Invalid/Error Data handling
Replies: 1
Views: 1166

Invalid/Error Data handling

Hi, I have a record with nearly 15 fields/columns. We have to do data value/format/nullability checks on some 10 of the fields. We decided to do these checks in a Transformer stage. Up to here, no problem. But we have to insert a record into the ERRORLOG table for each field validation failure. Suppos...
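The error-logging requirement is a fan-out: one input record can yield one ERRORLOG row per failed check. A sketch of that shape (Python; the two checks shown are illustrative placeholders, not the job's actual rules):

```python
def validate(record: dict) -> list:
    """Return one error row per failed field check, so a single input
    record can fan out to several ERRORLOG rows."""
    errors = []
    checks = [
        ("id", lambda v: v is not None and str(v).isdigit(), "ID must be numeric"),
        ("name", lambda v: bool(v), "NAME is mandatory"),
    ]
    for field, ok, message in checks:
        if not ok(record.get(field)):
            errors.append({"field": field, "error": message, "record": record})
    return errors

print(validate({"id": "12x", "name": ""}))  # two error rows
```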
by rohit_mca2003
Wed Nov 10, 2010 7:37 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: how to validate mandatory fields in XML
Replies: 1
Views: 2236

how to validate mandatory fields in XML

Hi, I am working with XML files and I have imported the XSD document. The XSD has information on all the mandatory and non-mandatory fields. I want to check and trace, while parsing the XML, cases where the XML document does not have a mandatory field. In the example below <body> <text> <id>123</id> <name>ABCD</...
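A minimal mandatory-field check, outside DataStage, using only the Python standard library. Real XSD validation needs a schema-aware library; here the mandatory paths are hard-coded to mirror what the XSD would declare for the sample document above:

```python
import xml.etree.ElementTree as ET

# Paths assumed mandatory, mirroring the <body><text><id>/<name> sample.
MANDATORY = ["./text/id", "./text/name"]

def missing_mandatory(xml_text: str) -> list:
    """Return the mandatory paths that are absent or empty in the document."""
    root = ET.fromstring(xml_text)
    missing = []
    for path in MANDATORY:
        node = root.find(path)
        if node is None or not (node.text or "").strip():
            missing.append(path)
    return missing

doc = "<body><text><id>123</id></text></body>"
print(missing_mandatory(doc))  # ['./text/name'] -- name is missing
```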