Search found 150 matches
- Tue Sep 12, 2017 10:47 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: BigIntegrate Questions
- Replies: 1
- Views: 3482
BigIntegrate Questions
Hi, we have BigIntegrate (BI) 11.5 on Linux, on the Hortonworks Data Platform. We are building a large data lake in Hadoop, and I have a few questions about how BI 11.5 works. I couldn't find answers to these on Google, so could you please shed some light? 1. Hive queries - if I want to run any Hive query in interactive mode usi...
- Tue Sep 12, 2017 10:34 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Loading data from HDFS file into HIVE table using Datastage
- Replies: 7
- Views: 11610
We use the BigData stage in a job to load data to HDFS and then use a script to create the Hive table with the correct partitions. We store data in the /folder/structure/for_Hive/tableName/yyyy/mm/dd folder format, and the Hive tables are partitioned on year, month and date. Both loading HDFS and creating the HI...
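A minimal sketch of the layout described above, assuming a hypothetical base path and table name: the script builds the day's HDFS directory and the HiveQL that registers it as a partition. The statement shape (`ALTER TABLE ... ADD PARTITION ... LOCATION`) is standard HiveQL; the path and partition column names are illustrative.

```python
from datetime import date

# Hypothetical base path, following the layout from the post:
# /folder/structure/for_Hive/tableName/yyyy/mm/dd
BASE = "/folder/structure/for_Hive"

def partition_path(table: str, d: date) -> str:
    """HDFS directory for one day's data, matching the yyyy/mm/dd layout."""
    return f"{BASE}/{table}/{d.year:04d}/{d.month:02d}/{d.day:02d}"

def add_partition_ddl(table: str, d: date) -> str:
    """HiveQL that registers the directory as a (year, month, day) partition."""
    return (
        f"ALTER TABLE {table} ADD IF NOT EXISTS "
        f"PARTITION (year={d.year}, month={d.month}, day={d.day}) "
        f"LOCATION '{partition_path(table, d)}'"
    )

print(add_partition_ddl("customer", date(2017, 9, 12)))
```

Pointing each partition at its own directory this way lets the HDFS load and the table registration stay as two independent steps, as the post describes.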
- Tue Apr 17, 2012 5:13 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Db2 Z loader - Issue
- Replies: 2
- Views: 1967
Hi, I tried these, and it looks like it's working fine now. 1) I replaced the z-loader with the DB2 Connector stage - this works fine. 2) I deleted the job and re-created the entire load job again; this time I used the same z-loader (DB2 bulk load) stage, and the job is working fine. Not sure what hap...
- Tue Apr 17, 2012 1:48 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: DB2 stored procedure
- Replies: 4
- Views: 3596
Hi, we opened a ticket with IBM to fix this, and here is what we found: 1) the stored proc output had some low values (starting at position 172), so DataStage was truncating everything after that point. 2) The IBM representative said that DataStage will not be able to handle these low values. 3) So...
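"Low values" in mainframe terminology are binary zeros (x'00'). A hedged sketch, not the actual DataStage internals, of why output can appear truncated at the first low value, and one way to scrub such bytes while preserving the record length:

```python
# A fabricated sample record: text, four low values (x'00'), more text.
raw = b"HEADER-DATA" + b"\x00" * 4 + b"TRAILING-DATA"

# A consumer that treats the buffer as a C-style string stops at the
# first NUL, so everything after position 11 is lost:
visible = raw.split(b"\x00", 1)[0]
print(visible)  # b'HEADER-DATA'

# Replacing low values with spaces keeps the fixed record length intact:
scrubbed = raw.replace(b"\x00", b" ")
assert len(scrubbed) == len(raw)
print(scrubbed)
```

In the thread's case the fix was applied upstream, but the same substitution can be done wherever the buffer is first read.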
- Tue Apr 17, 2012 1:44 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Db2 Z loader - Issue
- Replies: 2
- Views: 1967
Db2 Z loader - Issue
Hi, I have a job that reads data from a data set (10 fields) and passes it through a Transformer (I'm propagating only 4 fields to the output); then I use a Z-loader stage (DB2 bulk load) to load into the table. The job completes with Status = "Finished". I don't see any warnings or fatal errors, no...
- Mon Mar 05, 2012 8:27 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: DB2 stored procedure
- Replies: 4
- Views: 3596
Hi, here's the part of the copybook where the field is located:
10 DOL-CUST-COVG-DETAIL.
   15 DOL-CUST-ACF-COVG-COUNT PIC 9(02).
   15 DOL-CUST-ACF-COVG OCCURS 50 TIMES.
      20 DOL-CUST-BEN-OPT PIC X(05).
      20 DOL-CUST-CCF-PKG-TY PIC X(06).
      20 DOL-CUST-FILL1 PIC X(09).
      20 DOL-CUST-PRODT-TY PIC X(06).
      20 DOL-CUST-...
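A fixed-width layout like this can be unpacked with plain string slicing. The sketch below uses only the fields visible in the excerpt, so its 26-byte occurrence length is an illustration, not the real record layout (the copybook is truncated and the true OCCURS group is longer); the field names come from the copybook, the sample data is made up.

```python
FIELDS = [                       # (name, width) from the PIC clauses shown
    ("DOL-CUST-BEN-OPT", 5),     # PIC X(05)
    ("DOL-CUST-CCF-PKG-TY", 6),  # PIC X(06)
    ("DOL-CUST-FILL1", 9),       # PIC X(09)
    ("DOL-CUST-PRODT-TY", 6),    # PIC X(06)
]
OCC_LEN = sum(w for _, w in FIELDS)  # occurrence length in this sketch only

def parse_detail(record: str):
    """Split the COVG-DETAIL group: a 2-digit count, then that many occurrences."""
    count = int(record[:2])          # DOL-CUST-ACF-COVG-COUNT PIC 9(02)
    out, pos = [], 2
    for _ in range(count):
        occ, cursor = {}, pos
        for name, width in FIELDS:
            occ[name] = record[cursor:cursor + width].rstrip()
            cursor += width
        out.append(occ)
        pos += OCC_LEN
    return out

sample = "01" + "OPT01".ljust(5) + "PKG001".ljust(6) + " " * 9 + "PROD01".ljust(6)
print(parse_detail(sample))
```

The same slicing idea is what a CFF or column-import definition encodes declaratively; the count field tells you how many of the 50 possible occurrences actually carry data.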
- Mon Mar 05, 2012 7:48 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: DB2 stored procedure
- Replies: 4
- Views: 3596
DB2 stored procedure
Hi, I have a DB2 stored proc; I'm passing a parameter (150 characters long) and it returns an output of length 29500. The person who created the stored proc also gave me a copybook (the output format). In my job I'm using a Stored Procedure stage to call this stored proc. I'm passing the parameter manually 91 para...
- Sat Mar 03, 2012 7:39 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: CFF - Level 88
- Replies: 2
- Views: 2161
CFF - Level 88
Hi, I have a DB2 stored procedure that I need to call from DataStage. It has 2 parameters (1 input, 150 characters long, and 1 output, 29500 long). I checked the copybook and I see records with level 88. Also, when I call the SP from DataStage, I can see only the first few characters (it's the point bef...
- Tue Feb 14, 2012 12:50 pm
- Forum: General
- Topic: Datastage environment audit
- Replies: 0
- Views: 1422
Datastage environment audit
Hi, we have multiple projects running in our DataStage environment, on both 7.5 and 8.0.1. We need to implement an audit mechanism to monitor the DataStage environment. We are thinking about capturing and implementing the following points. Could anybody suggest more points, or is there...
- Thu Dec 29, 2011 5:59 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Short input record - Sequential file stage
- Replies: 2
- Views: 2290
- Thu Dec 29, 2011 5:56 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Short input record - Sequential file stage
- Replies: 2
- Views: 2290
Short input record - Sequential file stage
Hi, I have a job that pulls data from a DB2 table and then joins it with a file. Then we do some transformation and write it to a sequential file (using a Sequential File stage). The job completes successfully (no warnings either.. :) ). But when I click on View Data to see the data in the file, I'm get...
- Fri Aug 20, 2010 6:31 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: gethostbyname(dcob04) returned null; node node1 cannot be
- Replies: 3
- Views: 2802
- Fri Aug 20, 2010 6:25 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Oracle to Oracle performance
- Replies: 7
- Views: 5306
- Thu Aug 19, 2010 6:07 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ORCHESTRATE
- Replies: 3
- Views: 2984
- Thu Aug 19, 2010 5:54 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: APT_DecimalNumber::convertToDecimal: the precision is not
- Replies: 5
- Views: 4314
Hi, thanks for the reply. I was using decimal(38,10) initially, but later I changed it to decimal(38,38); this didn't work either, so I converted the result into varchar(42). That's when I got this warning. No, I converted both the columns into varchar from the source and did the calculation. ...
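One likely reason decimal(38,38) made things worse: the scale equals the precision, so all 38 digits sit to the right of the decimal point and there is no room for an integer part at all, whereas decimal(38,10) leaves 28 integer digits. A sketch with Python's `decimal` module illustrating the constraint; the `fits` helper is hypothetical, not a DataStage or DB2 API.

```python
from decimal import Decimal, Context

ctx = Context(prec=50)  # enough working precision for 38-digit values

def fits(value: str, precision: int, scale: int) -> bool:
    """True if value fits a DECIMAL(precision, scale) column exactly."""
    d = Decimal(value)
    # Round to `scale` fractional digits using a wide working context.
    quantized = d.quantize(Decimal(1).scaleb(-scale), context=ctx)
    if quantized != d:
        return False                     # fractional digits would be lost
    # The coefficient may use at most `precision` digits in total.
    return len(quantized.as_tuple().digits) <= precision

print(fits("1.5", 38, 10))  # True  - 28 digits available left of the point
print(fits("1.5", 38, 38))  # False - zero digits left of the point
print(fits("0.5", 38, 38))  # True  - purely fractional values still fit
```

So any intermediate result of 1 or more overflows a (38,38) column, which is consistent with needing the varchar workaround described above.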