Search found 86 matches
- Thu Apr 16, 2009 3:02 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: SCD stage
- Replies: 10
- Views: 5406
I just answered my own question. It is a bug in the SCD stage: even though the dimension table is not displayed in the source column pane, you can still access it through the transformer expression. Example: if isNull(out_odbc_Products.PRODUCT_SK) then NextSurrogateKey() else out_odbc_Products.PRODUCT_SK It...
- Thu Apr 16, 2009 1:41 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: SCD stage
- Replies: 10
- Views: 5406
SCD stage
I'm trying to use the SCD stage to implement SCD2 logic. The issue I have is that a new surrogate key is generated for both updated records and new records. That is, if a record is updated, the result is two records on the dimension update link: 1) an entry to expire the old record with the SK and E...
- Tue Apr 14, 2009 1:18 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: SCD Fatal error- Index too large
- Replies: 5
- Views: 3844
- Tue Apr 14, 2009 12:44 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Surrogate key generator error
- Replies: 5
- Views: 2392
The issue is that your sequence is not defined with a BIGINT data type (in DB2). By default DB2 creates a sequence as INT. You need to set it to BIGINT in order for it to work with DataStage:

CREATE SEQUENCE ACTNO_SEQ AS BIGINT
  START WITH 1 INCREMENT BY 1
  NOMAXVALUE NOCYCLE CACHE 10;

no charge :-)
- Mon Apr 13, 2009 8:31 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: transformer compilation error
- Replies: 12
- Views: 5697
- Mon Mar 30, 2009 1:21 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Parameter set used in Transformer = Compilation error
- Replies: 13
- Views: 4246
parameter set with $PROJDEF not working in seq file stage
I have a parameter set with a project parameter ppSourceDir which is set to $PROJDEF. If I use it in a seq file stage to point to the source file then it does not resolve the $PROJDEF value from DSParams. However, if I click on VIEW DATA then it does resolve it. Example: source #paramset.$ppSourceDir#...
- Tue Dec 16, 2008 6:19 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Getting count of records passed through a link
- Replies: 9
- Views: 4095
- Tue Dec 16, 2008 10:50 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Getting count of records passed through a link
- Replies: 9
- Views: 4095
- Mon Oct 06, 2008 1:20 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: working the Mainframe data
- Replies: 3
- Views: 1516
I don't think you can read VSAM files directly in PX (I think DataStage MVS has this functionality). Once you convert them to flat data sets (mainframe) or files then you can treat them the same as any other EBCDIC file. They will most likely be variable block datasets, so you'll need to flatten them ...
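As a quick illustration of the EBCDIC handling (not DataStage itself): once the data is unloaded to a flat file, text fields can be decoded with a standard EBCDIC codec. Python's cp037 is a common US EBCDIC code page, but the right code page depends on the mainframe in question:

```python
# Decode a few EBCDIC bytes to ASCII text; cp037 assumed for illustration.
record = b"\xc8\xc5\xd3\xd3\xd6"  # EBCDIC encoding of "HELLO"
print(record.decode("cp037"))
```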
- Thu Sep 11, 2008 12:07 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Transformer warning
- Replies: 4
- Views: 1740
- Wed Aug 06, 2008 10:25 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Compiler error Transformer job
- Replies: 4
- Views: 2318
- Fri May 09, 2008 12:46 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Complex File Stage: variable block files
- Replies: 8
- Views: 4421
ok, so I tried implementing what I listed above and it works, except that I can't seem to load a file in blocks when the file size is not divisible by the block size. I'm using a sequential file stage with record-type=implicit and field delimiter=none. Example: 5 bytes of data in file, 2 byte field s...
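The divisibility problem can be shown in a few lines of plain Python (a sketch, not a sequential file stage): reading 5 bytes in fixed 2-byte fields always leaves a short trailing chunk that has to be handled deliberately.

```python
import io

# 5 bytes of data, read in 2-byte fields: the final field comes up short.
data = io.BytesIO(b"ABCDE")
fields = []
while True:
    chunk = data.read(2)
    if not chunk:  # EOF
        break
    fields.append(chunk)

print(fields)  # last field is only 1 byte: [b'AB', b'CD', b'E']
```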
- Fri May 09, 2008 9:57 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Complex File Stage: variable block files
- Replies: 8
- Views: 4421
We are dealing with fixed-width records so there is no record delimiter. The problem is that we do not know how big a record is until we read it, so the alignment will get messed up. For example: Record type A: 5 bytes; Record type B: 3 bytes; Position 1: identifier of record type. Sample data with 5 ...
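A minimal sketch of the parsing this layout requires (plain Python, not a CFF stage; the type-to-length table matches the example sizes above and the names are hypothetical): the leading type byte of each record decides how many bytes to consume before the next record starts.

```python
# Record lengths keyed by the type identifier in position 1,
# per the example: type A = 5 bytes total, type B = 3 bytes total.
RECORD_LENGTHS = {b"A": 5, b"B": 3}

def split_records(buf):
    """Walk the byte stream, sizing each record from its leading type byte."""
    records, pos = [], 0
    while pos < len(buf):
        rtype = buf[pos:pos + 1]
        length = RECORD_LENGTHS[rtype]  # an unknown type byte would raise KeyError
        records.append(buf[pos:pos + length])
        pos += length
    return records

print(split_records(b"A1234B12A5678"))  # [b'A1234', b'B12', b'A5678']
```

Reading the same bytes as fixed 5-byte records would split the B record mid-stream, which is exactly the alignment problem described.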
- Thu May 08, 2008 12:38 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Complex File Stage: variable block files
- Replies: 8
- Views: 4421
- Wed May 07, 2008 8:02 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Complex File Stage: variable block files
- Replies: 8
- Views: 4421
I wish it did, but it doesn't. If I merge two copybooks together and import them then it says that the record length of one does not match the record length of the other. Each record definition is in a separate copybook. They all have different record lengths. The CFF stage in DataStage version 8 ...