Search found 86 matches

by michaeld
Thu Apr 16, 2009 3:02 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SCD stage
Replies: 10
Views: 5406

I just answered my own question. It is a bug in the SCD stage: even though the dimension table is not displayed in the source column pane, you can still access it through transformer script. Example: if isNull(out_odbc_Products.PRODUCT_SK) then NextSurrogateKey() else out_odbc_Products.PRODUCT_SK It...
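The derivation above can be sketched in Python; this is a minimal illustration of the fallback logic only, and the stand-in key generator is an assumption, not the SCD stage's actual NextSurrogateKey() implementation:

```python
def resolve_surrogate_key(existing_sk, next_surrogate_key):
    # Keep the key when the dimension lookup found one; otherwise draw a
    # fresh key. Mirrors the transformer derivation:
    #   if isNull(out_odbc_Products.PRODUCT_SK)
    #       then NextSurrogateKey() else out_odbc_Products.PRODUCT_SK
    return next_surrogate_key() if existing_sk is None else existing_sk

# hypothetical stand-in for the stage's NextSurrogateKey() call
_keys = iter(range(100, 200))
def next_surrogate_key():
    return next(_keys)
```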
by michaeld
Thu Apr 16, 2009 1:41 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SCD stage
Replies: 10
Views: 5406

SCD stage

I'm trying to use the SCD stage to implement SCD2 logic. The issue I have is that a new surrogate key is generated for both updated records and new records. That is, if a record is updated, the result is two records in the dimension update link: 1) an entry to expire the old record with the SK and E...
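For clarity, the expected SCD2 behavior can be sketched in Python. The column names (product_sk, effective_date, expiry_date) and the dict-per-row shape are assumptions for illustration, not the SCD stage's own output format:

```python
from datetime import date

def scd2_change(old_row, new_row, next_sk, change_date):
    # An updated record yields two rows on the dimension update link:
    # 1) the existing row expired -- it keeps its original surrogate key;
    # 2) the new version inserted -- only this row should draw a new key.
    expired = dict(old_row, expiry_date=change_date)
    inserted = dict(new_row, product_sk=next_sk(),
                    effective_date=change_date, expiry_date=None)
    return [expired, inserted]
```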
by michaeld
Tue Apr 14, 2009 1:18 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: SCD Fatal error- Index too large
Replies: 5
Views: 3844

Are you using parameters for the surrogate key connection information (username/password)? If yes, then hardcode them and see if it works. It should work. "Index too large" is misleading. The real issue is 1) that the SCD stage cannot access the Oracle sequence object; this is most likel...
by michaeld
Tue Apr 14, 2009 12:44 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Surrogate key generator error
Replies: 5
Views: 2392

The issue is that your sequence is not defined with a BIGINT data type (in DB2). By default DB2 will create a sequence as INT. You need to set it to BIGINT in order for it to work with DataStage.

CREATE SEQUENCE ACTNO_SEQ AS BIGINT
  START WITH 1 INCREMENT BY 1
  NOMAXVALUE NOCYCLE CACHE 10;

no charge :-)
by michaeld
Mon Apr 13, 2009 8:31 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: transformer compilation error
Replies: 12
Views: 5697

The problem with VC2008 Express is that the cxx compiler doesn't know where the libraries are. If you update the LIB and INCLUDE paths, it will work. You do not need to reinstall anything.
by michaeld
Mon Mar 30, 2009 1:21 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Parameter set used in Transformer = Compilation error
Replies: 13
Views: 4246

parameter set with $PROJDEF not working in seq file stage

I have a parameter set with a project parameter ppSourceDir which is set to $PROJDEF. If I use it in a seq file stage to point to the source file, then it does not resolve the $PROJDEF value from DSParams. However, if I click on VIEW DATA then it does resolve it. Example: source #paramset.$ppSourceDir#...
by michaeld
Tue Dec 16, 2008 6:19 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Getting count of records passed through a link
Replies: 9
Views: 4095

However it gives right values for both server and parallel jobs when we test the routine in DS Manager.
If the problem is that you're not passing an active stage name, then it wouldn't work when running from DS Manager either. Or did you test other jobs in DS Manager, but not this one?
by michaeld
Tue Dec 16, 2008 10:50 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Getting count of records passed through a link
Replies: 9
Views: 4095

Does the job you're trying to get a link count from run as a multi-instance job? If yes, then you have to include the instance in the job name:

job.instance
by michaeld
Mon Oct 06, 2008 1:20 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: working the Mainframe data
Replies: 3
Views: 1516

I don't think you can read VSAM files directly in PX (I think DataStage MVS has this functionality). Once you convert them to flat data sets (mainframe) or files, then you can treat them the same as any other EBCDIC file. They will most likely be variable block datasets, so you'll need to flatten them ...
by michaeld
Thu Sep 11, 2008 12:07 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Transformer warning
Replies: 4
Views: 1740

The other thing you can do is insert a COPY stage just before your transformer and set the partitioning there. In the transformer just leave it as the default (Auto). This worked for me. I hate DataStage, but at least it pays my mortgage :-)
by michaeld
Wed Aug 06, 2008 10:25 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Compiler error Transformer job
Replies: 4
Views: 2318

I've had this problem. Do you have a lot of columns?

"too many exception handler states in function"

Try to split the job into multiple streams and then combine them.
by michaeld
Fri May 09, 2008 12:46 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex File Stage: variable block files
Replies: 8
Views: 4421

OK, so I tried implementing what I listed above, and it works except that I can't seem to load a file in blocks when the file size is not divisible by the block size. I'm using a sequential file stage with record-type=implicit and field delimiter=none. Example: 5 bytes of data in file, 2 byte field s...
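The non-divisible case can be shown with a short Python sketch; this models the general block-reading problem, not the sequential file stage itself:

```python
import io

def read_blocks(stream, block_size):
    # Yield fixed-size blocks from a byte stream; the final block comes out
    # short when the total size is not divisible by the block size, which
    # is exactly the case described above (5 bytes, 2-byte blocks).
    while True:
        block = stream.read(block_size)
        if not block:
            return
        yield block

# 5 bytes of data read in 2-byte blocks: two full blocks plus a 1-byte tail
blocks = list(read_blocks(io.BytesIO(b"ABCDE"), 2))
```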
by michaeld
Fri May 09, 2008 9:57 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex File Stage: variable block files
Replies: 8
Views: 4421

We are dealing with fixed-width records, so there is no record delimiter. The problem is that we do not know how big the record is until we read it, so the alignment will get messed up. For example: Record type A: 5 bytes. Record type B: 3 bytes. Position 1: identifier of record type. Sample data with 5 ...
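The parsing approach implied above can be sketched in Python. The length table is hypothetical (it assumes the stated record length includes the identifier byte, which the post does not confirm):

```python
# Hypothetical lengths keyed by the type identifier in position 1
# (type A = 5 bytes, type B = 3 bytes, as in the post).
RECORD_LENGTHS = {b"A": 5, b"B": 3}

def split_records(data):
    # Consume the undelimited byte stream record by record, sizing each
    # record from its leading type byte so the alignment never drifts.
    records, pos = [], 0
    while pos < len(data):
        rec_len = RECORD_LENGTHS[data[pos:pos + 1]]
        records.append(data[pos:pos + rec_len])
        pos += rec_len
    return records
```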
by michaeld
Thu May 08, 2008 12:38 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex File Stage: variable block files
Replies: 8
Views: 4421

We cannot modify the source. It is shared among many legacy applications. We're on a tight timeline, so using FILE-AID seems like the best solution for now. Thanks for your help though.
by michaeld
Wed May 07, 2008 8:02 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Complex File Stage: variable block files
Replies: 8
Views: 4421

I wish it did, but it doesn't. If I merge two copybooks together and import them, then it says that the record length of one does not match the record length of the other. Each record definition is in a separate copybook. They all have different record lengths. The CFF stage in DataStage version 8 ...