Search found 92 matches

by dsxdev
Tue Dec 14, 2004 6:03 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: how to compile jobs thru unix commands and which one to use.
Replies: 6
Views: 2952

My requirement is to compile only those jobs which are in an aborted state, not all jobs. I want to do this daily, like a job which would compile all aborted jobs.

Any routines or macros already available in DS would be a great help.
by dsxdev
Fri Dec 10, 2004 7:06 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Checkpoints (or savepoints) in a job
Replies: 3
Views: 2426

In PX the records are not read in order; there is crossing over of data between partitions. How is this handled when we restart the job again? Suppose 9050 records were read from input and output; how do we identify which 9050 records were read and output? This really puzzles me, and the Row Commit and Ti...
by dsxdev
Thu Dec 09, 2004 7:31 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: how to compile jobs thru unix commands and which one to use.
Replies: 6
Views: 2952

how to compile jobs thru unix commands and which one to use.

Are there any unix commands to compile DS jobs? If so, can someone please tell me.

Something like dscompile.... etc.
by dsxdev
Thu Nov 18, 2004 6:15 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Defining Constraints in Transformer Stage
Replies: 13
Views: 15994

Hi,

If you are really dealing with a Char type column then you cannot trim it; even if you trim a Char type column the value remains the same, the padding will still be there. For such a column, convert it to VarChar and then use

Code:

TrimLeadingTrailing(inputcolumn)="Value"
by dsxdev
Thu Nov 18, 2004 5:45 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Performance issue with Funnel Stage
Replies: 2
Views: 1821

Hi
There have been several performance issues with the Funnel Stage; it reduces performance to some extent.
Check with Ascential; there have been many patches released.
by dsxdev
Thu Nov 18, 2004 5:14 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: reading the records from the sorted dataset using transformer
Replies: 9
Views: 5420

Hi, there is a simpler solution if you are ready to forgo a bit on the performance side and change the job design. Your job would look like this: Dataset -> Sort stage (sort keys: date and Store; options: Create Key Change Column = true; preserve partitioning = set) -> Transformer (run in sequential mode) -> output. In t...
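For example, assuming the Sort stage's default key-change column name keyChange, and assuming you only need the first record of each date/Store group, the Transformer output-link constraint could be as small as this sketch (the input link name is just a placeholder):

Code:

InLink.keyChange = 1

Adapt the link and column names to whatever your Sort stage actually generates.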
by dsxdev
Wed Nov 10, 2004 10:02 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: split city and state data from one field into two fields
Replies: 8
Views: 2236

A simple solution would be to use a Column Export Stage, which can export/split a single column into multiple columns.
But the column to be split should have a format (fixed width or delimited).
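If the value is delimited, say "Boston, MA", another option to sketch (the link and column names here are just placeholders) is a Transformer with one derivation per output column using the BASIC Field() function:

Code:

Field(InLink.CityState, ",", 1)
Trim(Field(InLink.CityState, ",", 2))

The first expression would feed the city column and the second, trimmed, expression the state column; adjust the delimiter to match your data.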
by dsxdev
Wed Nov 10, 2004 9:59 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Run px on windows
Replies: 7
Views: 3935

PX for Windows has not come out yet.
If it does come, it can be installed only on an OS which supports a multiple-CPU architecture.
by dsxdev
Wed Nov 10, 2004 9:56 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Job Sequencer Error
Replies: 7
Views: 3325

Hi
From DataStage 7.1 onward you have breakpoints available in job sequencers, which will do exactly what you need.

If you are working on earlier versions then you have to handle the situation explicitly by coding it.
by dsxdev
Wed Nov 10, 2004 9:52 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Export -- xml vs .dsx format
Replies: 15
Views: 28242

Hi
Though a .xml file is larger than a .dsx file, it is much easier to read and go through. A .xml export of a DataStage job can be easily formatted and is more readable.

It also has the advantage of integrating the code and metadata into some other code for parsing. This is not possible with a .dsx file.
by dsxdev
Tue Nov 02, 2004 9:35 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Error viewing flat file in Px
Replies: 6
Views: 5756

Hi, once you fix DSProjectMapName, check these. This is a warning; first ensure the below, and if the problem still persists then think of DSProjectMapName. I suspect there is some problem in the column PERSON_NO, which is a decimal type but the length is not specified. You specified precision but not scale. Scale is defaul...
by dsxdev
Fri Oct 29, 2004 9:20 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Parallel Transformer stage errors
Replies: 3
Views: 2242

This is a generic error message you get with a return code. If you go down the error messages you'll find the line number where the error is.
by dsxdev
Thu Oct 28, 2004 9:24 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: LEFT OUTER JOINS
Replies: 2
Views: 2066

In a DB2 API Stage you can write your query; if the query works fine at the DB2 prompt it would work fine in the job also. Another solution could be using a Filter Stage in conjunction with a Join Stage. Do a left outer join and send the output of the join to a Filter Stage where the condition would be set as ...
by dsxdev
Thu Oct 28, 2004 9:20 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Input buffer overrun at field " DATE1" at offset 3
Replies: 3
Views: 19343

You can do one thing: read the column as varchar and then handle the date conversion explicitly in a Transformer.
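For instance, if the incoming value were a string like 2004-10-28, a minimal sketch of the Transformer derivation (the link and column names are placeholders, and the format string is an assumption about your data) would be:

Code:

StringToDate(InLink.DATE1, "%yyyy-%mm-%dd")

Change the format string to match whatever actually arrives in the field.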
by dsxdev
Thu Oct 28, 2004 9:19 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Input buffer overrun at field " DATE1" at offset 3
Replies: 3
Views: 19343

The problem is with wrong metadata or the data type: you defined the column type as timestamp but the data you are getting is not a timestamp. When reading the data, DataStage tries to read 19 characters for the timestamp field but it is getting shorter data, so it consumes data from the next column an...