Search found 76 matches

by georgesebastian
Fri Apr 27, 2007 12:06 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Error while Hash file creation
Replies: 6
Views: 4413

Re: Error while Hash file creation

Hi All, when we try to create hash files, the job is aborting with the following error message: "GSAP_LOOKUP_CreateLkpForRegJobs..XrefSctyOvrdHash.ToBuildSctyOvrdHash: DSD.UVOpen The system cannot move the file to a different disk drive. (17): G:\Ascential\DataStage\Projects\DEV_ZGP0GSAS\mk...
by georgesebastian
Tue Apr 24, 2007 5:49 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Error while running job
Replies: 3
Views: 781

Re: Error while running job

Hi, my job layout is Source -> Transformer -> Target. I have a 1:1 mapping for most columns (from source to target) and for the other columns I am passing a default value (empty string for Varchar/Char and 0 for int and dec). Both source and target column lengths are proper (in most cases both are the same and in some cas...
by georgesebastian
Tue Apr 24, 2007 3:27 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Reference Link with multi row result set
Replies: 16
Views: 4728

nick.bond wrote:I'm sure this has been covered before - have a search for it......
Hi Nick,

I searched for it,
but couldn't find much useful information regarding this one.
If you know of any specific thread, can you please post it?

thanks
George
by georgesebastian
Tue Apr 24, 2007 2:19 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Reference Link with multi row result set
Replies: 16
Views: 4728

Reference Link with multi row result set

Hi Everyone,

Reference link with multi row result set - can anyone tell me how to set this up in server jobs?

Thanks
George
by georgesebastian
Mon Apr 23, 2007 10:28 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: Where did I go wrong?
Replies: 20
Views: 5963

Re: Where did I go wrong?

Hi, I know this issue has been talked about a lot. I haven't had to do this in a long time. The issue is range lookups, and let me share what I have done so far. MyRefJob ----------- This hashed file job creates a hashed file. Before populating the hashed file, I created the indexes SEPARATELY on...
by georgesebastian
Sun Apr 22, 2007 11:24 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: How to achieve this output?
Replies: 1
Views: 569

How to achieve this output?

Hi everyone, I have a scenario: the input sequential file has 1000 rows and I have 3 output files. I need the output like this ... the 1st output file should contain input file rows 1-10, 31-40, 61-70, etc.; the 2nd output file should contain input file rows 11-20, 41-50, 71-80, etc.; the 3rd output file should contain (input fi...
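The block-cycling rule above can be sketched outside DataStage. In a server job the equivalent test would go into each output link's constraint (something like `Mod(Int((@INROWNUM - 1) / 10), 3) = 0` for the first file; the exact constraint expression here is an assumption, though `@INROWNUM` is the standard system variable). A minimal Python sketch of the routing logic:

```python
def target_file_index(row_number, block_size=10, num_files=3):
    """Route a 1-based row number to an output file (0-based),
    cycling through the files in blocks of block_size rows:
    rows 1-10 -> file 0, 11-20 -> file 1, 21-30 -> file 2,
    31-40 -> file 0 again, and so on."""
    return ((row_number - 1) // block_size) % num_files

# Rows 1-10, 31-40, and 61-70 all land in the first file (index 0).
print([target_file_index(r) for r in (1, 10, 11, 31, 61, 70)])  # → [0, 0, 1, 0, 0, 0]
```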
by georgesebastian
Sun Apr 22, 2007 9:59 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Creating Rownum
Replies: 6
Views: 2000

Hi Pravin,

I think you need to sort the data first and then use stage variables here, as well as the SDK routine RowProcCompareWithPreviousValue.

Thanks
George
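The sort-then-compare-with-previous pattern George suggests can be illustrated outside DataStage. This is a minimal Python sketch of generating a row number that restarts whenever the key changes (what stage variables do in a transformer); the function and field names are illustrative, not from the original thread:

```python
def add_rownum(rows, key):
    """Given rows already sorted on `key`, attach a 1-based sequence
    number that resets whenever the key value changes, by comparing
    each row's key with the previous row's key."""
    numbered, prev_key, seq = [], object(), 0
    for row in rows:
        seq = seq + 1 if row[key] == prev_key else 1
        prev_key = row[key]
        numbered.append({**row, "rownum": seq})
    return numbered

rows = [{"dept": "A"}, {"dept": "A"}, {"dept": "B"}]
print([r["rownum"] for r in add_rownum(rows, "dept")])  # → [1, 2, 1]
```

The sort matters: if the data is not grouped on the key, equal keys will not be adjacent and the counter resets too often.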
by georgesebastian
Thu Apr 19, 2007 5:59 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Transformer without an input link?
Replies: 8
Views: 2597

Transformer without an input link?

Hi Everyone,

Will a transformer work without an input link?
I know one way, using a stage variable and a constraint
like @INROWNUM < 1.

I've heard there are other ways; can anyone tell me?
by georgesebastian
Sat Apr 14, 2007 4:17 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: How to Split Row into columns through Data Stage?
Replies: 6
Views: 5719

Why? A Pivot stage will do exactly what you require. With a hashed file, the technique for a horizontal pivot is to load the non-pivoting columns in a value mark delimited dynamic array and to U ... Sorry Ray, I am not a premium member so I was not able to read your posting fully. Sure I will be a ...
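The horizontal-pivot technique in Ray's (truncated) quote, collecting all pivoting values for a key into one multivalued field, can be sketched in Python. Here a plain list stands in for the value-mark-delimited dynamic array a hashed file would use; the field names are illustrative:

```python
from collections import defaultdict

def horizontal_pivot(rows, key_field, value_field):
    """Collapse many rows per key into one row per key, gathering the
    pivoting values into a list (a hashed file would hold them in a
    single value-mark-delimited dynamic array instead)."""
    grouped = defaultdict(list)
    for row in rows:
        grouped[row[key_field]].append(row[value_field])
    return [{key_field: k, "values": v} for k, v in grouped.items()]

rows = [{"id": 1, "val": "a"}, {"id": 1, "val": "b"}, {"id": 2, "val": "c"}]
print(horizontal_pivot(rows, "id", "val"))
# → [{'id': 1, 'values': ['a', 'b']}, {'id': 2, 'values': ['c']}]
```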
by georgesebastian
Fri Apr 13, 2007 10:50 pm
Forum: IBM® Infosphere DataStage Server Edition
Topic: "Error selecting from log file RT_LOG862"
Replies: 9
Views: 7409

ray.wurlod wrote:And so can you mark the thread as resolved?
Hi Ray,

I don't think I can mark this as resolved since I am not the one who opened the thread. I think only MORIS can do this.


Thanks
George
by georgesebastian
Fri Apr 13, 2007 6:25 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: "Error selecting from log file RT_LOG862"
Replies: 9
Views: 7409

What happened when you ran Exact Match searches for DS.TOOLS and DS.CHECKER? I got quite a number of hits which answered both questions. Please give that a shot and then let us know if you have any specific questions on either subject. Sorry, it was my fault. When I tried searching both individuall...
by georgesebastian
Fri Apr 13, 2007 3:49 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: How to Split Row into columns through Data Stage?
Replies: 6
Views: 5719

Welcome, there are several ways to do that depending on the kind of source file, for example a Pivot stage or a Hash Table or ... Start searching in the forum and if you don't find an answer to your problem, please let us know. Hi Wolfgang, by Hash Table do you mean a hashed file? If so, how can it do...
by georgesebastian
Thu Apr 12, 2007 12:19 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Unable to open designer from the director
Replies: 2
Views: 2008

Hi Prasanna ,

That seems to be a weird error.
Can you open the Director through the Designer?

Thanks
George
by georgesebastian
Mon Apr 02, 2007 12:01 am
Forum: IBM® Infosphere DataStage Server Edition
Topic: Which is faster?
Replies: 9
Views: 2470

Hi Everyone,

That really was very informative. Thanks, all. This is really a fantastic forum.
I love this.

Thanks
George
:D