Search found 89 matches

by marpadga18
Tue Aug 07, 2012 5:49 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: replace quote character " " with ~#
Replies: 4
Views: 3004

Hi Ray, I tried this routine but when compiling it I get the following error. Compiling: Source = 'DSU_BP/DSU.ren', Object = 'DSU_BP.O/DSU.ren' 0002 #include "stdio.h" ^ Can't open $INCLUDE file 'DSU_BP/"stdio.h"' Is there anything else I need to do? I don't have much experience in C++. Any...
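If the aim is simply to swap the double-quote character for ~#, a compiled routine may not be needed at all. As a hedged alternative (not the routine from the thread, and in.column is only an illustrative column name): on releases where Ereplace is available in the parallel Transformer (8.5 and later, as far as I recall), a derivation such as

    Ereplace(in.column, '"', '~#')

does the substitution, and on older releases the BASIC Change function, e.g. Change(Arg1, '"', '~#'), can do the same in a server routine or BASIC Transformer.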
by marpadga18
Tue Jul 10, 2012 4:44 am
Forum: General
Topic: Job running fine but in sequence aborts
Replies: 6
Views: 2833

Thanks jwles, we were reading a zip file instead of a txt file.
by marpadga18
Mon Jul 09, 2012 1:46 pm
Forum: General
Topic: Job running fine but in sequence aborts
Replies: 6
Views: 2833

Re: Job running fine but in sequence aborts

samdsx wrote:Job parameters.
I am using job parameters but nothing is wrong there. It runs fine when I run it individually, but in the sequence it gets aborted.
by marpadga18
Mon Jul 09, 2012 1:42 pm
Forum: General
Topic: Job running fine but in sequence aborts
Replies: 6
Views: 2833

Re: Job running fine but in sequence aborts

samdsx wrote:compare variables used by job when run from sequence and run individually and see if there is any difference.
Hi samdsx,

Do you mean user variables or environment variables?
by marpadga18
Mon Jul 09, 2012 1:33 pm
Forum: General
Topic: Job running fine but in sequence aborts
Replies: 6
Views: 2833

Job running fine but in sequence aborts

Hi All, I have a job which runs fine on its own, but when I run the same job from a sequence it gets aborted with this error. Error: Consumed more than 100000 bytes looking for record delimiter; not found. My source file is a fixed-width file. In the format I gave Record Delimiter String = DOS Fo...
by marpadga18
Fri Jul 06, 2012 2:55 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: timestamp to integer
Replies: 13
Views: 5277

chulett wrote:Therefore, the syntax is incorrect. :D
Hi chulett, thanks for correcting me. What you said is right.
Thanks, all.
by marpadga18
Thu Jul 05, 2012 2:45 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: timestamp to integer
Replies: 13
Views: 5277

Hi chulett, I tried this syntax in the transformer:

in.column[1,4] : in.column[6,7] : in.column[9,10]

When I validated it, the syntax is reported as valid and I could not find any syntax errors. If you could correct me, that would be great.

Thanks, M
by marpadga18
Thu Jul 05, 2012 2:15 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: timestamp to integer
Replies: 13
Views: 5277

Hi chulett,

Is there any way to do this, chulett? I am trying to remove the "-" so that it becomes an integer, but could you tell me how to achieve it?
Thanks, M
by marpadga18
Thu Jul 05, 2012 1:32 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: timestamp to integer
Replies: 13
Views: 5277

timestamp to integer

Hi, I have a source column that is a timestamp and my target column is an integer. The column value is 2011-07-29 00:00:00.000. In the transformer I am doing in.column[1,4] : in.column[6,7] : in.column[9,10] but it is giving me 0 in the output. I did this before and it worked long back, but now it is not working. Could you tell me what is th...
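A possible cause, offered as a hedged sketch rather than the answer from the thread: if in.column is genuinely a timestamp data type and not a string, the [start,length] substring operator does not apply to it directly, so the value would need to be converted to a string first, for example

    TimestampToString(in.column, "%yyyy%mm%dd")

which would yield "20110729". The format string and the final assignment to the integer target column (relying on the default string-to-integer conversion) are assumptions about the job, not something stated in the posts.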
by marpadga18
Thu Jul 05, 2012 7:45 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: sequential file reading with ~# delimiter
Replies: 6
Views: 2896

abc~#asdf~#hdsajds

These are the column names; how do I import the sequential file table definition? I tried with sequential file table definitions but it is not working.

Any ideas would be really helpful.
by marpadga18
Thu Jul 05, 2012 7:28 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: sequential file reading with ~# delimiter
Replies: 6
Views: 2896

sequential file reading with ~# delimiter

Hi
my data is like this


abc~#asdf~#hdsajds
The delimiter is ~#. How do we read this? I tried giving the delimiter as ~#, but it is reading the # as well; how can I avoid this?

Thanks
M
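One way this could be handled, sketched under assumptions rather than taken from the thread: since ~# is a two-character delimiter, read each row as a single VarChar column (called in.row here, an illustrative name only) and split it in a Transformer with Field():

    Field(in.row, "~#", 1)   gives abc
    Field(in.row, "~#", 2)   gives asdf
    Field(in.row, "~#", 3)   gives hdsajds

Alternatively, the Sequential File stage format has a field-level Delimiter string property that accepts a multi-character value such as ~#; whether that fits here depends on the rest of the file layout.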
by marpadga18
Thu May 03, 2012 7:01 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ODBC Connector record count issue
Replies: 8
Views: 7195

Hi Williams, thanks for your reply! Unfortunately we cannot touch the triggers on that table, but the workaround is keeping Record count at 1; then it loads the full data. We don't have access to disable the triggers, so I am loading now with Record count 1. I will investigate this and if I get anything...
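For reference, the workaround described above amounts to a single connector property change; the Session grouping shown is an assumption about where the property sits in the ODBC Connector UI, not something spelled out in the posts:

    ODBC Connector > Properties > Session
        Record count = 1    (larger values were silently dropping rows, given the triggers on the target table and the absence of a reject link)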
by marpadga18
Thu May 03, 2012 6:56 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ODBC Connector record count issue
Replies: 8
Views: 7195

chulett wrote:I'm sorry... you are using this "record count > 1" where? :?
If I give Record count as 1 it runs fine with the ODBC Connector stage, but if I give 2 or more it loads partial data...
by marpadga18
Thu May 03, 2012 2:43 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ODBC Connector record count issue
Replies: 8
Views: 7195

Re: ODBC Connector record count issue

I have triggers on the table but I don't have a reject link; it is dropping records in the target if I give Record count > 1.
by marpadga18
Wed May 02, 2012 8:57 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: ODBC Connector record count issue
Replies: 8
Views: 7195

ODBC Connector record count issue

Hi, I am facing a problem while loading data from a SQL Server source to a SQL Server target, using the ODBC Connector stage. If I set Record count to more than 1 it does not load the full data, it loads only some records, but if I set Record count to 1 it loads the full data. I am not able to understand a...