Search found 24 matches
- Thu Nov 09, 2006 3:19 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: How to Collect Statistics
- Replies: 6
- Views: 4994
- Thu Nov 09, 2006 3:16 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Errors while loading the sequential files to sql server
- Replies: 8
- Views: 2126
import error
Check whether there are '|' characters embedded in the record. Make sure the record does not contain any '|' inside a field value, i.e. other than the delimiters themselves.
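One way to catch such records before the import is to count the delimiters on each line: any line with more '|' characters than the column count allows has an embedded pipe. A minimal sketch in Python (the field count of 3 is a hypothetical value for illustration; use your file's real column count):

```python
# Flag lines whose '|' count does not match the expected delimiter count.
# EXPECTED_FIELDS is hypothetical; set it to the file's real column count.
EXPECTED_FIELDS = 3

def bad_lines(lines, expected_fields=EXPECTED_FIELDS):
    """Return (line_number, line) pairs with an unexpected '|' count."""
    expected_delims = expected_fields - 1
    return [(n, line) for n, line in enumerate(lines, 1)
            if line.rstrip("\n").count("|") != expected_delims]

sample = ["a|b|c", "a|b|c|d", "x|y|z"]
print(bad_lines(sample))  # the second line carries an extra '|'
```

Any line this reports needs the stray pipe escaped, quoted, or removed before the sequential file will import cleanly.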
- Mon Jul 24, 2006 12:54 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: passing a value in parameter
- Replies: 3
- Views: 1612
passing a value in parameter
Hi, I would like to pass a value at job level. How can I do that? Let me explain: I have a job-level parameter called SEQ_NO, and I set its default value to 1. When I run the job it inserts 20 records, with seq_no reaching 20. But when I run the job again, the SEQ_NO parameter is still 1. It should become (def...
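A parameter's default value never changes by itself between runs; the usual pattern is to persist the last sequence number outside the job and pass the next value in at run time. A small sketch of that idea (the state-file path and the SEQ_NO usage are illustrative, not part of any DataStage API):

```python
# Sketch: persist a sequence number in a small state file and hand the
# next value to the job as its SEQ_NO parameter at invocation time.
from pathlib import Path

def next_seq_no(state_file):
    """Read the last sequence number, increment it, and save it back."""
    path = Path(state_file)
    last = int(path.read_text()) if path.exists() else 0
    path.write_text(str(last + 1))
    return last + 1

# e.g. pass next_seq_no("seq_state.txt") as SEQ_NO when starting the job
```

Each run then sees a fresh value instead of the static default.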
- Mon Jul 24, 2006 8:44 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: xml files
- Replies: 5
- Views: 2002
- Mon Jul 24, 2006 8:30 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: xml files
- Replies: 5
- Views: 2002
xml files
I have n XML files ... and I need to read those XML files one by one (all have the same metadata) and then do some transformation logic. My problem is that in parallel jobs I don't have a Folder stage to do this. How can I implement this in a parallel job? If anyone has an idea about this, it will help me a lot.
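Outside of DataStage, the "read n files with identical metadata one by one" loop can be sketched in a few lines; the transformation step here is a placeholder that only records each file's root tag:

```python
# Sketch: read every XML file matching a pattern, one by one, and apply
# the same transformation to each (they share the same structure).
import glob
import xml.etree.ElementTree as ET

def process_all(pattern):
    """Parse each matching XML file; here we just collect root tag names."""
    results = []
    for name in sorted(glob.glob(pattern)):
        tree = ET.parse(name)
        results.append(tree.getroot().tag)  # replace with real transformation
    return results
```

The same file-pattern idea is what a wildcard on an input stage gives you: one logical read over many physically separate files.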
- Fri Jul 21, 2006 11:15 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: folder stage
- Replies: 1
- Views: 1113
folder stage
Is there any stage in Parallel Extender that can do what the Folder stage does
in Server edition?
I could not find one.
I would appreciate any help on this,
because I want to read a lot of files in a particular folder and write them into a table.
I just want to know how we can do this in a parallel job.
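The underlying task, reading every file in a folder and landing each line as a table row, can be sketched outside DataStage like this (table and column names are illustrative, using an in-memory SQLite database):

```python
# Sketch: read every file in a folder and load each line as a row into a
# table, keeping the source file name alongside the line.
import glob, os, sqlite3

def load_folder(folder, db=":memory:"):
    conn = sqlite3.connect(db)
    conn.execute("CREATE TABLE IF NOT EXISTS lines (fname TEXT, line TEXT)")
    for path in sorted(glob.glob(os.path.join(folder, "*"))):
        with open(path) as f:
            rows = [(os.path.basename(path), l.rstrip("\n")) for l in f]
        conn.executemany("INSERT INTO lines VALUES (?, ?)", rows)
    conn.commit()
    return conn
```

Keeping the file name as a column is a cheap way to preserve lineage when many files feed one table.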
- Wed Jul 12, 2006 9:13 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: change capture
- Replies: 1
- Views: 1240
change capture
Hello, when I am trying to capture changed data I am getting the following warning for each record I pass through, and I need help in this regard: Chg_Cap_ISRT_UPDT: When checking operator: Defaulting "EARNINGTYPEID" in transfer from "beforeRec" to "outputRec" and also giving o...
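That class of warning generally points at a metadata mismatch between the before and after links: a column present on one side is not supplied on the other, so the operator fills it with a default. The comparison the stage performs can be sketched outside DataStage, matching before and after rows on a key and tagging each output row with a change code:

```python
# Sketch of a change-capture comparison: match before/after rows on a key
# and classify each after-row as an insert or an update.
def change_capture(before, after, key):
    """before/after are lists of dicts; returns after-rows tagged with a change code."""
    before_by_key = {row[key]: row for row in before}
    out = []
    for row in after:
        old = before_by_key.get(row[key])
        if old is None:
            out.append({**row, "change_code": "insert"})
        elif old != row:
            out.append({**row, "change_code": "update"})
    return out
```

Note the comparison only behaves when both sides carry the same columns, which is exactly what the "Defaulting ... in transfer" warning is complaining about.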
- Fri May 19, 2006 3:14 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: hash file problem
- Replies: 9
- Views: 2938
- Fri May 19, 2006 3:01 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: hash file problem
- Replies: 9
- Views: 2938
Thanks for all of the tips. I have tried all those options and I am still getting the same problem; I am not able to trace what the problem is. The thing I am confused about is that these are PeopleSoft-delivered jobs. Should the SQL query affect the hash file? I have no clue. Help in this matter is appre...
- Fri May 19, 2006 12:21 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: hash file problem
- Replies: 9
- Views: 2938
hash file problem
Thanks, Kris, for replying. The job runs fine when I keep the DB stage as the lookup, but when I use a hash file instead of the DB stage it does not pick up any records. I copied the same metadata as the DB stage; I don't know what is happening. [quote="kris007"]Is the job running fine or are you getting an...
- Fri May 19, 2006 10:23 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: hash file problem
- Replies: 9
- Views: 2938
hash file problem
Hey, I am using a hash file as a lookup. The problem is that the hash file has the data, but the Transformer is not able to read any records from it. When I use the database lookup instead, the Transformer stage does get data. I don't understand what is happening. The thing is, it ran fine when I used the hash file for the firs...
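A common cause of a keyed lookup returning nothing even though the lookup source "has the data" is a key mismatch: the probe key must match the stored key exactly, character for character and type for type. A small sketch of the effect (the key/value data is hypothetical):

```python
# Sketch: a keyed lookup only matches exactly. Trailing spaces or a type
# difference (int vs str) make every probe miss even though the data is there.
lookup = {"1001": "EARN", "1002": "DED"}   # keys loaded as strings

def probe(key):
    return lookup.get(key)

print(probe("1001"))        # hit
print(probe("1001 "))       # miss: trailing space on the probe key
print(probe(1001))          # miss: int probe against a str key
```

Comparing the key columns' lengths, padding, and data types between the hash file write and the lookup read is usually the first thing to check.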
- Thu Feb 23, 2006 8:32 am
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: integration testing
- Replies: 2
- Views: 1160
integration testing
How do I do integration testing
using control jobs?
Is there any proper way of arranging the jobs to run in control jobs?
Help will be appreciated.
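The core of a control-job pattern is small: run the jobs in dependency order and stop as soon as one fails, so downstream jobs never run against bad data. A sketch of that flow (the job names and the `runner` callable are hypothetical stand-ins for whatever actually launches a job and reports success):

```python
# Sketch of a control-job pattern: run jobs in dependency order and stop
# at the first failure, reporting what completed and what broke.
def run_sequence(jobs, runner):
    """runner(job) returns True on success; stop at the first failure."""
    finished = []
    for job in jobs:
        if not runner(job):
            return finished, job        # what ran, and the job that failed
        finished.append(job)
    return finished, None

ok = lambda job: job != "load_dim"      # pretend load_dim fails
print(run_sequence(["extract", "load_dim", "load_fact"], ok))
# (['extract'], 'load_dim')
```

For integration testing, asserting on both elements of the result (which jobs completed, which one failed) gives you a repeatable check of the control flow itself.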
- Tue Feb 21, 2006 12:36 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: Finding a table name with constraint name
- Replies: 2
- Views: 1246
Finding a table name with constraint name
Hi, I am running into a problem:
it is giving an integrity constraint error.
Does anyone know
how to find the table name for a particular constraint name?
Help will be appreciated.
Thanks in advance.
- Wed Feb 15, 2006 12:24 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: staging
- Replies: 10
- Views: 3639
Naveen, thanks for the information. My source is Siebel and the target is Oracle. I am doing a dump, but I am not able to dump all the records from Siebel: I have 76 records in the DB but DataStage is picking only 68 records and dumping them to Oracle. As far as I know there are no duplicates, and the constraint I used is update_...
- Tue Feb 14, 2006 2:12 pm
- Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
- Topic: staging
- Replies: 10
- Views: 3639
staging
Hey Arnd,
my job log shows 68 records, and the thing is there are no duplicate records to overwrite,
and the key is ROW_ID.
I am not able to understand why this is happening.
Thanks for the reply.