The input sequential file has both insert and delete records. These records are of variable length, and the fields are pipe (|) delimited. The first field of each record indicates whether it is a delete record or an insert/update record. I know how to separate the records using an awk script, and I don't want t...
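A minimal sketch of the awk split described above. The actual record-type flag values are not shown in the post, so "D" (delete) and "I" (insert/update) here are assumptions:

```shell
# Sample data: "D"/"I" record-type flags are assumptions, not from the post.
printf 'D|1001|old row\nI|1002|new row\nI|1003|changed row\n' > input.dat

# Route each record to its own file based on the first pipe-delimited field.
awk -F'|' '{
    if ($1 == "D")
        print > "deletes.dat"
    else
        print > "inserts.dat"
}' input.dat
```

Because awk never reparses the record, variable-length records and any number of trailing fields pass through unchanged.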
Maybe, but this is how I'm running the jobs from a Unix shell: dsjob -file ${file} ${servername} -run -warn ${WARNING_LIMIT} -jobstatus -local ${PROJECT_NAME} ${JOB_NAME} In my previous post I forgot to mention the -local option. The -local option is required if your job needs to take environme...
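For reference, that invocation can be wrapped in a small dry-run script. This sketch only builds and prints the command line (no DataStage engine is needed), and every value is a hypothetical placeholder:

```shell
# Dry-run sketch of the dsjob call from the post: the command is echoed,
# not executed. All values below are hypothetical placeholders.
file=/path/to/credfile        # hypothetical -file argument (server credentials)
servername=etlhost            # hypothetical engine host
WARNING_LIMIT=50
PROJECT_NAME=myproject
JOB_NAME=myjob

cmd="dsjob -file ${file} ${servername} -run -warn ${WARNING_LIMIT} -jobstatus -local ${PROJECT_NAME} ${JOB_NAME}"
echo "$cmd"
```

The -jobstatus option makes dsjob wait for the job to finish and set its exit status from the job's finishing status, which is what lets a calling shell script react to warnings or aborts.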
Hi Kris, Thanks for the quick reply. Here is the requirement/process of the job. We have an input file which contains new inserts and updates to an existing database. I do a join between the source file and the target DB and create a hashed file. Then the hashed file is either inserted with new records or updated with th...
Hi,
Is there a way to modify the hashed file key? When I tried to do this using a transformer, it inserted a new record into the hashed file with the new key.
Btw, I'm creating the static hashed file from the command prompt using the command
mkdbfile filename 2 modulo 8 -32BIT
Hi, I have a simple parallel job which reads from a sequential file and loads into an Oracle table. The job is aborting with the following errors. Project: cvid (etlprd2) Job name: CvBT0002Job5SsaStageLoad_parllel.2 Event #: 433 Timestamp: 3/8/2006 1:00:10 PM Event type: Warning User: cvid Message: Oracle_Ent...
We have a server job which reads records from a sequential file and loads the good records (the transformer does the validations) into an Oracle table. Apart from loading into the Oracle table, we also write all the good records (1015 bytes each) into another sequential file. This sequential file (good records) is used ...
Ray, Thanks for your suggestions. The reason for going to a static hashed file is my source file: it varies in size (from about 8,000 records to 5,500,000), and the record size is 900 bytes. For the 5.5 million record file I need to create a 64-bit hashed file, as the size of the hashed file will be ...
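The need for a 64-bit file falls straight out of the arithmetic: a 32-bit hashed file tops out at 2 GB, and 5.5 million records at roughly 900 bytes each exceed that before any group or header overhead is even counted. A quick check:

```shell
# Rough data volume for the worst-case file (per-record overhead ignored).
records=5500000
record_bytes=900
total=$((records * record_bytes))    # raw data bytes
limit=$((2 * 1024 * 1024 * 1024))    # 2 GiB 32-bit hashed file limit
echo "data ~ ${total} bytes; 32-bit limit ${limit} bytes"
if [ "$total" -gt "$limit" ]; then
    echo "needs a 64-bit hashed file"
fi
```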
Arnd, Thanks for your quick reply. Btw, sometimes my key (telephone number) has alphabetic characters in the middle; I think this should not be a problem for using Type 2. I have two more questions. First, can I still use Type 2 for 64-bit hashed files as well? My other question is about finding the modulo for creat...
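On the modulo question, one common rule of thumb (a generic sizing sketch, not something stated in the post) is: minimum modulus ≈ total data bytes ÷ bytes per group, where a group is separation × 512 bytes, rounded up and then usually adjusted upward (conventionally to a prime) so the groups aren't created completely full. As shell arithmetic:

```shell
# Estimate a starting modulus for a static hashed file.
# Assumptions (not from the post): separation 4 => 2048-byte groups,
# plus ~25% headroom so groups are not created packed solid.
records=5500000
record_bytes=900
separation=4
group_bytes=$((separation * 512))
data_bytes=$((records * record_bytes))
# Divide and round up, then add headroom.
modulus=$(( (data_bytes + group_bytes - 1) / group_bytes ))
modulus=$(( modulus * 10 / 8 ))
echo "suggested starting modulus: ${modulus}"
```

The result is only a starting point; the final figure is normally nudged to the next prime and then validated against the actual file with the engine's file-analysis tools.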
We have been using dynamic hashed files (default Type 30) as part of our DataStage jobs. Until now this did not create any issues, but now we are expecting huge volumes of data to be processed, and the current dynamic files do not seem to meet the requirement. So we decided to go with static hashed files, and the results h...