Search found 233 matches
- Fri Mar 14, 2008 11:13 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Transformer issue
- Replies: 4
- Views: 1492
Re: Transformer issue
Ok... "i have a job like dataset--->transformer--->flatfile." (sorry, had to get that out of the way) I cannot understand this. Why not? No data in equals no data out. Because the output field is hardcoded (independent of input data), when I run the job it should produce 'H' irrespective ...
- Thu Mar 13, 2008 7:49 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Transformer issue
- Replies: 4
- Views: 1492
Transformer issue
Let's say I have a job like dataset--->transformer--->flatfile. In the Transformer, the input contains all fields from the dataset and the output of the Transformer contains only one field named COL1, for which the derivation is hardcoded to 'H'. (This is a sample job; in this we are not using the data from the dataset.) Now w...
- Tue Mar 11, 2008 5:58 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Import Jobs
- Replies: 11
- Views: 2681
- Tue Mar 11, 2008 11:44 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Problem with ORCHESTRATE
- Replies: 2
- Views: 1025
Problem with ORCHESTRATE
Lookup Table is populated with data even after we truncate the table in Oracle. Following is the user defined SQL in the Lookup table - SELECT Trim(EMPLID) as EMPLID,Trim(PERNR) as PERNR,TRIM(COMPANY_CODE) AS COMPANY_CODE,Trim(RFPRNR) as RFPRNR FROM #$SCE_SSA_SCHEMA#.#JPM_MSTR_PERNR# WHERE Trim(EMPL...
- Tue Mar 11, 2008 11:02 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Import Jobs
- Replies: 11
- Views: 2681
It takes a certain amount of experience to interpret the LIST.READU command output, and it is not worth the time to do this. The damage you can do in a running environment with kill and UNLOCK, using information gleaned from the LIST.READU and PORT.STATUS output, is substantial. In addition to the k...
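For context, the commands being discussed are UniVerse shell commands. A sketch of what they look like at the UniVerse prompt (the user number 1234 here is a placeholder, not a value from this thread):

```
LIST.READU EVERY
UNLOCK USER 1234 ALL
```

LIST.READU EVERY lists record, file, and group locks along with the user numbers holding them; UNLOCK ... ALL forcibly releases all of a user's locks, which, as the post above warns, can cause substantial damage in a running environment.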
- Mon Mar 10, 2008 7:59 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Import Jobs
- Replies: 11
- Views: 2681
Import Jobs
When I tried to import a job through Manager to the TEST environment, I got the error below.
"Cannot get exclusive access to job 'jobname'".
How can I rectify this? Can you please let me know what's happening in the back end?
- Mon Mar 10, 2008 2:11 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Not enough room in ustring for decimal[13,2].
- Replies: 3
- Views: 3299
Not enough room in ustring for decimal[13,2].
char / -00000044215.22
Source and target are both defined as decimal[13,2].
- Mon Mar 10, 2008 12:43 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Not enough room in ustring for decimal[13,2].
- Replies: 3
- Views: 3299
Not enough room in ustring for decimal[13,2].
Project:CN_7-FI_N1_QA (EDRP001) Wodetitd_Wodettrn_T10_ldr: Error when checking operator: When binding input interface field "OVRHD_DISTRIB_XCL_AMT" to field "OVRHD_DISTRIB_XCL_AMT": Implicit conversion from source type "decimal[13,2]" to result type "ustring[max=13...
- Mon Mar 10, 2008 11:51 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Sorting, Partitioning
- Replies: 3
- Views: 1944
Sorting, Partitioning
If the job is flowing with hash partitioning, and I insert a Sort in between, will the hash partitioning flow through to the output of the Sort stage?
Does the Sort stage retain the partitioning done before it, or do I have to partition again after the Sort stage?
- Thu Mar 06, 2008 7:15 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Job control process (pid 2633746) has failed
- Replies: 3
- Views: 1431
- Thu Mar 06, 2008 3:53 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Job control process (pid 2633746) has failed
- Replies: 3
- Views: 1431
Job control process (pid 2633746) has failed
"Job control process (pid 2633746) has failed" is thrown as a warning,
and the job aborts for the above reason. Can anyone help us understand exactly why the job aborts?
- Tue Mar 04, 2008 12:26 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Parallel Jobs not compiling/running
- Replies: 4
- Views: 1414
- Tue Mar 04, 2008 11:56 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Heap Size error in Transformer
- Replies: 13
- Views: 4286
Then please post the actual errors (not the ones from APT_CombinedOperatorController). The score would be used to identify whether DataStage had inserted any tsort or buffer operators. Had it? Thanks for your reply. Below I have pasted the info and errors in the order shown in Director. Also I hav...
- Mon Mar 03, 2008 8:19 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Heap Size error in Transformer
- Replies: 13
- Views: 4286
Capture and inspect the score - the script that is actually executed, rather than the generated osh. I have used dump_score, pm_player_memory, pm_player_timing. The dumped score says 44 processes run on 2 nodes. Heap size was good at the other stages, but when it came to the Transformer, the initial info is Xfm_Fsd...
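For readers searching on this: the reporting switches mentioned above are environment variables. A sketch of enabling them before a run (they can also be set as job parameters or project-wide in the Administrator):

```shell
# PX reporting switches: dump the score and report per-player
# memory use and timing in the job log
export APT_DUMP_SCORE=1
export APT_PM_PLAYER_MEMORY=1
export APT_PM_PLAYER_TIMING=1
```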
- Mon Mar 03, 2008 1:48 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Heap Size error in Transformer
- Replies: 13
- Views: 4286
Disable operator combination so that you can find out where the error is occurring. Help us to help you. We simply cannot (or at least will not) diagnose errors thrown by an arbitrary number of stages combined into one process. I did as you said. I have disabled the combinability property in all...
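Operator combination can also be switched off globally rather than stage by stage via each stage's combinability property. A sketch using the standard environment variable (set it as a job parameter instead if you only want it for one investigation):

```shell
# Run each operator in its own process, so errors are logged
# against a single named stage rather than
# APT_CombinedOperatorController
export APT_DISABLE_COMBINATION=1
```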