Hi,
How do I find out which job creates a particular hash file?
Thanks.
Search found 102 matches
- Wed May 16, 2007 8:21 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: hash file
- Replies: 1
- Views: 547
- Wed Apr 18, 2007 2:06 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: look up
- Replies: 8
- Views: 2049
look up
In my job design I have one source (hash file) and one lookup (hash file). When I find the value A1 in the lookup, I need to concatenate the value coming from the source with 1 and hook that value. When I find the value A2 in the lookup, I need to concatenate the value coming from the source with 2 and hook ...
- Fri Apr 06, 2007 9:46 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: trim
- Replies: 9
- Views: 2428
Re: trim
I am getting data from a mainframe and writing it to a sequential file. When I write data to the seq file I get some spaces at the end, so I used TrimF(TrimB(FIELD)) but I still have the same problem. I tried all the options with Trim. Example: 213107850. Any ideas? Thanks in advance. Only for rea...
- Fri Apr 06, 2007 9:39 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: trim
- Replies: 9
- Views: 2428
Re: trim
I am getting data from a mainframe and writing it to a sequential file. When I write data to the seq file I get some spaces at the end, so I used TrimF(TrimB(FIELD)) but I still have the same problem. I tried all the options with Trim. Example: 213107850. Any ideas? Thanks in advance. There are sp...
- Fri Apr 06, 2007 9:27 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: trim
- Replies: 9
- Views: 2428
Re: trim
I am getting data from a mainframe and writing it to a sequential file. When I write data to the seq file I get some spaces at the end, so I used TrimF(TrimB(FIELD)) but I still have the same problem. I tried all the options with Trim. Example: 213107850. Any ideas? Thanks in advance. There are bo...
- Fri Apr 06, 2007 9:20 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: trim
- Replies: 9
- Views: 2428
Re: trim
I am getting data from a mainframe and writing it to a sequential file. When I write data to the seq file I get some spaces at the end, so I used TrimF(TrimB(FIELD)) but I still have the same problem. I tried all the options with Trim. Example: 213107850. Any ideas? Thanks in advance. SQL type is ...
- Fri Apr 06, 2007 9:14 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: trim
- Replies: 9
- Views: 2428
trim
I am getting data from a mainframe and writing it to a sequential file.
When I write data to the seq file I get some spaces at the end.
So I used TrimF(TrimB(FIELD)) but I still have the same problem.
I tried all the options with Trim.
Example:
213107850
Any ideas?
Thanks in advance.
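Trailing "spaces" in a mainframe extract are often not ordinary spaces at all: tabs, NULs, or non-breaking spaces survive a plain space-trim. A quick way to check what the trailing bytes actually are, sketched in Python rather than DataStage BASIC (the sample value is hypothetical):

```python
raw = "213107850 \t\x00\x00"        # hypothetical value with mixed trailing padding

# repr() reveals tabs/NULs that look like spaces on screen.
print(repr(raw))

# Strip spaces plus other common padding characters, not just " ".
cleaned = raw.rstrip(" \t\x00\xa0")
print(repr(cleaned))                # -> '213107850'
```

If the trailing bytes really are plain spaces, also check whether the column is defined as a fixed-width Char type: fixed-width metadata pads the value back out after the trim, and changing it to VarChar avoids that.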
- Thu Jan 04, 2007 3:15 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: warning
- Replies: 3
- Views: 1050
warning
Hi, when I perform a lookup I get this error. The job design is odbc | odbc----->lkp------>target. I am getting the following error: LKP,0: When binding input interface field "xyz" to field "xyz": Converting nullable source to non-nullable result; fatal runtime error coul...
- Wed Jan 03, 2007 3:31 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: No nodes with disk in "export"
- Replies: 2
- Views: 1681
No nodes with disk in "export"
HI,
When I write data to a file set, I get the following warning:
"When checking operator: No nodes with disk in "export" resource pool; using default disk pool instead."
Any idea how to solve this?
Thanks in advance.
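The warning usually means that no node in the parallel configuration file declares a disk resource belonging to the "export" pool, so the File Set falls back to the default pool. A sketch of a configuration entry that adds one (host name and paths are hypothetical; the key part is listing "export" among the disk resource's pools):

```
{
  node "node1"
  {
    fastname "etlhost"
    pools ""
    resource disk "/data/datasets" {pools "" "export"}
    resource scratchdisk "/data/scratch" {pools ""}
  }
}
```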
- Fri Dec 29, 2006 5:37 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Exporting nullable field
- Replies: 2
- Views: 1529
Exporting nullable field
HI,
When I try to write data to a sequential file I am getting warnings:
"Exporting nullable field without null handling properties"
Any suggestions?
Thanks.
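The warning means nullable columns reach the Sequential File stage with no instruction for how a NULL should be written to the flat file; the usual fix is to set a null field value in the stage's format properties. Conceptually that property does something like this hypothetical Python sketch:

```python
def null_to_value(row, default=""):
    """Substitute a printable default for NULL (None) before export,
    which is what a 'null field value' property does for you."""
    return [default if col is None else col for col in row]

print(null_to_value(["abc", None, "xyz"]))  # -> ['abc', '', 'xyz']
```

Picking an explicit representation (empty string, "NULL", etc.) is what silences the warning, because the export no longer has to guess.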
- Wed Dec 20, 2006 10:32 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: select privileges
- Replies: 7
- Views: 3343
select privileges
Hi,
When I use an Oracle Enterprise stage I get the following error:
"Access to sys.dba_extents required but not available. Please see your DBA for select privileges."
How do I resolve this issue?
Thanks in advance.
- Thu Nov 30, 2006 1:29 pm
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: xls
- Replies: 1
- Views: 627
xls
Hi,
The source data contains 007, and when I write it to an xls file I get the same value.
But when I mail the file as an attachment and open the attachment, I can see only 7, not 007.
I want to retain the leading 00.
Any ideas?
Thanks.
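Excel re-parses each cell when it opens the file, and a bare 007 is read as the number 7, which drops the leading zeros; a file written from a job with an .xls name is often really a CSV underneath. One common workaround is to emit the value as the Excel text formula ="007" so Excel keeps it as text. A minimal Python sketch (the out.csv name is hypothetical):

```python
import csv

# Wrapping the value as ="007" makes Excel treat it as text, preserving zeros.
rows = [["code"], ['="007"']]

with open("out.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```

The csv writer quotes the field for you (it contains quote characters), which is the form Excel expects.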