Error in viewing hash data
Moderators: chulett, rschirm, roy
I am having a problem viewing a hash file. I create the hash file by loading data into it in a job. When I try VIEW DATA on the hash file stage I get the following errors:
1: some memory reference error
2: "Error calling subroutine: DSD.BROWSE(Action=3); check datastage is set up correctly in project PROJECT Name"
Then another pop up with
3: "Data source is empty"
But the job compiled successfully and ran fine. I can see the number of rows passed to the hash file, but I can neither view the data nor use the hash file for any other purpose.
Any help in this regard is greatly appreciated.
Thanks,
DW123
-
- Participant
- Posts: 64
- Joined: Fri Jul 16, 2004 7:53 am
Hi,
"Data source is empty" means that data has not moved to the hash file. Have you checked the source? What are you using as a source: a file or a database connection? If it is a file, have you properly specified the delimiter? Check the source data first with the View Data option, which confirms that the source is fine, then run the job. It should definitely work. Hope this helps.
Hello DW123,
did you run the job with the same user ID that you are using for View Data in the Designer? If you are, try going into the Administrator, into that project, and running the command "COUNT <hashfilename>" to see if you can execute the command directly within DataStage. Is the memory fault on the client or the server side?
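For reference, the COUNT check can also be run from the DataStage engine shell on the server. A hedged sketch only; the install path, project name, and hashed file name below are placeholder assumptions, and `dssh` must be started from the project directory:

```shell
# Change to the project directory on the server
# (path is an assumption; check your DSEngine install location)
cd /opt/IBM/InformationServer/Server/Projects/MYPROJECT

# Start the DataStage engine shell
$DSHOME/bin/dssh

# At the TCL prompt, count the records in the hashed file.
# If this succeeds, the file exists and is readable by this user.
>COUNT MyHashFile
```

If COUNT reports the expected row count here but View Data still fails in the Designer, the fault is more likely on the client side or in the browse subroutine than in the file itself.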
-
- Participant
- Posts: 3337
- Joined: Mon Jan 17, 2005 4:49 am
- Location: United Kingdom
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Sending data to a hashed file does not, of itself, create the hashed file. Do you have the "Create" check box checked in the Hashed File stage? If so, there should be a message in the job log indicating that the hashed file was created. Can you see such an entry?
A hashed file must have at least one key column. This will be OK in your job, because the compiler detects lack of a key column.
Unfortunately "some memory error" is too vague to offer a diagnosis; we need the exact error message, not least to determine whether the error is being generated on the server side or on the client side.
The message from DSD.Browse is from the helper subroutine for the data browser, and indicates that it is having a problem in the repository. For example it may not be able to resolve the location of the hashed file, or one of the methods exposed by the Hashed File stage.
You could try re-indexing the repository (DS.REINDEX ALL) but the problem here may not be in one of the indexed files.
Also execute the command UVFIXFILE VOC (either from the Administrator client command window or from within dssh on the server) and let us know whether any problem was reported.
Finally, check the job log for any other error message, and post the exact error message(s) to allow us to diagnose more accurately.
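The repository checks suggested above can be run from the engine shell on the server. A sketch, assuming `dssh` is started from the affected project's directory (run these with care, ideally with users logged out of the project):

```shell
# Start the DataStage engine shell from the project directory
$DSHOME/bin/dssh

# Rebuild all repository indices for the project
>DS.REINDEX ALL

# Check the VOC file (the project's vocabulary/catalog) for problems
# and note anything it reports
>UVFIXFILE VOC
```

Both commands can also be issued from the Administrator client's Command window against the affected project.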
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Thanks for the reply. ArndW wrote: Hello DW123,
did you run the job with the same User-Id that you are using for your view-data in the designer? If you are, try going into the Administrator into that project and doing a command "COUNT <hashfilename>" to see if you can execute the command directly within DataStage. Is the memory fault on the client or the server side?
I have executed the command COUNT <Hashfile> and it ran successfully, returning the number of rows that I loaded into the hash file. So it looks like a memory problem. When I try to view the data through the hash file stage, the first message I get is: "dsapi_slave.exe - application error ..... The instruction at "0x77f95467" referenced memory at "Some Number". The memory could not be read. Click on OK to terminate the program".
It looks like the problem is only with reading the data, since I can write data into the hash file successfully and there are no error messages in the log (the log says the job finished successfully). If nothing works out, I think we need to reinstall the client.
If you have any pointers, please let me know.
Thanks,
Um... sure it does. ray.wurlod wrote: Sending data to a hashed file does not, of itself, create the hashed file. Do you have the "Create" check box checked in the Hashed File stage?
Of course, if you don't choose that option, then you get all of the default creation parameters - Type 30, Minimum Modulus 1, Group size 1, etc, etc - the same thing you get if you just check the option and don't change any of the values under the 'Options' button.
It's a little counter-intuitive, yes, but you check the 'Create File' option if you want the ability to have it deleted and recreated (not just cleared) each job run or if you need to override the defaults. You don't need to check it to simply get the hash file created, in spite of its name.
-craig
"You can never have too many knives" -- Logan Nine Fingers
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
As I understood it, only validating the job or checking the Create check box causes a hashed file to be created. That knowledge is from version 5.2, however; I have not verified it since.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.