
Decryption Issue

Posted: Fri Feb 13, 2015 10:39 am
by Nagac
Hi

I need to decrypt source files before loading them into the database using DataStage. For this I have a batch script that decrypts the files and makes them ready to process.

I am calling the script via a Command Execute Activity in DataStage, which gives the error "No secret key". The same command executes successfully when I run it manually on the server at a command prompt, using the same user ID that DataStage runs under.

For testing purposes I created a separate Sequence with the command below and executed it:

Code: Select all

GPG --batch --passphrase-file I:\app\keyphrase\PassPhrase.txt --output J:\dw\out\01_20_200_0000_02012015_003005.csv --decrypt J:\dw\in\01_20_200_0000_02012015_003005.csv.gpg
Output:
Output from command ====>
gpg: encrypted with RSA key, ID 00000000
gpg: decryption failed: No secret key

Posted: Fri Feb 13, 2015 2:20 pm
by qt_ky
It is possible that your job actually executes under some other user account (you could add some sort of whoami command to verify), and/or it is likely that the run-time execution environment has different settings than your command-line session. Start doing some comparisons there.
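For example (just a sketch, nothing DataStage-specific assumed), you could temporarily swap the real gpg call for something like this in the Command Execute Activity and read the result in the job log:

Code: Select all

whoami && set && gpg --list-secret-keys
whoami and set show which account and environment the activity actually runs under, and gpg --list-secret-keys shows whether that account's keyring contains the secret key at all.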

Posted: Fri Feb 13, 2015 2:38 pm
by ray.wurlod
Do you get a different result when running the job from dsjob versus running the job from a DataStage client?

Posted: Fri Feb 13, 2015 4:39 pm
by Nagac
Hi Ray

I don't see any difference in the result between dsjob and client tool execution.

Hi qt_ky
I checked the user ID by running (id) in a before-job routine and it shows the expected user, and at the command level the same command works fine.

What kind of comparisons can we do here?

Posted: Fri Feb 13, 2015 5:14 pm
by PaulVL
Does the command line version work when your current working directory is the project directory (as it would be in a Command Execute Activity)?

Echo your env settings before doing that GPG call. Compare that with your shell settings when running manually. Maybe $PATH or some other odd setting is messing with you.


Is HOME environment variable the same for both runs?
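Something along these lines (C:\temp is just an example location, use whatever scratch directory you have):

Code: Select all

rem run this via the Command Execute Activity
set > C:\temp\env_from_job.txt
rem run this manually at the command prompt with the same user
set > C:\temp\env_from_cmdline.txt
rem then compare the two dumps (fc is native Windows; MKS diff works too)
fc C:\temp\env_from_job.txt C:\temp\env_from_cmdline.txt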

Posted: Fri Feb 13, 2015 7:07 pm
by Nagac
Yes Paul, the command worked from the project home directory.

I can see some differences in the environment settings.
Job Env:
TEMP=C:\Windows\system32\config\systemprofile\AppData\Local\Temp
TERM=nutc
TERMCAP=C:\PROGRA~2\MKSTOO~1\etc\termcap
TERMINFO=C:\PROGRA~2\MKSTOO~1\usr\lib\terminfo
TMP=C:\Windows\system32\config\systemprofile\AppData\Local\Temp
UNIVERSE_CONTROLLING_TERM=1
UNIVERSE_PARENT_PROCESS=11192
USERDOMAIN=GLOBALEXT
USERNAME=<SERVERNAME>$
USERPROFILE=C:\Windows\system32\config\systemprofile
Command Line Env:
TEMP=C:\Users\DWDATA~1.SV\AppData\Local\Temp
TERM=nutc
TERMCAP=C:\PROGRA~2\MKSTOO~1\etc\termcap
TERMINFO=C:\PROGRA~2\MKSTOO~1\usr\lib\terminfo
TMP=C:\Users\DWDATA~1.SV\AppData\Local\Temp
USERDNSDOMAIN=GLOBAL.COMP.NET
USERDOMAIN=GLOBAL
USERNAME=dsuser
USERPROFILE=C:\Users\dsuser
Where do we change the domain in DataStage?

Posted: Mon Feb 16, 2015 6:14 am
by Nagac
Thanks Guys,

I've fixed this issue by changing the user ID that runs the dsrpcd service.
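For anyone who hits the same thing, this is roughly what I did (sketch only; the service name on your install may differ, so check the Services console first):

Code: Select all

rem see which account the DataStage RPC service currently logs on as
sc qc dsrpc
rem change the log-on account (or use services.msc -> Log On tab), then restart
sc config dsrpc obj= "GLOBAL\dsuser" password= "********"
net stop dsrpc
net start dsrpc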

Thanks

Posted: Mon Feb 16, 2015 8:55 am
by chulett
Can you explain what exactly that means? Changed it from what to what?

Posted: Thu Feb 19, 2015 4:15 pm
by Nagac
dsrpcd was running under the Local System account. I changed it to a specific user ID that has full access, and I configured the gpg keys for that user.

It works fine when I run a DataStage job manually, but it throws the same error when I schedule the DataStage job. I have configured the same user on the Administrator schedule tab.

Do I need to configure anything else for the schedule?
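For reference, another option I may look at is pointing gpg at a fixed keyring directory with --homedir, so the secret key lookup does not depend on the profile of whichever account runs the scheduled job. A sketch only (I:\app\gnupg is a placeholder directory where the keys would be imported, and the scheduling account still needs read access to it):

Code: Select all

GPG --homedir I:\app\gnupg --batch --passphrase-file I:\app\keyphrase\PassPhrase.txt --output J:\dw\out\01_20_200_0000_02012015_003005.csv --decrypt J:\dw\in\01_20_200_0000_02012015_003005.csv.gpg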