
Identify the userid used to execute jobs

Posted: Fri Oct 17, 2014 1:47 am
by Nagac
Hi

Is there any way to find the user name that is used to execute a DataStage job/sequence?

I am trying to execute a batch script through a DataStage sequence. It was successful and there were no errors. I thought it might be an access-related issue. The script works when I execute it in cmd mode, and the script is located on the server where DataStage is set up.

Re: Identify the userid used to execute jobs

Posted: Fri Oct 17, 2014 5:53 am
by chulett
Nagac wrote: It was successful and there were no errors.
I'm assuming you forgot to add "however, it doesn't do what it was supposed to do", yes? :wink:

Don't worry about the userid yet. First suggestion: don't make any assumptions about the environment the script will run in. When you run it from the command line, everything is probably set up and you are in 'the right place', but that won't necessarily be the case from inside a job. If you are using relative paths, be aware that the current working directory will be the project the executing job lives in. Source the dsenv file in your script if you are on UNIX. Use full pathnames, or explicitly 'cd' to where you need to be to do whatever it is you are doing. That should help tremendously. If it doesn't, consider posting your script here so we can provide much more specific help.
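
A minimal sketch of that advice for a Windows batch script (the log path here is illustrative, not from any actual job): make the script independent of the caller's working directory and record who and where it is running, so job-side failures are easier to diagnose.

Code: Select all

@echo off
:: %~dp0 expands to the drive and directory this script lives in, so the
:: script no longer depends on the caller's current working directory.
cd /d "%~dp0"
:: Record the executing user and environment for debugging.
whoami > C:\temp\ds_script_debug.log
set >> C:\temp\ds_script_debug.log
echo CWD is %CD% >> C:\temp\ds_script_debug.log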

Posted: Tue Dec 16, 2014 7:27 am
by Nagac
Thanks Craig,

Below is the batch script I am running through the DataStage job, which gives an error. I have configured AWS keys for the dsadmin user; however, I get an error saying "Unable to locate credentials".

The script works as expected when I run it manually on the server.

Code: Select all

@echo off
SETLOCAL enabledelayedexpansion
::########################################################################################################
::Script Name: INTL_CRM_S3Bucket_Inbound.bat
::Purpose: This script moves all files in the S3 bucket to the inbound directory. It removes all files
::already in the inbound directory before the new files are moved into it.
::It also creates log files in the log directory.

::########################################################################################################
::Date						Author						Comments
:: Usage: INTL_CRM_S3Bucket_Inbound.bat <Files_BasePath> <Project_Plus_BasePath> <Region> <Feed_Type>
:: Example: INTL_CRM_S3Bucket_Inbound.bat K:\Transfer\dw K:\Project_Plus\app\dw europe fin


::########################################################################################################

:: ############### Assigning the Directories to Variables######################
SET "HOME_PATH=%1"
SET "HOME_PATH1=%2"
SET "REGION=%3"
SET "DOMAIN=%4"
SET "INBD_PATH=%HOME_PATH%\%REGION%\%DOMAIN%\inbound"
SET "DECR_PATH=%HOME_PATH%\%REGION%\%DOMAIN%\decryption"
SET "LOG_PATH=%HOME_PATH%\%REGION%\%DOMAIN%\log"
SET "SCRIPT_PATH=%HOME_PATH1%\%REGION%\scripts\batch"
SET "FILES_PATH=%HOME_PATH%\%REGION%\%DOMAIN%\file\"
SET "PHRASE_PATH=%HOME_PATH1%\keyphrase"
SET "S3_BUCKET=s3://ins_cbi-inbound/%REGION%/%DOMAIN%"
:: ################ Creating Log File Name #######################################
SET DDATE=%date:~10%%date:~4,2%%date:~7,2%
SET DTIME=%time: =0%
SET DTIME=%DTIME:~0,2%%DTIME:~3,2%%DTIME:~6,2%
SET LOGFNAME=%LOG_PATH%\%~n0_%DDATE%_%DTIME%.log
:: ############### Start Actual Process ########################################
DEL %INBD_PATH%\* /Q
IF %ERRORLEVEL% neq 0 (
	ECHO %date:~4% %time% PreDeletion: failed to delete files in %INBD_PATH% >> %LOGFNAME%
	EXIT 1
	) ELSE (
	ECHO %date:~4% %time% PreDeletion: deleted files in %INBD_PATH% >> %LOGFNAME%
	)

:: ################ Moving the files from the S3 bucket to the inbound folder ######################
aws s3 mv %S3_BUCKET% %INBD_PATH% --recursive --acl public-read
IF %ERRORLEVEL% neq 0 (
	ECHO %date:~4% %time% Move: failed to move files from %S3_BUCKET% to %INBD_PATH% >> %LOGFNAME%
	EXIT 1
	) ELSE (
	ECHO %date:~4% %time% Move: moved files from %S3_BUCKET% to %INBD_PATH% >> %LOGFNAME%
	)

:: ################ Checking that files landed in the inbound directory ######################
SET COUNT=0
FOR %%A in (%INBD_PATH%\*.gz) DO SET /a COUNT+=1
IF !COUNT! LEQ 0 (
	ECHO %date:~4% %time% Files Exist: no files found in %INBD_PATH% after the move from %S3_BUCKET% >> %LOGFNAME%
	EXIT 1
	) ELSE (
	ECHO %date:~4% %time% Files Exist: %COUNT% files found in %INBD_PATH% after the move from %S3_BUCKET% >> %LOGFNAME%
	)

Posted: Tue Dec 16, 2014 6:00 pm
by pandeesh
Nagac wrote: I get an error saying "Unable to locate credentials".

The script works as expected when I run it manually on the server.
It looks like a variable-interpolation problem in the part below:

Code: Select all

s3://ins_cbi-inbound/%REGION%/%DOMAIN%
Please check whether escaping the slashes helps. I mean:

Code: Select all

s3:\/\/ins_cbi-inbound\/%REGION%\/%DOMAIN%

Posted: Thu Dec 18, 2014 6:05 am
by Nagac
Hi Pandeesh,

The script works perfectly when we execute it manually in a terminal, so there shouldn't be any issue with the slashes.

Posted: Thu Dec 18, 2014 6:26 am
by priyadarshikunal
There might be something being set in your .profile. Try sourcing your .profile in this script, or check what is in the .profile of the user that is able to run the script without problems.
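
(On Windows, where this batch script runs, the rough equivalent of sourcing .profile is to CALL a shared settings script at the top of this one; a sketch, with a purely hypothetical setenv.bat path:)

Code: Select all

:: Hypothetical shared settings script; CALL runs it in the current cmd
:: session, so any variables it SETs stay visible to the rest of this script.
CALL "K:\Project_Plus\app\dw\setenv.bat"
IF %ERRORLEVEL% neq 0 EXIT /B 1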

Posted: Thu Dec 18, 2014 3:06 pm
by ray.wurlod
Just add a before-job or after-job subroutine that invokes ExecDOS to run any command that identifies the user (for example, the whoami command on Windows).
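
The command supplied to ExecDOS can be as simple as the following (the output path is made up); the file will then contain the account the job actually executes under:

Code: Select all

whoami > C:\temp\ds_job_user.txt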

Posted: Fri Dec 19, 2014 10:58 am
by Nagac
I tried it in a parallel job, which gave me the user id. I then configured AWS credentials for that particular user and ran the script as the same user; it worked fine.

But when I invoke the script from a DataStage sequence, it still says "Unable to locate credentials".
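
For what it's worth, the AWS CLI resolves credentials from the invoking account's %USERPROFILE%\.aws\credentials file (or from environment variables), and a job started from a sequence may run under a service account with a different profile directory. One possible workaround, assuming the credentials file can live at a fixed path, is to point the CLI at it explicitly near the top of the script:

Code: Select all

:: Point the AWS CLI at an explicit credentials file so it no longer depends
:: on which account's %USERPROFILE% the job runs under. Paths are illustrative.
SET "AWS_SHARED_CREDENTIALS_FILE=K:\Project_Plus\app\dw\.aws\credentials"
SET "AWS_DEFAULT_REGION=eu-west-1"

Alternatively, setting AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in the environment the job runs under achieves the same thing.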