I have created one job with a Teradata Connector stage to extract data and write it to a sequential file. I can run this from DataStage Designer, but when I try to run the osh file for that job directly, like below, it gives me errors: > osh -f /data/xxxx/sssdss/RT_SC2024/OshScript.osh PATH search failure...
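A common cause of `PATH search failure` when invoking osh by hand is that the parallel-engine environment Designer sets up is missing from your shell. Below is a minimal sketch of a wrapper; the install paths and config-file name are assumptions for a default install, so adjust them to your site before use:

```shell
#!/bin/sh
# Hedged sketch: recreate the engine environment before calling osh directly.
# All paths below are assumptions -- check your own install locations.
DSHOME=${DSHOME:-/opt/IBM/InformationServer/Server/DSEngine}
[ -f "$DSHOME/dsenv" ] && . "$DSHOME/dsenv"        # DataStage engine environment

APT_ORCHHOME=${APT_ORCHHOME:-/opt/IBM/InformationServer/Server/PXEngine}
export APT_ORCHHOME
export PATH="$APT_ORCHHOME/bin:$PATH"
export LD_LIBRARY_PATH="$APT_ORCHHOME/lib:$LD_LIBRARY_PATH"
# osh needs a configuration file; point at whichever .apt file the job used.
export APT_CONFIG_FILE=${APT_CONFIG_FILE:-$APT_ORCHHOME/etc/config.apt}

osh -f /data/xxxx/sssdss/RT_SC2024/OshScript.osh
```

Note that any job parameters the generated script references (e.g. `[&"param"]` placeholders) are normally substituted at run time by the engine, so a hand-run script may also need those resolved.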
Could you please tell me: is there any way to see the complete C++ source file (the one that implements the runLocally() function, etc.) generated for a Transformer stage of a job, in the project folder?
How can I retain that C++ file?
I can see only the .trx and .o files.
Thanks, Ray, for your reply. No, the same configuration file is used to write and read the dataset. One job writes to the dataset and another reads from it. Both jobs run fast in one environment, but in the other environment the first job, which writes the dataset, runs in the same time as elsewhere, while the read...
Hi, I have designed a job like Dataset ---> Copy ---> Copy. In one environment it takes 4 minutes to read a dataset with 20 million records; in the other environment it takes 30+ minutes to finish. I am not sure what is going wrong. In both environments the OS is the same and the DataStage version is the same. 1. the DS file g...
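Since the OS and DataStage version match, raw sequential-read throughput of the filesystem holding the dataset data files is worth ruling out first. A crude, DataStage-independent check (file location and 64 MB size are arbitrary assumptions) run the same way in both environments:

```shell
# Write a 64 MB scratch file, then time a sequential read of it.
# Compare the elapsed time reported by `time` across the two environments;
# a large gap points at storage, not at DataStage.
dd if=/dev/zero of=/tmp/ds_io_test bs=1048576 count=64 2>/dev/null
sync
time dd if=/tmp/ds_io_test of=/dev/null bs=1048576
rm -f /tmp/ds_io_test
```

If raw throughput is comparable, the next things to compare are the resource/scratch disk paths named in each environment's configuration file, since the dataset's data segments live on whichever disks the writing job's config file specified.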
Hi, I am trying to handle nulls in Build-op logic, for example outRec.empno_setnull(); but I am getting the below error while compiling: empno_setnull is not a member of APT_Bop_output0Accessors. 1 Error(s) detected. I have seen null values set like this on the forum, but it is not working for me. I tried...
Thanks, ArndW, for your reply. I created a simple file on the mainframe with record length 10 using the ISPF editor, assuming the HIGH-VALUES character is hex FF (correct me if I am wrong), and FTPed it in binary mode.
infld1 PIC X(1)
infld2 PIC X(1) # high-values field
infld3 PIC X(7)
infld4 PIC X(1)
..1111111a ---> actual data...
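To confirm what actually landed on the ETL server after the binary FTP, it helps to look at the raw bytes: EBCDIC HIGH-VALUES should show up as hex `ff`. A small sketch that fabricates a 10-byte record shaped like the layout above (byte 2 forced to 0xFF; the other bytes are ASCII here purely for illustration, whereas a real untranslated mainframe file would hold EBCDIC codes such as F1 for '1'):

```shell
# Build a 10-byte sample record: infld1, then 0xFF for infld2 (HIGH-VALUES),
# then 8 more bytes for infld3/infld4.
printf '1\3771111111a' > rec.dat

# Hex-dump it; the second byte should read "ff".
od -An -tx1 rec.dat
```

Running `od -An -tx1` against the real transferred file the same way will show immediately whether the 0xFF survived the transfer or was mangled by an ASCII-mode FTP.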
Hi all, please help me with handling HIGH-VALUES in the Complex Flat File stage. I am reading a COBOL file using the Complex Flat File stage and writing to a sequential file. One field is defined as PIC X and contains HIGH-VALUES in the source mainframe file. I am getting junk () as output in the sequential file for that...
Thanks for your replies. I tried with the correct settings in startup.apt and also by renaming it, but there is no change in the error messages, except that startup.apt executed; the messages appear after skipping the startup.apt. startup.apt just unsets the DB2INST env var if the host name is not the ETL server. I don't think so...
I executed the job with an ExecSH before-job subroutine.
Result of id:
stgtest..BeforeJob (ExecSH): Executed command: id
*** Output from command was: ***
uid=11133(dsadm) gid=3009(dstage) groups=3009(dstage),3178(ETL_ADMIN)
Hi, I am trying to configure the DB2 EE stage between an ETL server (32-bit Linux OS) and a remote DB2 server (64-bit Linux OS). I opened all the required ports in the firewall between these servers (22, 50100, etc.). We have the PXEngine, DSEngine, DSComponents, Configuration, etc. folders exactly the same on both servers wi...
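Alongside the firewall rules, the DB2 client side on the ETL server normally needs the remote node and database cataloged before any stage can connect. A hedged sketch using standard DB2 CLP commands; the node name, host name, database name, and port below are placeholders for your own values:

```shell
# Run on the ETL server as the DB2 client instance owner.
# ETLNODE, db2host.example.com, MYDB, and 50100 are placeholders.
db2 catalog tcpip node ETLNODE remote db2host.example.com server 50100
db2 catalog database MYDB as MYDB at node ETLNODE
db2 terminate

# Verify the directories and test connectivity outside DataStage first:
db2 list node directory
db2 list database directory
db2 connect to MYDB user dsadm
```

If `db2 connect` works from the command line but the DB2 EE stage still fails, the problem is more likely the stage/library environment (e.g. which DB2 client libraries the 32-bit engine loads) than the network path.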
Hi, I am trying to generate an XML file for the fields below. Partial definition: record {record_length=259, delim=none, quote=none, binary, ebcdic, native_endian, round=round_inf, nofix_zero} ( ............................ PSD_BLF_UNIT_PRICE_PCT:decimal[8,6] {default=0,packed}; FILLER_2:subrec {redefines...
In version 8.0, I want to create a separate DSX file for each DataStage component.
Please let me know how I can create a separate DSX file per component in a single export.
If I select multiple jobs in the export option, all the jobs are combined into a single DSX file.
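One export invocation always produces one DSX file, so the usual workaround is to script the client export tool once per job. A hedged sketch that generates one `dsexport.exe` command line per job from a job list; the `dsexport.exe` option syntax shown is an assumption (verify it with `dsexport /?` on your Windows client), and host/user/project names are placeholders:

```shell
# joblist.txt holds one job name per line; on the engine host it can be
# captured with the standard CLI:  dsjob -ljobs MyProject > joblist.txt
# Here we fabricate a tiny sample list purely for illustration.
printf 'JobA\nJobB\n' > joblist.txt

# Emit one export command per job (to be run from the Windows client where
# dsexport.exe lives). Option syntax is an assumption -- check dsexport /?.
while read job; do
  printf 'dsexport.exe /H=etlhost /U=dsadm /P=secret /JOB=%s MyProject %s.dsx\n' \
    "$job" "$job"
done < joblist.txt
```

This prints one export command per listed job; redirecting the output to a batch file and running it on the client yields one DSX per job.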