Reusable Jobs with EBCDIC data
Posted: Mon Aug 22, 2005 8:59 am
Suppose I wish to create a job that takes the input filename of an EBCDIC source as a parameter and converts the data to ASCII format. I wish to call this job with many different source formats. In each case I want the job to convert the datatypes to the most appropriate non-mainframe datatypes. For example, I would like COMP-3 packed decimals decoded to regular displayable ASCII numerics, EBCDIC strings converted to ASCII strings, etc.
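For concreteness, the two field-level conversions being asked about can be sketched in plain Python (a hand-rolled illustration of the byte-level work, not DataStage code; cp037 is assumed as the EBCDIC code page, though a given mainframe may use cp500, cp1047, etc.):

```python
def ebcdic_to_ascii(raw: bytes, codepage: str = "cp037") -> str:
    """Decode an EBCDIC byte string to a regular string.

    Python ships EBCDIC code pages as standard codecs; cp037 is one
    common US/Canada code page, chosen here as an assumption.
    """
    return raw.decode(codepage)


def unpack_comp3(raw: bytes, scale: int = 0) -> float:
    """Decode a COMP-3 (packed decimal) field.

    Each byte holds two BCD digits; the low nibble of the final byte
    is the sign (0xD = negative, 0xC or 0xF = positive).
    """
    digits = []
    for byte in raw[:-1]:
        digits.append(byte >> 4)        # high nibble digit
        digits.append(byte & 0x0F)      # low nibble digit
    digits.append(raw[-1] >> 4)         # last digit
    sign = -1 if (raw[-1] & 0x0F) == 0x0D else 1
    value = 0
    for d in digits:
        value = value * 10 + d
    return sign * value / (10 ** scale)
```

For example, the packed bytes `12 34 5C` with two implied decimal places decode to 123.45, and `01 2D` decodes to -12.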
1. Can EBCDIC be converted to ASCII within DataStage without the CFF component? I don't want to use this component, since it doesn't seem to support much parameterization (schema file etc.).
2. Can conversion between EBCDIC and ASCII be done without explicitly writing a derivation for each mapping? I.e., if I provide an EBCDIC description in the source schema file and an ASCII description in the target schema, will DS automatically convert? I have used ETL tools in the past that did the transformation implicitly if the source was described as EBCDIC and the target as ASCII.
3. Can question 2 above be done with RCP, and have the data convert without any mention of the fields in the transform?
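For question 2, a sketch of what such a parameterized source schema file might look like, assuming the Orchestrate schema syntax and the record-level `ebcdic`/`binary` and field-level `packed` properties (the field names here are hypothetical placeholders):

```
record { record_delim = none, binary, ebcdic } (
    CUST_ID:  int32;
    NAME:     string[20];
    BALANCE:  decimal[7,2] {packed};
)
```

The idea being that the import stage would then decode the EBCDIC strings and unpack the COMP-3 field from these properties alone, without per-column derivations in a Transformer.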
Unfortunately, I do not have a Parallel environment to test this out at this time.
Thanks.