I am experiencing a strange problem while writing data into a dataset. The dataset has more than 80 columns, and when I attempt to write to it the DataStage job fails without giving any error messages.
Is there any limitation on the number of columns when writing to datasets?
There is no column limitation anywhere close to 80 for Data Sets. If you change your job to write to a sequential file (pathed to /dev/null), does it run correctly? Where are the data files written to (the resource disk paths specified in the APT_CONFIG_FILE), and is that disk (or are those disks) filling up?
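To follow up on the disk-space suggestion: a quick way to check is to pull the resource disk paths out of the APT config file and run df against each. This is only a sketch; the default config path below is an assumption, and your APT_CONFIG_FILE may point elsewhere.

```shell
#!/bin/sh
# Sketch: list free space on every "resource disk" path in the APT config.
# The fallback path is an assumption -- use your actual APT_CONFIG_FILE value.
CFG="${APT_CONFIG_FILE:-/opt/IBM/InformationServer/Server/Configurations/default.apt}"

# Config lines look like:  resource disk "/data/ds/disk0" {pools ""}
grep -o 'resource disk "[^"]*"' "$CFG" |
  sed 's/resource disk "\(.*\)"/\1/' |
  sort -u |
  while read -r dir; do
    # Only report paths that exist on this host
    [ -d "$dir" ] && df -h "$dir"   # watch for Use% near 100%
  done
```

If any of the listed filesystems are at or near 100%, that would explain a silent failure while writing dataset data files.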