Data is getting rejected when loading to file.

Post questions here relating to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Post Reply
DSguru2B
Charter Member
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Specify it as Char in your job itself. That should take care of it.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
ambasta
Participant
Posts: 93
Joined: Thu Jan 19, 2006 10:29 pm
Location: Bangalore

Post by ambasta »

DSguru2B wrote:Specify it as Char in your job itself. That should take care of it.

Thanks a lot for your kind suggestion, but my source data is decimal and a decimal value is coming in for that column.
ambasta
DSguru2B
Charter Member
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

I understand. But if you change the target to Char, it can handle decimal values. I am not too sure, but give it a shot.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
ambasta
Participant
Posts: 93
Joined: Thu Jan 19, 2006 10:29 pm
Location: Bangalore

Post by ambasta »

Is there any option available in the Sequential File stage such that it will automatically populate that many blank spaces?
ambasta
ambasta
Participant
Posts: 93
Joined: Thu Jan 19, 2006 10:29 pm
Location: Bangalore

Post by ambasta »

DSguru2B wrote:I understand. But if you change the target to Char, it can handle decimal values. I am not too sure, but give it a shot.
The target is already Char and it is a fixed-width file.
ambasta
DSguru2B
Charter Member
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

What is the length of your source and target?
You can stick a Transformer in between, check for IsNull(), and then explicitly concatenate spaces.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
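A minimal sketch of that Transformer derivation, assuming an input link named `in` and a 26-character target field (the link name, column name, and width are just placeholders for your own metadata):

```
If IsNull(in.PROCESS_DATETIME)
Then Str(' ', 26)
Else in.PROCESS_DATETIME
```

Str(' ', 26) builds a string of 26 spaces, so a null comes out as a fully padded blank field instead of dropping the record.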
Nageshsunkoji
Participant
Posts: 222
Joined: Tue Aug 30, 2005 2:07 am
Location: pune
Contact:

Post by Nageshsunkoji »

ambasta wrote:
DSguru2B wrote:I understand. But if you change the target to Char, it can handle decimal values. I am not too sure, but give it a shot.
The target is already Char and it is a fixed-width file.
Hi Kunal,

You can achieve this by putting spaces in the "Null field value" option of the Sequential File stage. Just double-click on the field, select the "Null field value" option, and enter spaces equal to the width of your column; i.e., if the length of your field is 10, put that many spaces in the null field value. I think it will solve your problem.
NageshSunkoji

If you know anything SHARE it.............
If you Don't know anything LEARN it...............
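For reference, the setting being described sits in the column properties of the Sequential File stage; roughly like this (the field name and width here are only illustrative):

```
Field:            PROCESS_DATETIME
Type:             Char(26)
Null field value: "                          "   * 26 spaces, matching the field width
```

With that property set, the stage itself substitutes the blanks on write, so no Transformer change is needed for that column.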
ambasta
Participant
Posts: 93
Joined: Thu Jan 19, 2006 10:29 pm
Location: Bangalore

Post by ambasta »

Thanks Guru,
But this constraint applies to around 100 fields, with conversions like
Timestamp to Char
Varchar to Char
Decimal to Char, etc.
ambasta
ukyrvd
Premium Member
Premium Member
Posts: 73
Joined: Thu Feb 10, 2005 10:59 am

Post by ukyrvd »

ambasta wrote:
DSguru2B wrote:I understand. But if you change the target to Char, it can handle decimal values. I am not too sure, but give it a shot.
The target is already Char and it is a fixed-width file.
Hmm, in that case I think you are stuck with implementing NullToValue logic before writing to the file. I haven't seen any general top-level option that sets it automatically.

If it's not a fixed-width file: when you specify a default value in the column properties, VarChar doesn't require you to supply a string of the exact length, but Char still requires that.
thank you
- prasad
DSguru2B
Charter Member
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Actually, this is strange: a Char column with length 10 (for example) will always result in 10 bytes when written to a fixed-width file, regardless of the datatype of the source.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
ambasta
Participant
Posts: 93
Joined: Thu Jan 19, 2006 10:29 pm
Location: Bangalore

Post by ambasta »

The problem is that data is getting rejected at the Transformer stage, and I am getting this warning message: [APT_CombinedOperatorController(1),0: Field 'PROCESS_DATETIME' from input dataset '0' is NULL. Record dropped.]
ambasta
meena
Participant
Posts: 430
Joined: Tue Sep 13, 2005 12:17 pm

Post by meena »

Hi,
Your records are getting dropped because of NULLs. I think it is better to use the NullToValue() function for the column, or to replace the column's nulls with some value.
ambasta wrote:The problem is that data is getting rejected at the Transformer stage, and I am getting this warning message: [APT_CombinedOperatorController(1),0: Field 'PROCESS_DATETIME' from input dataset '0' is NULL. Record dropped.]
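A minimal sketch of that approach as a Transformer derivation, assuming an input link named `in` and a 26-character target field (names and width are illustrative):

```
NullToValue(in.PROCESS_DATETIME, Str(' ', 26))
```

This is equivalent to the If/Then/Else IsNull() form, just more compact when you have many columns to cover.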
ukyrvd
Premium Member
Premium Member
Posts: 73
Joined: Thu Feb 10, 2005 10:59 am

Post by ukyrvd »

ambasta wrote:The problem is that data is getting rejected at the Transformer stage, and I am getting this warning message: [APT_CombinedOperatorController(1),0: Field 'PROCESS_DATETIME' from input dataset '0' is NULL. Record dropped.]
This is a different problem, and it is covered well on DSXchange.

This happens if you are doing any string operations without first checking for a NULL value.

Check this out:
viewtopic.php?t=100341
thank you
- prasad
ambasta
Participant
Posts: 93
Joined: Thu Jan 19, 2006 10:29 pm
Location: Bangalore

Post by ambasta »

I think there is no other way out... :( I need to do it in the Transformer using the IsNull() function for every column separately.
ambasta
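A sketch of what those per-column derivations might look like, assuming an input link named `in`; the column names, widths, and format strings are only illustrative and must match your own metadata:

```
* Timestamp to Char
If IsNull(in.PROCESS_DATETIME) Then Str(' ', 19)
Else TimestampToString(in.PROCESS_DATETIME, "%yyyy-%mm-%dd %hh:%nn:%ss")

* Decimal to Char
If IsNull(in.AMOUNT) Then Str(' ', 12) Else DecimalToString(in.AMOUNT)

* Varchar to Char (the Char target pads shorter values with spaces on write)
If IsNull(in.NAME) Then '' Else in.NAME
```

Tedious for 100 columns, but each derivation is mechanical, so a column-export/edit/import of the Transformer metadata can speed it up.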
samsuf2002
Premium Member
Premium Member
Posts: 397
Joined: Wed Apr 12, 2006 2:28 pm
Location: Tennesse

Post by samsuf2002 »

In your source file, check whether the nullable columns are defined correctly. I think if a column is nullable but you specify it as not nullable, then the record gets dropped. I am not sure; somebody correct me if I am wrong.
hi sam here
Post Reply