Null handling in PX Sequential files
How do I read a delimited sequential file (with null values in a couple of fields) without dropping any of the records in PX?
Sample input file format
12|32|43
32||23
Here Field2 of Row2 is null. When the standard PX Sequential File stage is used to read this, the second record gets dropped.
Any ideas are appreciated.
Hi
In the Sequential File stage properties, go to the Format tab and fill in the Null Field Value with an empty string or some default value. Also make sure that nullable columns are handled properly in your transformation.
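To illustrate the effect of that property, here is a minimal sketch in plain Python (not DataStage syntax; the function name is invented for illustration) of reading a delimited file where an empty field stands for null:

```python
def parse_row(line, delimiter="|", null_field_value=""):
    """Split a delimited row; a field equal to the null marker becomes None
    instead of causing the whole record to be rejected."""
    return [None if field == null_field_value else field
            for field in line.split(delimiter)]

rows = ["12|32|43", "32||23"]
parsed = [parse_row(row) for row in rows]
# Both records survive; the second keeps None in its middle field.
```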
Thanks
Regards
Siva
Listening to the Learned
"The most precious wealth is the wealth acquired by the ear Indeed, of all wealth that wealth is the crown." - Thirukural By Thiruvalluvar
Guys,
Thx for your ideas.
I have empty fields, i.e. two delimiters side by side with nothing between them:
12|32|43
32||23
The second row in the example has the scenario I am talking about.
When I read such a row using the PX Sequential File stage, it gets dropped during the read itself; it doesn't even reach the next stage. So how do I read this row without dropping it?
Siva's advice relates to server jobs. Handling zero-length strings in parallel jobs is different; the null field value property in the Sequential File stage is one of the mechanisms for it. Once you can get "" into your job, you then have to track it right through the job design and either convert it back to a null or supply a replacement default value, whichever of those actions is appropriate.
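That flow can be sketched in plain Python (not DataStage syntax; the function name and the default value are assumptions for illustration). Once "" has entered the job as the null representation, each downstream stage must decide whether to restore a real null or substitute a default:

```python
def reinterpret(value, default=None):
    """'' came in via the null field value; either restore a real null
    (None) or supply a replacement default."""
    return default if value == "" else value

row = ["32", "", "23"]
as_nulls = [reinterpret(f) for f in row]                    # middle field -> None
with_default = [reinterpret(f, default="0") for f in row]   # middle field -> "0"
```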
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
I agree with Siva's suggestion. We implemented the same method to handle null values: specify a default value for the null string column when you read the sequential file, then convert it back to NULL in a Transformer stage before writing the record into a dataset.
This would resolve the issue.
Regards
Saravanan
Hi VJ,
The above suggestions hold good if you used the Null Field Value property while writing the file.
Otherwise you will not be able to read this record.
Nulls do exist in sequential files, but these values are not interpreted correctly. To avoid any such precarious situation, use the Null Field Value property while writing the file.
Then use the same Null Field Value property to read the file. This will ensure that null values are interpreted correctly in DataStage jobs.
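The symmetric write/read idea can be sketched in plain Python (not DataStage syntax; the marker string and function names are assumptions, and the marker must be a value that never occurs in real data):

```python
NULL_MARK = "@NULL@"  # hypothetical null field value; must not appear in real data

def write_row(fields, delim="|"):
    """Export: a None field is written as the agreed null marker."""
    return delim.join(NULL_MARK if f is None else str(f) for f in fields)

def read_row(line, delim="|"):
    """Import with the SAME marker, so nulls round-trip correctly."""
    return [None if f == NULL_MARK else f for f in line.split(delim)]

line = write_row(["32", None, "23"])
restored = read_row(line)  # the null survives the round trip
```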
Happy DataStaging
I just completed a PX seminar, and I had this same problem in an exercise. If you're reading a row like:
a,,b,c,d
this row will be dropped because of the null in F2.
To solve this problem you must set the Null String property. You must pad this property with characters matching the length defined for the field that is null.
i.e. if F2 is defined as char(3), you must set the Null String property to three characters, such as '___' or '000', etc.
By the way: if your source file is fixed-length, there is an environment variable for null handling which accomplishes the same thing (fixed-length only).
This approach worked for me.
Good luck!
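The fixed-width padding rule above can be sketched in plain Python (not DataStage syntax; the field width and function names are assumptions). The null string must be exactly as wide as the field so the record keeps its fixed layout:

```python
FIELD_LEN = 3                    # char(3), as in the example
NULL_STRING = " " * FIELD_LEN    # null marker padded to the fixed field width

def write_fixed(fields):
    """Fixed-width export: None is written as the padded null string.
    (Caveat: a genuinely all-blank field would be indistinguishable.)"""
    return "".join(NULL_STRING if f is None else f.ljust(FIELD_LEN)
                   for f in fields)

def read_fixed(line):
    """Slice the record into FIELD_LEN chunks; a chunk equal to the
    null string becomes None, others are right-trimmed."""
    chunks = [line[i:i + FIELD_LEN] for i in range(0, len(line), FIELD_LEN)]
    return [None if c == NULL_STRING else c.rstrip() for c in chunks]

record = write_fixed(["12", None, "23"])  # 9 characters, sizing preserved
```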
Clarification: That environment variable has the name APT_IMPEXP_ALLOW_ZERO_LENGTH_FIXED_NULL (whew!) and applies only to the Import and Export operators in the framework.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
But what would qualify as a 'null' in a fixed-width sequential file? Any inner column that might qualify would have to be all spaces to maintain the sizing of the record. Are we talking about that, or trailing (i.e. missing) columns at the end of the record?
-craig
"You can never have too many knives" -- Logan Nine Fingers