
How can I reject fields that exceed length?

Posted: Wed Nov 16, 2016 5:10 am
by s_avneet
I have a varchar column of length 20. If I pass a string of 25 characters, it trims the value but lets it pass through with only a warning.

How do I reject the record if the length is greater than specified? Is there a global setting, or do I need to apply a check field by field?

Posted: Wed Nov 16, 2016 8:50 am
by asorrell
Transformers assume they are supposed to transform the data (hence the name). In your example, the Transformer assumes it is supposed to truncate the data, since you are moving a larger string into a smaller string. In any other stage you'd get a warning, but a Transformer auto-truncates.

If you want to reject a record based on field validation, here are two ways to do it.

The first way assumes there is only one field being checked. You can add a constraint on your main output link to restrict it to "good" records that meet the length requirement. Bad records can be routed to another link via a constraint that only passes records with the field over that limit.
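As a sketch (the link and column names here are made up for illustration), the two constraints might use the Transformer's Len() function like this:

```
Main output link constraint (good records):  Len(lnk_in.cust_name) <= 20
Reject link constraint (bad records):        Len(lnk_in.cust_name) > 20
```

Each record satisfies exactly one of the two expressions, so every record goes down exactly one link.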

If you want to reject based on several field or condition checks, you can add stage variables to "flag" bad records. For each check, set its respective variable to '0' (false) if it passes or '1' (true) if it fails. You can then add constraints to your output links that check all of the flags at once, and pass/reject records as appropriate. You can even use Boolean logic, since the variables are ones and zeroes.
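For example (the stage variable names, column names, and lengths below are illustrative), each stage variable derivation flags one check, and the link constraints combine the flags:

```
Stage variable derivations:
  svNameTooLong:  If Len(lnk_in.cust_name) > 20 Then 1 Else 0
  svIdTooLong:    If Len(lnk_in.cust_id) > 10 Then 1 Else 0

Link constraints:
  Good link:    svNameTooLong = 0 And svIdTooLong = 0
  Reject link:  svNameTooLong = 1 Or svIdTooLong = 1
```

Adding another check is then just one more stage variable plus one more term in each constraint.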

You can even add a "reason" field to the reject link that checks the stage variables and adds a message to the output saying why the record was rejected.
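A hypothetical derivation for such a "reason" column on the reject link (reusing the made-up flags from above) concatenates one message per failed check with the `:` operator:

```
(If svNameTooLong = 1 Then 'cust_name over 20 chars; ' Else '') :
(If svIdTooLong = 1 Then 'cust_id over 10 chars; ' Else '')
```

A record that fails both checks gets both messages; a record that fails one check gets only that message.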

Posted: Thu Nov 17, 2016 8:19 am
by s_avneet
Ok got your point.

Does it also display the same behavior with the number of records?

So I am creating a comma-separated file with 10 records. I am passing 12 records, and it does not fail, it just truncates the record.

I thought DS had strict validation rules.

Posted: Thu Nov 17, 2016 9:04 am
by UCDI
s_avneet wrote:So i am creating a file with 10 records comma separated, i am passing 12 records, it does not fail, just truncates the record.
Can you explain this better? You have the number of records that you have: if the file has 10, then 10 you have. The only way to get 12 is to apply some sort of logic that creates 2 more somehow, such as join logic, splitting records on purpose, a funnel, or similar.
Records are not generally truncated; text fields can be truncated. Are you confusing the word record with the word column here?

Posted: Thu Nov 17, 2016 9:04 am
by asorrell
I believe you mean fields, not records. DataStage would not eliminate records automatically.

If you only have 10 fields (columns) defined and you have 12 in your data, then DataStage assumes that you know what you are doing, and it writes out only the 10 columns you told it to write.

If you want it to write out 12 columns, then either define 12 columns (it will still write out the 10-column rows) or use Run-Time Column Propagation (RCP), which passes all data elements through the entire job whether they are defined or not. There are numerous other impacts to using RCP, many of which are detailed in posts here on DSXChange.