Data in Sequential File Not Proper
Moderators: chulett, rschirm, roy
-
- Premium Member
- Posts: 133
- Joined: Tue Nov 23, 2004 11:24 pm
- Location: India
Hi All
I don't have a DataStage instance to try this on right now, but can anyone please tell me what happens in the following scenario?
seq file ---> Transformer ---> Oracle table.
The sequential file has records that are pipe-delimited and match the metadata specified in the Schema Definition tab.
Suppose I have 100 records in my sequential file, and the 51st record doesn't match the metadata because it comes in with a comma delimiter.
Will my DS job run, and if so, what happens to my 51st record?
Thanks
Pavan
By default the job will stop at the 51st record with an error: all of that line's data will go into the first column, and the missing second column will trigger the error. You can change this behaviour, if you wish, via the Missing Column attribute.
It aborts as soon as it hits the error, be it the 1st or the n-millionth.
Hi,
There seem to be contradicting answers here. According to ArndW, the job aborts as soon as it hits a bad record.
Sorry if I'm understanding it wrong, but chulett seems to say that the job will run, logging a warning for every bad record, and will only abort if the number of job warnings exceeds the specified limit.
Which of the above is true?
Thanks
Pavan
If you use the default settings in the sequential file read, the job will fail at the first bad record.
Craig - I just tested it to make sure; the warning limit doesn't apply, since the first row read with an incorrect column count causes an error with something like "row 3, column two, required column missing". This is because the default setting for a missing column is "Error". If that is changed to one of the warning options, then the runtime warning limit would kick in.
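The distinction between the two behaviours can be sketched as a policy switch on the reader. This is a hypothetical Python illustration (the `missing_column` parameter name and values are assumptions, not the DataStage API): with the "error" policy the first bad row aborts the run, while with a warning policy bad rows are logged and skipped, and a job-level warning limit would then decide whether the run ultimately aborts.

```python
# Hypothetical sketch of the "Missing Column" setting's effect.
import logging

def read_rows(lines, n_cols, missing_column="error", delimiter="|"):
    """Yield parsed rows; bad rows either abort (error) or are logged (warn)."""
    for i, line in enumerate(lines, start=1):
        fields = line.rstrip("\n").split(delimiter)
        if len(fields) < n_cols:
            if missing_column == "error":
                # Default behaviour: abort at the first bad record.
                raise RuntimeError(f"row {i}: required column missing")
            # Warning behaviour: log, skip the row, keep reading.
            logging.warning("row %d: required column missing", i)
            continue
        yield fields

rows = list(read_rows(["a|b", "c,d", "e|f"], 2, missing_column="warn"))
# → [['a', 'b'], ['e', 'f']]  (the comma-delimited row is skipped with a warning)
```

With `missing_column="error"` the same input raises on row 2, which matches the abort-at-first-bad-record behaviour tested above.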
Last edited by ArndW on Fri Feb 16, 2007 1:11 pm, edited 1 time in total.
True... it really depends on a number of different things. In this particular case, the 'Missing Column' attribute is key.
Sorry, I'd just gone over this with our production support folks for one particular job, so it was fresh in my head, but the circumstances there were a little different. It was a 'too many columns in record' problem in a fixed-width file, and it logged just a warning and kept on going. They ignored it (job control caught it) and forced everything else to run. Now they get to run it all again.
-craig
"You can never have too many knives" -- Logan Nine Fingers