
Reading of csv file using Seq File Stage

Posted: Wed Nov 10, 2010 8:01 am
by krisp321
Hi, I want to read .csv file data using the Sequential File stage, but I have an issue.

The data is as follows (assume all fields are varchar):

100,200,300,A,"34,00","64,000","0,23"
200,200,500,B,"22,000","11,0"12,00"

Here is the issue: because the values within double quotes contain commas, the stage treats each embedded comma as a delimiter, even though the quoted value is a single field.


Please suggest how to handle this. My settings:
Final Delimiter=end

Field Level
Delimiter=comma
Quote=double


Thanks in advance
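As a sanity check outside DataStage, standard CSV quoting is expected to keep embedded commas inside a quoted field. A minimal Python sketch using the first sample row (Python's `csv` module stands in for the Seq File stage here, so this only illustrates the expected parse, not DataStage behaviour):

```python
import csv
import io

# First sample row from the post: quoted fields contain commas
line = '100,200,300,A,"34,00","64,000","0,23"'

# csv.reader honours double quotes by default, so an embedded
# comma stays inside its field instead of splitting the field
row = next(csv.reader(io.StringIO(line)))
print(row)  # ['100', '200', '300', 'A', '34,00', '64,000', '0,23']
```

With well-formed quoting, the row parses into exactly seven fields.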

Posted: Wed Nov 10, 2010 8:45 am
by anbu
If you define Quote as double, then you are fine. Were you able to run your job successfully?

Posted: Wed Nov 10, 2010 8:49 am
by krisp321
anbu wrote:If you define Quote as double, then you are fine. Were you able to run your job successfully?
Hi
I have already set Quote=double, as I mentioned in my first post.

It's not working: the comma within the quotes is being treated as a delimiter, and my values are getting truncated.

Looking for some better solution

Posted: Wed Nov 10, 2010 9:14 am
by anbu
Your quote setting should work. The truncation problem is due to the length defined for that field. Increase the length and you should be able to read the data.

Posted: Wed Nov 10, 2010 9:26 am
by krisp321
anbu wrote:Your quote setting should work. The truncation problem is due to the length defined for that field. Increase the length and you should be able to read the data.
Hi
Thanks for your reply. It's not a length problem; the field is varchar with no length specified.

You can check on your end using the same data from my first post; it won't work when there is a comma within double quotes.

I would really appreciate an effective solution; I am not expecting a simple DataStage-level workaround.

Re: Reading of csv file using Seq File Stage

Posted: Wed Nov 10, 2010 9:29 am
by krisp321
Please read my post carefully before suggesting.
I am saying that a comma within quotes is getting treated as a delimiter, which it should not be. THAT'S MY ONLY CONCERN.
PLEASE DON'T suggest a length or truncation problem.

Posted: Wed Nov 10, 2010 10:35 am
by anbu
I just noticed a problem in your data: the second row is missing a comma and a quote between the last two fields. Correct your data to:
100,200,300,A,"34,00","64,000","0,23"
200,200,500,B,"22,000","11,0","12,00"
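This fix matters because the original second row contains an odd number of quote characters, so no quote-aware parser can split it cleanly. A quick Python illustration (the `csv` module is used here as a stand-in parser, not DataStage itself):

```python
import csv
import io

bad = '200,200,500,B,"22,000","11,0"12,00"'     # second row as posted
good = '200,200,500,B,"22,000","11,0","12,00"'  # corrected row

# An odd quote count means an unterminated quoted field,
# so the bad row cannot be parsed unambiguously
print(bad.count('"'))  # 5 -> unbalanced quotes

# The corrected row parses into seven fields, with the
# embedded commas preserved inside the quoted values
row = next(csv.reader(io.StringIO(good)))
print(row)  # ['200', '200', '500', 'B', '22,000', '11,0', '12,00']
```

Once the quotes are balanced, Quote=double in the Sequential File stage should behave the same way.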

Re: Reading of csv file using Seq File Stage

Posted: Wed Nov 10, 2010 11:32 am
by nitkuar
krisp321 wrote:Please read my post carefully before suggesting.
I am saying that a comma within quotes is getting treated as a delimiter, which it should not be. THAT'S MY ONLY CONCERN.
PLEASE DON'T suggest a length or truncation problem.
This is not the way to ask for a favour. This is a professional forum; please think before you post something like this. :shock:

Posted: Wed Nov 10, 2010 3:37 pm
by ray.wurlod
See if a server Sequential File stage - either in a server job or in a server shared container - can read this file properly. You can use a server shared container in a parallel job.

Posted: Wed Nov 10, 2010 11:15 pm
by krisp321
anbu wrote:I just noticed problem in your data. Correct your data
100,200,300,A,"34,00","64,000","0,23"
200,200,500,B,"22,000","11,0","12,00"
Sorry, that was my typo; the data in the file is correctly quoted, with commas inside the quotes.

Thanks

Posted: Wed Nov 10, 2010 11:40 pm
by krisp321
ray.wurlod wrote:See if a server Sequential File stage - either in a server job or in a server shared container - can read this file properly. You can use a server shared container in a parallel job. ...
Excellent, Ray, it worked. Really appreciate it.
Thanks