length of text file truncated...
Hi All,
I am generating a text file as output with a record size of 1200 bytes.
I have 444 bytes of data, and per the requirement I have to append the remaining 756 bytes as spaces.
My question: when I hard-code the 756 bytes of spaces, run the job, and look at the final text file, the data is truncated.
Instead of 1200 bytes of data I get only 953 bytes; somehow the last 247 bytes are being truncated...
Please help me figure out how I can get the full 1200 bytes of data as per the requirement...
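For reference, the padding described above (444 bytes of data plus 756 trailing spaces for a 1200-byte record) can be sketched in Python; the field sizes are taken from the post, everything else is illustrative:

```python
# Fixed-width record layout from the post: 1200-byte records,
# of which 444 bytes are data and the rest is space filler.
RECORD_LENGTH = 1200
DATA_LENGTH = 444

def pad_record(data: str) -> str:
    """Pad (or truncate) a record to the fixed 1200-byte output length."""
    return data[:RECORD_LENGTH].ljust(RECORD_LENGTH)

record = pad_record("X" * DATA_LENGTH)
assert len(record) == RECORD_LENGTH  # 444 data bytes + 756 spaces
```

If the output file shows fewer than 1200 bytes per record, the trailing spaces are being dropped somewhere downstream of this logic, not in the padding itself.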
Sri.
As a DataStage developer, you should understand the question... it is common terminology for DataStage and is discussed during DS training and constantly here on the forums.
Yes... the layout/format of the file. As in:
Code: Select all
Field1 VarChar(250)
Field2 VarChar(250)
Your answer tells me there are two varchar fields, but tells me nothing about defined lengths (if any), leaving me and others either to make assumptions (which can often be incorrect and is dangerous) or to pry further details from you to build a complete picture of the situation and job design.
As you are apparently working with fixed-length records (is this correct?), have you tried defining the columns as Char instead of VarChar?
Regards,
- james wiles
All generalizations are false, including this one - Mark Twain.
No worries...we all have to start somewhere.
In IS 8.5 I can successfully create a 1200 byte record using the method you described. I'm not aware of a limitation within 7.5 concerning the data length and have no way to test in that version.
Do your input records (containing only one column?) ALWAYS contain exactly 444 bytes of data? Are they in a sequential file or coming from another data source such as a database? If your incoming data is short (<444 bytes), then you probably need to make up the difference in your job so that your output is 1200 bytes.
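The short-record case raised above can be sketched as follows; the 444/756 split comes from the thread, the function itself is hypothetical:

```python
# Sketch: if an incoming record is shorter than the expected 444 bytes,
# make up the difference before appending the 756-byte space filler,
# so the output is always exactly 1200 bytes.
DATA_LENGTH = 444
FILLER = " " * 756

def build_output(record: str) -> str:
    if len(record) < DATA_LENGTH:
        record = record.ljust(DATA_LENGTH)  # pad a short input record
    return record[:DATA_LENGTH] + FILLER

assert len(build_output("short row")) == 1200
assert len(build_output("X" * DATA_LENGTH)) == 1200
```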
Regards,
- james wiles
All generalizations are false, including this one - Mark Twain.