
reading sequential file as a single field

Posted: Fri Dec 15, 2006 12:49 am
by dr46014
Hi all,
In my last post I asked a question, and I followed the approach suggested there, but now I am facing a problem. While reading the whole sequential file as a single field, an error message appears. My file has some blank rows and some rows with data, and I need to read the whole file as one field.
My file looks like this:

name of the company
*******************
*blank spaces here*

generation date :12/10/2006
*blank spaces here*
1256~name1~account_no~ aghik@yahoo.com~12/15/2005
1562~name2~account_no~ aghik@yahoo.com~12/15/2005
1563~name3~account_no~ aghik@yahoo.com~12/15/2005

generation date :12/10/2006
*blank spaces here*
1256~name1~account_no~ aghik@yahoo.com~12/15/2005
1562~name2~account_no~ aghik@yahoo.com~12/15/2005
1563~name3~account_no~ aghik@yahoo.com~12/15/2005

I have chosen the fixed-width type with the terminator set to yes, and I am reading it as Char with Nullable set to Yes.
Please help me figure out how to read the file.

Posted: Fri Dec 15, 2006 12:55 am
by kumar_s
Don't use fixed width. Change the field to VarChar and set the line terminator to none. That's pretty much it.

Posted: Fri Dec 15, 2006 1:01 am
by dr46014
I don't find any option called line terminator in DataStage. Can you please tell me where that option is?
There is no such option while importing the file.

Posted: Fri Dec 15, 2006 1:09 am
by BalageBaju
Hi,

Check the Stage Properties of the Sequential File stage. There you will find Line Termination with three choices:
Unix Style
Dos Style
None

Select the option None.

This is what Kumar was trying to say.

Posted: Fri Dec 15, 2006 1:09 am
by narasimha
Uncheck fixed width and set the terminator to No.

Then change the Delimiter to 000 in the Format tab.

You should then be able to read the file into one field.
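To illustrate what these settings achieve (a sketch in plain Python, not DataStage syntax — the sample text below just mirrors the file layout from the first post): with no field delimiter, each line of the file lands in a single character field, blank and header rows included.

```python
# Sample content mirroring the layout shown in the original post.
raw = """name of the company
*******************

generation date :12/10/2006

1256~name1~account_no~ aghik@yahoo.com~12/15/2005
1562~name2~account_no~ aghik@yahoo.com~12/15/2005
"""

# With no field delimiter, every line becomes one single-string "field".
# Blank lines and header lines are still present at this stage; they are
# filtered out downstream.
records = raw.split("\n")
```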

Posted: Fri Dec 15, 2006 1:12 am
by dr46014
It's working now!
Thanks for your valuable help.

Posted: Fri Dec 15, 2006 1:16 am
by dr46014
One more thing: I need to read the date field and pass it as a stage variable so I can load it into the table.
I want to read only the records, not the company name, the ******** lines, or the blank lines.
How should I do that?

Posted: Fri Dec 15, 2006 1:25 am
by BalageBaju
Hi,

Try the Field function:

Code:

  Field(i/p_col_name, "~", 5)

This will give you only the date part (in your sample records the date is the fifth ~-delimited field), so you can store it in a table.

Hope this helps.
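For readers unfamiliar with it, DataStage's Field(string, delimiter, occurrence) returns the n-th delimiter-separated piece of a string (1-based). A rough Python equivalent of that behavior (the helper name `field` is made up for illustration, not a DataStage or Python built-in):

```python
def field(text: str, delimiter: str, occurrence: int) -> str:
    """Rough sketch of DataStage's Field(): return the n-th
    delimiter-separated part of text (1-based), or "" if out of range."""
    parts = text.split(delimiter)
    return parts[occurrence - 1] if 1 <= occurrence <= len(parts) else ""

# The date sits in the fifth ~-delimited field of the sample records.
row = "1256~name1~account_no~ aghik@yahoo.com~12/15/2005"
date_part = field(row, "~", 5)
```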

Posted: Fri Dec 15, 2006 1:30 am
by dr46014
Can you please explain what that code means?
I want to write the records to another flat file depending on the creation date.

Posted: Fri Dec 15, 2006 1:51 am
by narasimha
dr46014 wrote: Can you please explain what that code means?
Not to discourage you from asking questions...
You need to do a little searching on the forum for your answers.
If you don't find a solution, then ask.

Posted: Fri Dec 15, 2006 2:34 am
by I_Server_Whale
You can also find more information about DataStage functions, transforms, and routines in the DataStage Help menu.
dr46014 wrote: can u plz explan wat
A piece of advice: "u", "plz", "wat", "ur", etc. do not belong in the English language. It is always advisable to write your posts with words spelled out properly.

Thanks,
Whale.

Posted: Fri Dec 15, 2006 7:11 am
by DSguru2B
dr46014 wrote: One more thing: I need to read the date field and pass it as a stage variable so I can load it into the table. I want to read only the records, not the company name, the ******** lines, or the blank lines.
I thought I gave you the solution in your other post. You will have blank lines in your target file, but stick another Transformer next to it and add the constraint I specified in my post at the link I provided. That will work. What isn't working there?
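The constraint idea can be sketched like this (an illustration of the logic in Python, not DataStage constraint syntax): keep only rows that look like data records, i.e. contain the ~ delimiter, which automatically drops the company-name line, the asterisk line, the "generation date" line, and blanks.

```python
# Rows as they would arrive in the single-field column.
lines = [
    "name of the company",
    "*******************",
    "",
    "generation date :12/10/2006",
    "1256~name1~account_no~ aghik@yahoo.com~12/15/2005",
]

# Constraint: a real data record contains the ~ delimiter;
# headers, asterisk lines, and blank lines do not.
data_rows = [ln for ln in lines if "~" in ln]
```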

Posted: Sat Dec 16, 2006 6:47 pm
by Raghavendra
Were you able to solve your problem?

Posted: Sun Dec 17, 2006 1:56 am
by dr46014
Yes, I got the solution :)

Posted: Sun Dec 17, 2006 2:59 am
by kumar_s
Care to specify the solution you opted for and mark the topic as resolved, so that the search function can be more useful for others?