Problem with LongVarchar datatype in DataStage

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Post Reply
loveojha2
Participant
Posts: 362
Joined: Thu May 26, 2005 12:59 am

Problem with LongVarchar datatype in DataStage

Post by loveojha2 »

Hi Forum,
Firstly, thank you all for helping me so far. It's simply gr8 :)

Secondly: I searched the forum before posting this problem. I got lots of matches, but nothing helped. (Keyword I used: LongVarchar.)

Now the problem: I have a job that populates the contents of a SQL Server table into a Sequential File. One of the fields is a text column in SQL Server. When I import the table's metadata it converts the column to LongVarChar(2147483647). When I ran the job it gave me this error:
Test_It..Dynamic_RDBMS_1: Error occurred during link open processing.
Test_It..Dynamic_RDBMS_1.DSLink2: DSP.Open GCI $DSP.Open error -100.
Test_It..Sequential_File_0.IDENT1: |Test_It..Dynamic_RDBMS_1.DSLink2: DSP.Open GCI $DSP.Open error -100.|


More than that, when I tried to modify the size of this column it shows me only 214748364 instead of 2147483647. The moment I move out of the column it again shows 2147483647. (Is this a known bug?) :?:

Then I modified the column size to 214748364 and it worked fine.
What's the problem with 2147483647? Any suggestions?
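(An aside, not from the original post: the value involved here is no accident. 2147483647 is the largest 32-bit signed integer, which is how SQL Server reports the maximum length of a text column, and 214748364 is what remains when a nine-character entry field drops the last digit. A quick sketch of the arithmetic:)

```python
# 2147483647 is the maximum 32-bit signed integer (2**31 - 1), which is
# how SQL Server reports the maximum length of a text/LongVarChar column.
sql_server_text_max = 2**31 - 1
print(sql_server_text_max)           # 2147483647

# Dropping the last digit gives the 9-digit value the Designer grid shows
# while the field is being edited:
print(str(sql_server_text_max)[:9])  # 214748364
```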

Do we have any global setting for the LongVarchar datatype's max size that I need to modify? I didn't find one in uvconfig.

Thanks in advance.
kcbland
Participant
Posts: 5208
Joined: Wed Jan 15, 2003 8:56 am
Location: Lutz, FL
Contact:

Post by kcbland »

I can't speak to this problem specifically, but I'll point out that the GUI has a limit on how many characters you can enter for the length of a field. That is what you are seeing.

When you shortened the length of the field, the job stopped complaining because you were binding to a smaller size, which is okay until a column value actually exceeds that width. Then it becomes a question of how the particular stage you are using reacts. Each stage seems to be different: some complain, others don't.

You're probably fine leaving things the way they are with the smaller value. If a huge column value of over 2MB ever comes thru, your job will choke on that value and slow down tremendously because of the increase in data moving.

Just keep an eye on it. You should avoid moving large text columns thru DS, as the in-memory juggling increases RAM usage dramatically and slows performance.
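(A sketch of one workaround, not from the original thread: rather than binding the full 2 GB length in the job, you can cast the text column down on the server side in user-defined SQL in the source stage, so the extracted width matches a smaller metadata length. Table and column names below are made up for illustration.)

```python
# Build the user-defined SELECT for the source stage. CAST to plain VARCHAR
# caps the value at 8000 characters on SQL Server 2000-era versions, so the
# DataStage column can safely be defined as VarChar(8000); anything longer
# is truncated at the source rather than inside the job.
MAX_WIDTH = 8000  # largest plain VARCHAR on those versions of SQL Server

query = (
    "SELECT key_col, "
    f"CAST(SUBSTRING(big_text_col, 1, {MAX_WIDTH}) "
    f"AS VARCHAR({MAX_WIDTH})) AS big_text_col "
    "FROM my_table"
)
print(query)
```

Truncating at the source this way also keeps the oversized data from ever crossing the link, which avoids the memory and performance hit described above.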
Kenneth Bland

Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
jdmiceli
Premium Member
Posts: 309
Joined: Wed Feb 22, 2006 10:03 am
Location: Urbandale, IA

Same problem

Post by jdmiceli »

I am having exactly the same problem with this datatype. I have even gone so far as to try different database connectors. I figured the MSOLEDB would at least be able to handle its own datatypes :oops: Guess I assumed too much. Is there any way around this other than truncating data with the previously mentioned solution?

Thanks much from another newbie,
Bestest!

John Miceli
System Specialist, MCP, MCDBA
Berkley Technology Services


"Good Morning. This is God. I will be handling all your problems today. I will not need your help. So have a great day!"
Post Reply