Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.
Moderators: chulett , rschirm , roy
Nagaraj
Premium Member
Posts: 383 Joined: Thu Nov 08, 2007 12:32 am
Location: Bangalore
by Nagaraj » Tue Jun 14, 2011 2:44 pm
I am facing the same error as seen in
this post. I have left the default array size, i.e. 2000.
ray.wurlod
Participant
Posts: 54607 Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
by ray.wurlod » Tue Jun 14, 2011 3:14 pm
Almost the anniversary of the original question!
Are your system settings identical to those? If not, please begin a new thread.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
by Nagaraj » Tue Jun 14, 2011 3:34 pm
System settings are identical, hence I will continue this thread.
BTW, my job ran successfully with the following changes, the source being 7k rows:
I set the record count and the array size to 10000,
but I want to know how these settings affect the job
and how I should deal with them.
Here's the definition I found in the manual.
The array size specifies the number of records to include in each batch that the read and write operations on the database process. The record count specifies the number of records to process in each transaction.
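In plainer terms: the array size controls how many rows travel to the database per round trip, and the record count controls how many rows are written between commits. A rough sketch of the idea in Python, using sqlite3 purely as a stand-in for the Oracle stage (the `ARRAY_SIZE` and `RECORD_COUNT` names and the 7000-row source are just illustrative, not DataStage internals):

```python
import sqlite3

ARRAY_SIZE = 2000     # rows sent to the database per round trip (one "array" batch)
RECORD_COUNT = 10000  # rows written per transaction (commit interval)

rows = [(i, f"name-{i}") for i in range(7000)]  # stand-in for the 7k source rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")

written = 0
for start in range(0, len(rows), ARRAY_SIZE):
    batch = rows[start:start + ARRAY_SIZE]            # one array-sized batch
    conn.executemany("INSERT INTO t VALUES (?, ?)", batch)
    written += len(batch)
    if written % RECORD_COUNT == 0:                   # commit every RECORD_COUNT rows
        conn.commit()
conn.commit()                                         # commit any remainder

print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 7000
```

Note that the two knobs are independent: the batch size affects the size of each round trip's buffer, while the commit interval affects transaction length and rollback cost.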
by Nagaraj » Wed Jun 15, 2011 8:09 am
So why is it failing if I leave it at the default 2000 rows, with the source being 7000 rows?
chulett
Charter Member
Posts: 43085 Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO
by chulett » Wed Jun 15, 2011 9:30 am
Me, I don't care how "identical" your specs are - this is your issue so off you go to your own post. I did however link back to the original post.
-craig
"You can never have too many knives" -- Logan Nine Fingers
by chulett » Wed Jun 15, 2011 9:33 am
Nagaraj wrote: so why it is failing if i leave it as default 2000 rows and source being 7000 rows?
Because this -> default 2000 rows
Has nothing to do with this -> source being 7000 rows
Lower your array size, does the error go away?
by Nagaraj » Wed Jun 15, 2011 12:24 pm
Okay, here are the results after reducing it below 2000:
Option 1 >> set to 1000
Result >>> Success
Time >>> 3:26
Option 2 >> set to 100
Result >>> Success
Time >>> 3:14
by chulett » Wed Jun 15, 2011 1:06 pm
Bigger is not always better.
by Nagaraj » Wed Jun 15, 2011 1:08 pm
But chulett, can you explain why it failed in the first place with the default value of 2000?
by chulett » Wed Jun 15, 2011 1:16 pm
Did you do any searches for your "ORA-24381: error(s) in array DML" error that I assume you got based on the linked post? It's not about how many records are in the array but the total size - i.e. the array size times the record length. For longer records, fewer will fit at a time.
Of course there are ways to increase all of that but it seems like you are dealing with default sizes.
by Nagaraj » Wed Jun 15, 2011 2:01 pm
I just noticed there are two columns with sizes VARCHAR(1000) and VARCHAR(200).
I calculated the sum of all the column lengths and it reached around 1800.
What should be the appropriate array size and record count, assuming we get data in all the columns for all the rows (for example, 7000 rows)?
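For a rough sense of scale, the buffer grows with the array size times the row width, not with the total number of source rows. A back-of-the-envelope calculation, assuming the ~1800-byte row width quoted above (Oracle's actual per-array buffer accounting includes additional overhead, so these are approximations):

```python
# Approximate client-side array buffer: array_size * bytes_per_row.
# The total source row count (7000 vs 1 million) does not enter into it.
bytes_per_row = 1800  # approximate sum of the column lengths from the table definition

for array_size in (100, 1000, 2000, 10000):
    buffer_bytes = array_size * bytes_per_row
    print(array_size, f"{buffer_bytes / 1024 / 1024:.1f} MB")
# e.g. 2000 rows * 1800 bytes = 3,600,000 bytes, roughly 3.4 MB per batch
```

This is why lowering the array size (or widening the available buffer) helps with wide rows: the failure threshold is on bytes per batch, not rows in the source.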
by chulett » Wed Jun 15, 2011 2:22 pm
Those are the kind of conversations you should be having with your DBA.
by Nagaraj » Wed Jun 15, 2011 3:07 pm
Great, thanks!
Slayer14
Premium Member
Posts: 2 Joined: Mon Jan 14, 2013 12:07 pm
by Slayer14 » Thu Jan 24, 2013 11:36 am
This is marked as resolved, but the above discussion did not help that much.
However, I did find a solution by keeping my record count at the default 2000 and setting my array size to 100,000. This solved the issue, if anyone runs into it again.
My total record count was ~1 million.
Lisa