Bulk load in db2

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

gowrishankar_h
Participant
Posts: 42
Joined: Wed Dec 26, 2012 1:13 pm

Bulk load in db2

Post by gowrishankar_h »

Hi DataStagers,

I am trying to do a bulk load into DB2 installed on z/OS.

I have set:
Bulk load to DB2 on z/OS = Yes
Load method = MVS data set

While running the job I am getting the following error:

Db_Grouping,0: Write failed with error 32: Broken pipe (CC_DB2Utils::writeToPipe, file CC_DB2Utils.cpp, line 1,625)
550-SVC99 RETURN CODE=4 S99INFO=0 S99ERROR=38656 HEX=9700 S99ERSN code X'000003E9'.
cat: write error: Broken pipe
(CC_DB2ZLoadRecordDataSetConsumer::complete, file CC_DB2ZLoadRecordDataSetConsumer.cpp, line 3,463)

Please let me know how we can avoid this error and run our job without any issues.


Thanks,
Gowri shankar H
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Hmmm... guessing you'll need to involve your official support provider on that one. Are you using the DB2 Connector?
-craig

"You can never have too many knives" -- Logan Nine Fingers
Mike
Premium Member
Posts: 1021
Joined: Sun Mar 03, 2002 6:01 pm
Location: Tampa, FL

Post by Mike »

Since you're getting some kind of write failure, make sure that you have the appropriate permissions to create the MVS data set that you've specified in the load method.

Mike
qt_ky
Premium Member
Posts: 2895
Joined: Wed Aug 03, 2011 6:16 am
Location: USA

Post by qt_ky »

How quickly does it abort? How large does the file get before aborting?

Check for possible operating system or user-imposed file size limits (e.g., a 2 GB limit) on the system where the file is being written. If that's causing the abort, work with your admin to increase or remove the limits.
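A quick way to inspect those limits on the DataStage engine host is the shell's `ulimit` builtin. This is just a sketch; the exact limits that matter will depend on your OS and how the engine user's environment is configured:

```shell
# Check per-process resource limits for the user running the DataStage engine.
# "ulimit -f" reports the maximum file size (in 512-byte blocks, or "unlimited");
# a finite value here can cause a write to fail mid-load with a broken pipe.
fsize_limit=$(ulimit -f)
echo "file size limit: $fsize_limit"

# Full list of limits (open files, file size, etc.) for reference.
ulimit -a
```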
Choose a job you love, and you will never have to work a day in your life. - Confucius
gowrishankar_h
Participant
Posts: 42
Joined: Wed Dec 26, 2012 1:13 pm

Post by gowrishankar_h »

Yes, I am using the DB2 Connector stage. I don't see anything in the error about access or permission denied; the error says broken pipe. We actually do a truncate and then the bulk load. The truncate takes about a minute, and within a few seconds after that we get this error and the job doesn't move any further, but it does not abort unless we cancel it manually.
qt_ky
Premium Member
Posts: 2895
Joined: Wed Aug 03, 2011 6:16 am
Location: USA

Post by qt_ky »

I was assuming that the load method mentioned actually lands a temporary data file somewhere in the background. The broken pipe error can be the result of an operating-system-imposed file size limit (e.g., a 2 GB limit). If it fails within a few seconds, though, it is probably caused by something else; I'm sure the broken pipe error can have many other causes too. It may still be worth verifying that you are able to create large files OK.
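One way to verify that is to create a file just over the 2 GB boundary in the scratch or temp area the load writes to. A minimal sketch, assuming `TMPDIR` (or `/tmp`) stands in for your actual work directory:

```shell
# Sketch: confirm a file larger than 2 GB can be created where the temporary
# load data lands. The path below is an assumption; substitute your real
# scratch/work directory.
testfile="${TMPDIR:-/tmp}/large_file_test"

# Seek just past the 2 GB mark and write one 1 MB block; this fails with
# "File too large" if a 2 GB limit is in effect.
if dd if=/dev/zero of="$testfile" bs=1M seek=2049 count=1 2>/dev/null; then
    large_file_status=ok
else
    large_file_status=failed
fi
echo "large file test: $large_file_status"

rm -f "$testfile"
```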
Choose a job you love, and you will never have to work a day in your life. - Confucius
clarcombe
Premium Member
Posts: 515
Joined: Wed Jun 08, 2005 9:54 am
Location: Europe

Post by clarcombe »

We had this error and it took me a week to discover the solution.

ftp was not installed. Check your ftp configuration before continuing your search.
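A simple first check along those lines is whether an ftp client exists on the engine host at all. This sketch only tests for the binary's presence; it does not verify connectivity to the z/OS host:

```shell
# Check whether an ftp client is installed on the DataStage engine host.
# The MVS data set load method depends on transferring data to z/OS, so a
# missing client can surface as a broken pipe rather than a clear error.
if command -v ftp >/dev/null 2>&1; then
    ftp_status="installed at $(command -v ftp)"
else
    ftp_status="not installed"
fi
echo "ftp client: $ftp_status"
```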
Colin Larcombe
-------------------

Certified IBM Infosphere Datastage Developer