Open Hub Extract pack in DataStage

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.

Moderators: chulett, rschirm, roy

Dodge
Participant
Posts: 4
Joined: Sun Apr 18, 2010 4:59 pm

Open Hub Extract pack in DataStage

Post by Dodge »

BW_Open_Hub_Extract_Stage_4_3_2_0,0: Fatal Error: Fatal: BWPACK_E_00028: ERROR: Timeout while waiting for BW to start load after process chain starts


Please help me ASAP. While running the job I am getting this error and the job gets aborted.

I created the Open Hub, Transformation, DTP and Process Chain to execute the DataStage job.

When I execute this from SAP BW, the job runs fine.

When I run this job from DataStage, the job gets aborted.
KoolKid
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Sorry, but this is a peer-to-peer support site so there's no concept of "ASAP"... that or our concept probably differs radically from what you're looking for.

For that kind of help you should go to your official support provider; that's one of the reasons your company pays them big bucks every year.

And before you ask, no, I've got no experience whatsoever with SAP anything, so I can't really help except (perhaps) to manage your expectations.
-craig

"You can never have too many knives" -- Logan Nine Fingers
asorrell
Posts: 1707
Joined: Fri Apr 04, 2003 2:00 pm
Location: Colleyville, Texas

Post by asorrell »

There are numerous (like at least two dozen) reasons for that error to occur, but assuming the following:

1) You have SAP BW Pack installed and configured correctly and
2) You have a properly configured SAP BW connection in the DataStage Administrator for SAP (note: this includes the fact that the login id for BW must be a SYSTEM-type id)

-then-

You probably have either not pre-run the BW job or not updated your BW 3rd-party parameters correctly.

Walk through your extract stage in your job (select connection, data source, job chain and Open Hub destination). Does it fill in the columns automatically in the stage? If not, the chain needs to be run once on the BW side to set up the Open Hub temp table initially.

Then click on "Update BW" button. If it says "OK". You should be good. Just to confirm, you should see those same three parameters it responds with set on the job chain in BW under "3rd Party Parameters".

Then you will have to clear up any debris (leftover flags) from the previous bad runs. This is a bug that isn't fixed until 8.1 Fix Pack 1. When a job aborts it can leave a lot of flag files like ".pc", ".linked" and ".started" in the Data directory under the "Connection" directory named for your connection under /opt/IBM/InformationServer/DSBWConnections. Clear out all the old flags.

And no, there's no way to know which BW job left those files around, or whether they are still being used, so don't clear them out if other BW jobs are running at the time.
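
If it helps, here is a rough Python sketch of what clearing that debris can look like. The base path matches the one above, but the connection name ("MY_BW_CONNECTION") is just an example - use the directory named for your own connection - and only pass --remove once you're certain no other BW jobs are running:

#!/usr/bin/env python
# Rough sketch: list (and optionally remove) stale BW Pack flag files
# left behind by aborted runs. The connection name is an example only.
import os
import sys

BASE = "/opt/IBM/InformationServer/DSBWConnections"
CONNECTION = "MY_BW_CONNECTION"   # example name - substitute your own connection
FLAG_SUFFIXES = (".pc", ".linked", ".started")

data_dir = os.path.join(BASE, CONNECTION, "Data")
stale = [f for f in os.listdir(data_dir) if f.endswith(FLAG_SUFFIXES)]

if not stale:
    print("No flag files found in " + data_dir)
    sys.exit(0)

for name in stale:
    print("Found flag file: " + os.path.join(data_dir, name))

# Only delete the flags when you are sure no other BW jobs are running.
if "--remove" in sys.argv:
    for name in stale:
        os.remove(os.path.join(data_dir, name))
        print("Removed " + name)

Run it with no arguments first so it only lists what it finds.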

If you've never had any BW extract connection work, then you might have a configuration problem. There are several pages in the manual about both proper configuration (with verification) and RFC tracing that you should check out.
Andy Sorrell
Certified DataStage Consultant
IBM Analytics Champion 2009 - 2020
dump
Participant
Posts: 1
Joined: Wed Jun 09, 2010 9:38 am

SAP vs DS

Post by dump »

Hi all,

We have an SAP Open Hub that downloads data from a 50 billion row table - it takes 1 hour - and after that DS takes this data in 2 hours. Is that acceptable from a performance point of view?

Regards,
Ivan