Webservice call for larger volume records

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

bj_ds7
Premium Member
Posts: 32
Joined: Fri Dec 13, 2013 2:26 pm

Webservice call for larger volume records

Post by bj_ds7 »

In one of our business scenarios, there is a requirement to call a web service for all of the input records (volume: 2 million), and the web service response data then has to be stored in a database. What is the best way to implement this scenario in terms of performance? Would using the Web Services Transformer stage be the right choice? (source --> Web Services Transformer --> target) Please suggest.
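
For context, the straight transformer approach means one request/response round trip per input row. A rough sketch of that pattern, in Python purely for illustration (the endpoint, namespace and field names below are invented, not from our actual service):

# Hypothetical sketch of the per-record pattern: one SOAP request/response
# round trip for every input row. Endpoint, namespace and field names are
# invented for illustration only.
import requests

ENDPOINT = "https://example.com/customer-service"   # made-up URL

SOAP_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:svc="http://example.com/svc">
  <soapenv:Body>
    <svc:lookupCustomer>
      <svc:customerId>{customer_id}</svc:customerId>
    </svc:lookupCustomer>
  </soapenv:Body>
</soapenv:Envelope>"""

def call_per_record(records):
    """Issue one web service call per input record (2 million calls for 2M rows)."""
    for rec in records:
        payload = SOAP_TEMPLATE.format(customer_id=rec["customer_id"])
        resp = requests.post(ENDPOINT, data=payload,
                             headers={"Content-Type": "text/xml; charset=utf-8"})
        resp.raise_for_status()
        yield resp.text   # response would then be parsed and loaded to the target table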
Thanks & Regards!
BJ
eostic
Premium Member
Posts: 3838
Joined: Mon Oct 17, 2005 9:34 am

Post by eostic »

2 million rows? How big?

That may or may not be huge, but normally, no... web service protocols (SOAP) aren't the best for large quantities of data. One preferred method is to use a web service to kick off a secure FTP file transfer, or another secure protocol better suited to large volumes.
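
Purely as an illustration of that "small call to trigger, bulk data over a file protocol" pattern (not DataStage code; the host, endpoint and credentials are made up, and it assumes the paramiko package for SFTP):

# Hypothetical sketch: move the bulk data over SFTP and use a single,
# lightweight web service call only to notify the application.
# Host, endpoint and file names are invented for illustration.
import paramiko
import requests

def push_bulk_file_and_notify(local_file):
    # 1. Transfer the full extract (e.g. the 2 million rows) over SFTP.
    transport = paramiko.Transport(("sftp.example.com", 22))      # made-up host
    transport.connect(username="etl_user", password="secret")     # or key-based auth
    sftp = paramiko.SFTPClient.from_transport(transport)
    sftp.put(local_file, "/inbound/customer_extract.dat")
    sftp.close()
    transport.close()

    # 2. One small service call tells the application the file is ready.
    resp = requests.post("https://example.com/api/extract-ready",  # made-up endpoint
                         json={"file": "customer_extract.dat"})
    resp.raise_for_status()

The web service then carries only a tiny notification payload; the 2 million rows travel over a protocol built for bulk transfer.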

Ernie
Ernie Ostic

blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>
bj_ds7
Premium Member
Posts: 32
Joined: Fri Dec 13, 2013 2:26 pm

Post by bj_ds7 »

Thanks, Ernie, for the reply. In my scenario I have to send those records to a live running application. Is there any other option that would let us call the web service by creating a request file for a batch of records, instead of calling it one record at a time using the web service stage?
Thanks & Regards!
BJ
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

That would depend entirely on the author of the service.
-craig

"You can never have too many knives" -- Logan Nine Fingers
eostic
Premium Member
Posts: 3838
Joined: Mon Oct 17, 2005 9:34 am

Post by eostic »

As Craig notes, the author, if expecting to receive large quantities of data, could approach it in a lot of ways: via arrays (so that at least you aren't making 2 million separate calls), which increases reliability and reduces the "chattiness", perhaps by having you send 5,000, 10,000 or some other "n" number of rows at a time (you package them beforehand in a single XML document); or via other techniques, depending on how they want you to send the data, their expectations for security and performance, network reliability, and so on.
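
As a rough sketch of the batching idea (not DataStage code; the element names, endpoint and 5,000-row batch size are just placeholders for whatever the service author actually defines):

# Hypothetical sketch of the "array"/batch approach: package n rows at a time
# into one XML document and make one call per batch instead of one per row.
# Element names, endpoint and batch size are invented for illustration.
import requests
import xml.etree.ElementTree as ET

ENDPOINT = "https://example.com/customer-service/batch"   # made-up URL
BATCH_SIZE = 5000

def batches(records, size):
    """Yield successive lists of `size` records."""
    batch = []
    for rec in records:
        batch.append(rec)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

def send_in_batches(records):
    """Send the rows in batched XML documents, one web service call per batch."""
    for batch in batches(records, BATCH_SIZE):
        root = ET.Element("customerBatch")
        for rec in batch:
            cust = ET.SubElement(root, "customer")
            ET.SubElement(cust, "id").text = str(rec["customer_id"])
            ET.SubElement(cust, "name").text = rec["name"]
        payload = ET.tostring(root, encoding="utf-8")
        resp = requests.post(ENDPOINT, data=payload,
                             headers={"Content-Type": "application/xml"})
        resp.raise_for_status()

At 5,000 rows per request, 2 million rows become 400 calls instead of 2 million, which is where most of the win comes from.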

Ernie
Ernie Ostic

blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>
bj_ds7
Premium Member
Posts: 32
Joined: Fri Dec 13, 2013 2:26 pm

Post by bj_ds7 »

Thanks, Craig and Ernie. :)
Thanks & Regards!
BJ