Web service sizes and reliability

2004-05-20 Thread Burns, John D
I'm getting into using web services more (from Flash and CF) and I'm
curious if there is an amount of data that is considered reliable and
if there is a set amount that is considered too much for web services
to handle. Just as an off-the-wall example, let's say I have a server
with information in a database (1000 records and about 10 columns). If
I query that information out of the database and want to call a web
service on another machine to pass that data to machine 2 and have it do
some data scrubbing and then insert it into a database on that computer, is
that too much to pass via a web service? Is it better to loop over the
1000 records and pass them to the web service one at a time, or is it
better to pass a struct of all 1000 records at once and do the parsing
on the web service side?

 
If anyone has examples of things that are and are not reliable and maybe
some general pointers on when web services are a good thing and when
they might not be the best tool for the job, please share. Thanks!

 
John Burns




Re: Web service sizes and reliability

2004-05-20 Thread Dick Applebaum
John

I have done what you are attempting, many times -- though without web 
services.

Frequently, I would move databases from one remote host to another 
using WDDX.

WDDX and XML get a little slow on big recordsets, and the tag overhead
is typically around 256% -- for every 100 characters of actual data,
you have about 256 characters of XML tags.
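To make that overhead concrete, here is a rough Python stand-in (the element names and encoding are invented for illustration, not actual WDDX output) that serializes a tiny recordset and measures how much of the packet is markup rather than data:

```python
# Illustrative sketch: serialize a small recordset into WDDX-like XML
# and compare the packet size against the raw data it carries.
rows = [{"id": str(i), "name": f"user{i}"} for i in range(3)]

def to_xml(rows):
    # Minimal XML-ish encoding: one element per field, per row.
    out = ["<recordset>"]
    for row in rows:
        out.append("<row>")
        for key, val in row.items():
            out.append(f"<field name='{key}'>{val}</field>")
        out.append("</row>")
    out.append("</recordset>")
    return "".join(out)

packet = to_xml(rows)
data_chars = sum(len(v) for row in rows for v in row.values())
overhead = (len(packet) - data_chars) / data_chars
print(f"{overhead:.0%} tag overhead")  # markup dwarfs the data for short values
```

With short field values like these, the markup is several times the size of the payload, which is the effect Dick describes; longer field values dilute the ratio somewhat.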

Other than that, I never experienced reliability problems with
the process. The largest db I moved was 33 MB.

Now, some opinion...

In the situation that you describe, the format and structure of the
data is known in advance by both the publisher and consumer of the web
service. IMO, WDDX/XML (and its associated overhead) is not justified
for this use:

-- the data does not need to be self-defining within the transmitted 
packet

-- the data does not need to be human-readable

Instead, I would use the concept of thinArrays,* where the overhead is
typically 13% (instead of 256%).

This would be much faster, use less bandwidth, and be less exposed to
reliability problems caused by long transmissions.

With thinArrays, the recordset is encoded into a single string with a
single-character separator between each cell of data -- this is similar
to a CF list (with a couple of nuances). The WDDX/XML packet contains
the header and open/close tags for the thinArray.
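As an illustrative sketch of the idea (an assumed encoding, not Rob Rohan's actual thinArray format), here is how a recordset might be flattened into one separator-delimited string with a small column header:

```python
# Sketch of the thinArray concept: flatten a recordset into one delimited
# string plus a tiny header naming the columns, so individual cells need
# no self-describing tags.
SEP = "\x1f"  # ASCII unit separator; unlikely to appear in real data

def encode_thin(columns, rows):
    header = ",".join(columns)
    cells = SEP.join(str(row[c]) for row in rows for c in columns)
    return header + "\n" + cells

def decode_thin(packet):
    header, body = packet.split("\n", 1)
    columns = header.split(",")
    cells = body.split(SEP)
    return [dict(zip(columns, cells[i:i + len(columns)]))
            for i in range(0, len(cells), len(columns))]

rows = [{"id": "1", "name": "alice"}, {"id": "2", "name": "bob"}]
packet = encode_thin(["id", "name"], rows)
assert decode_thin(packet) == rows  # round-trips cleanly
```

Over 1000 rows the one-time header cost amortizes to almost nothing, and the per-cell overhead is just the single separator character -- which is how the overhead stays in the low-percent range instead of multiples of the data.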

You can see/get the thinArray code at Rob Rohan's site.

 http://www.rohanclan.com/products/neuromancer/

Note: The client-side part of Rob's Neuromancer system only runs on
newer browsers -- Mozilla, Firefox, etc. The server-side (including
thinArrays) runs on anything.

All of the demos use thinArrays and web services.

The source contains:
-- server-side CF apps
-- server-side CFCs that serialize/deserialize recordsets (CF queries
and arrays) into thinArrays
-- server-side web service CFCs to publish/consume the thinArrays
-- client-side js functions to consume web services (including
thinArrays)
-- client-side js functions to serialize/deserialize the thinArrays into
js recordsets
-- client-side js functions to manipulate js recordsets
-- client-side PIA apps

HTH

Dick


