You should definitely consider paging those records into smaller chunks and
returning only the number of records the client actually needs (wants to
display). If you just want a workaround, another strategy worth considering
is encoding the data into a zip stream and decompressing it on the client.
That could significantly improve your system's performance.
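The thread doesn't specify a stack, so here is a minimal, generic sketch of the chunk-and-compress idea in Python (the function name, chunk size, and JSON encoding are placeholders; each returned payload would become one web-service call with a gzip content encoding):

```python
import gzip
import json

def chunk_payloads(records, chunk_size=200):
    """Split records into chunks and gzip each JSON-encoded chunk.

    Each payload in the returned list would be sent as the body of one
    web-service call; the transport and endpoint are left to the caller.
    """
    payloads = []
    for start in range(0, len(records), chunk_size):
        chunk = records[start:start + chunk_size]
        # Compress the serialized chunk to cut bandwidth on slow links.
        payloads.append(gzip.compress(json.dumps(chunk).encode("utf-8")))
    return payloads
```

The server side would decompress each body and append the decoded records; because each call carries only a bounded chunk, a single timeout or failure costs one retry of 200 records rather than the whole 2000-record upload.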

- Goran


On Sat, Jan 24, 2009 at 1:31 AM, Yoooder <[email protected]> wrote:

>
> I'm working on a scenario where I have to transfer a relatively large
> dataset (1000-2000 records) from client to server via web service.
> It's implemented and working in a fairly basic manner, but I'm a bit
> concerned about its resiliency: low bandwidth, larger record counts,
> and other variables make me concerned that it might just not be a
> beefy enough implementation.
>
> I've considered breaking the data down into smaller chunks; instead of
> 1 webservice call to push 2000 records I may try a loop to push 200
> records at a time.
>
> Does anyone have any advice on moving relatively large datatables
> across a web-service call?
>