You should definitely consider paging those records into smaller
chunks and returning only the number of records the client actually
needs (wants to display).
If you just want a workaround, another interesting strategy you might
consider is compressing the data into a zip stream and decompressing
it on the client. That could significantly improve your system's
performance.

- Goran
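
The zip-stream idea above can be sketched roughly as follows. This is a minimal illustration in Python rather than the poster's .NET stack; `compress_payload` and `decompress_payload` are made-up names standing in for whatever serialization the real service uses:

```python
import gzip
import json

def compress_payload(records):
    """Serialize records to JSON and gzip-compress the bytes before sending."""
    raw = json.dumps(records).encode("utf-8")
    return gzip.compress(raw)

def decompress_payload(blob):
    """Reverse step on the receiving side: gunzip, then parse the JSON."""
    return json.loads(gzip.decompress(blob).decode("utf-8"))

# Example: 2000 repetitive records, like the dataset in the question.
records = [{"id": i, "name": "item-%d" % i} for i in range(2000)]
blob = compress_payload(records)
assert decompress_payload(blob) == records
```

Repetitive tabular data like this usually compresses to a small fraction of its serialized size, so the wire cost of a large call drops considerably, at the price of some CPU on both ends.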

On Jan 24, 10:01 pm, Al Longobardi <[email protected]> wrote:
> If you are transferring a lot of data between your database and your client
> application, it means the design needs a little work. I would try
> implementing some type of paging logic so that you only work with a small
> subset of your main data. For example: a display page size of 20 data
> items instead of thousands of records at a time.
> There are a lot of good articles on the net that you can take a look at.
>
> Take a look at this article:
>
> http://www.codeproject.com/KB/aspnet/PagingLarge.aspx
>
> I have been in the same situation and have resorted to paging and using
> session variables to store datasets to improve performance.
>
> Hope this helps,
> Al
>
> On Fri, Jan 23, 2009 at 7:31 PM, Yoooder <[email protected]> wrote:
>
> > I'm working on a scenario where I have to transfer a relatively large
> > dataset (1000-2000 records) from client to server via web service.
> > It's implemented and working in a fairly basic manner, but I'm a bit
> > concerned about its resiliency: low bandwidth, larger record counts,
> > and other variables make me worried that it might just not be a
> > beefy enough implementation.
>
> > I've considered breaking the data down into smaller chunks; instead of
> > 1 webservice call to push 2000 records I may try a loop to push 200
> > records at a time.
>
> > Does anyone have any advice on moving relatively large datatables
> > across a web-service call?
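
The batching loop described in the question (200 records per call instead of 2000 in one) can be sketched like this. Python for illustration only; `push_batch` is a hypothetical stand-in for the real web-service call:

```python
def chunked(records, size):
    """Yield successive slices of `records`, each at most `size` items long."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

sent = []
def push_batch(batch):
    # Placeholder for the actual web-service call; here we just record it.
    sent.extend(batch)

records = list(range(2000))
for batch in chunked(records, 200):
    push_batch(batch)  # 10 calls of 200 records each

assert sent == records
```

Each call is small enough to survive a slow or flaky link, and a failed batch can be retried individually instead of re-sending the whole dataset.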
