I'm working on a scenario where I have to transfer a relatively large dataset (1000-2000 records) from client to server via a web service. It's implemented and working in a fairly basic way, but I'm concerned about its resiliency: low bandwidth, larger record counts, and other variables make me worry that the implementation just isn't beefy enough.
I've considered breaking the data into smaller chunks: instead of one web-service call to push 2000 records, I may loop and push 200 records at a time. Does anyone have advice on moving relatively large DataTables across a web-service call?
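To make the chunking idea concrete, here's a minimal sketch of the batching loop (in Python just for illustration; `push_batch` is a hypothetical stand-in for the actual web-service call, and the 200-record batch size is the one mentioned above):

```python
def chunked(records, size):
    """Yield successive fixed-size batches from a list of records."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def push_in_batches(records, push_batch, size=200):
    """Send records in batches of `size`.

    push_batch is a hypothetical callable representing one
    web-service round trip. Returns the number of calls made,
    so the caller can see how many round trips occurred.
    """
    calls = 0
    for batch in chunked(records, size):
        push_batch(batch)  # one web-service call per batch
        calls += 1
    return calls
```

One nice side effect of this structure is that a failed batch can be retried on its own instead of resending the whole dataset.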
