This is just an idea.

Message queues - pick your flavor.

The server and each client have an incoming queue.

The server's queue thread runs continuously, listening to its incoming queue,
and posts all the updates / inserts / deletes back to the client queues
(except the queue of the client that made the update).

The clients connect when they can, pull down as much information as possible
from their queue, and apply the changes. (The queue is ordered, so that
shouldn't be a problem.)
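
Something like this minimal sketch, say, with in-memory BlockingCollections
standing in for whatever queue product you pick (ChangeMessage and all the
names here are made up for illustration):

    using System.Collections.Concurrent;
    using System.Collections.Generic;

    // One change posted by a client: insert / update / delete plus row data.
    class ChangeMessage
    {
        public string ClientId;   // which client made the change
        public string Table;
        public string Operation;  // "insert", "update" or "delete"
        public string RowData;    // serialised row
    }

    class SyncServer
    {
        // In-memory stand-ins for the real queues (MSMQ, RabbitMQ, ...).
        BlockingCollection<ChangeMessage> serverQueue =
            new BlockingCollection<ChangeMessage>();
        Dictionary<string, BlockingCollection<ChangeMessage>> clientQueues =
            new Dictionary<string, BlockingCollection<ChangeMessage>>();

        // The server's queue thread: runs forever, applies each change to
        // the master database, then fans it out to every client queue
        // except the sender's.
        public void Pump()
        {
            foreach (ChangeMessage msg in serverQueue.GetConsumingEnumerable())
            {
                // ... apply msg to the master database here ...
                foreach (var entry in clientQueues)
                    if (entry.Key != msg.ClientId)
                        entry.Value.Add(msg);
            }
        }
    }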

I suppose there is a way to do it with SQL Server. Are all the clients
working with the same version?


Davy,

The US Congress voted pizza sauce a vegetable. Don't even try to convince
me that anything in the States is sane any more!


On Wed, Mar 27, 2013 at 11:06 AM, Corneliu I. Tusnea <corne...@acorns.com.au> wrote:

> Greg,
>
> I'm sure the SQL guys will tell you about some "magical tool" that can do
> all of this for you hands free and without any headaches (fingers crossed),
> but my take would be the good old REST API model.
>
> 1. For every table have three columns: LastUpdated, LastUploaded and
> LastDownloaded. Every time you change a row locally, set its LastUpdated to
> UTC now (never use local times!).
> 2. Keep a table with the "sync status" of each table where all you need to
> store is the TableName, LastUploaded and LastDownloaded.
> 3. Have a background thread that tries to monitor for network events
> (don't continuously try to ping your server, as you'll burn the battery of
> those devices).
> http://www.codeproject.com/Articles/64975/Detect-Internet-Network-Availability
> 4. When you have connectivity, all you need to do is select the top 100
> rows from each table where the LastUploaded stored for that table in the
> status table < the LastUpdated of the row.
> (I don't know if I'm making sense, but basically you want to select all
> the rows that have changed since the LastUploaded point in your status
> table; see the sketch after this list.)
> You then try to push those back to your server. For every row that "made
> it" to the server you update the LastUploaded to UtcNow, or even better,
> to the time just before you started the sync.
> 5. You do the reverse for downloading data. You ask the server for all
> changes since your LastDownloaded. Once all the changes have been received,
> you update your own LastDownloaded.
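>
> A rough sketch of step 4 for one table, assuming plain ADO.NET; "Orders" is
> a placeholder table name following the conventions above, and the actual
> REST push is elided:
>
>     using System;
>     using System.Data.SqlClient;
>
>     // One upload pass for one table. syncStartUtc is captured just before
>     // the pass starts, so that is where the watermark ends up.
>     static void UploadChanges(SqlConnection local, DateTime syncStartUtc)
>     {
>         var select = new SqlCommand(@"
>             SELECT TOP 100 *
>             FROM Orders
>             WHERE LastUpdated > (SELECT LastUploaded FROM SyncStatus
>                                  WHERE TableName = 'Orders')
>             ORDER BY LastUpdated", local);
>         using (var reader = select.ExecuteReader())
>         {
>             // ... POST each row to the server's REST endpoint here,
>             //     bailing out if any row fails ...
>         }
>
>         // Every row made it: move the watermark to the start of the pass
>         // (not UtcNow), so rows changed mid-sync are picked up next time.
>         var bump = new SqlCommand(
>             "UPDATE SyncStatus SET LastUploaded = @t WHERE TableName = 'Orders'",
>             local);
>         bump.Parameters.AddWithValue("@t", syncStartUtc);
>         bump.ExecuteNonQuery();
>     }
>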
> With a bit of reflection and some clear naming conventions you could code
> all of this generically enough that you can simply run it on your database,
> regardless of the number of tables & columns.
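>
> For the generic version, the reflection half might look something like this
> (just a sketch; the "MyApp.Entities" namespace and the class-name ==
> table-name convention are assumptions):
>
>     using System.Linq;
>     using System.Reflection;
>
>     // One routine syncs every table: find the entity types by convention
>     // and build the same watermark query for each of them.
>     foreach (var type in Assembly.GetExecutingAssembly().GetTypes()
>                  .Where(t => t.Namespace == "MyApp.Entities"))
>     {
>         string table = type.Name;  // convention: class name == table name
>         string sql = string.Format(
>             "SELECT TOP 100 * FROM [{0}] " +
>             "WHERE LastUpdated > (SELECT LastUploaded FROM SyncStatus " +
>             "WHERE TableName = '{0}') ORDER BY LastUpdated", table);
>         // ... run sql, push the rows, bump SyncStatus as in step 4 ...
>     }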
>
> I'm now going to let the SQL guys deliver their magical tool :)
>
> Regards,
> Corneliu.
>
>
>
>
> On Wed, Mar 27, 2013 at 8:41 PM, Greg Harris <harris.gre...@gmail.com> wrote:
>
>> Dear People,
>>
>>
>> I need some help coming up with good ideas for a design issue I am facing…
>>
>>
>> The application will be geographically dispersed and only occasionally
>> connected to the internet with a slow / unreliable connection.
>>
>> The users at remote branch offices are doing daily data entry into their
>> own local databases (probably SQL Express databases).
>>
>> On a periodic basis the remote branch offices need to synchronise data
>> with head office (probably a full SQL database).
>>
>> Most (99%) of the data will travel from the remote branch offices to head
>> office; some reference data may travel back to the remote branch offices.
>>
>>
>> There are a couple of design ideas that I have had:
>>
>>
>> SQL Server Replication (
>> http://msdn.microsoft.com/en-us/library/ms151198.aspx): I do not know how
>> well this will work on wires of such poor quality, or how easy (or hard)
>> it will be to support remotely.
>>
>>
>> Program-based updates: have a program running in the background at each
>> site that attempts to connect to head office and transfer data.  All rows
>> would have a transferred-status flag that would be set once a successful
>> transfer has been acknowledged.
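>>
>> Roughly this shape, say (only a sketch: HeadOffice, TryConnectToHeadOffice,
>> SelectRows, Execute and Send are all invented placeholders):
>>
>>     using System;
>>     using System.Threading;
>>
>>     // Background loop at each site. Rows are flagged only after head
>>     // office acknowledges them, so nothing is lost on a drop-out.
>>     while (true)
>>     {
>>         HeadOffice headOffice;  // placeholder connection type
>>         if (TryConnectToHeadOffice(out headOffice))
>>         {
>>             foreach (var row in SelectRows(
>>                 "SELECT * FROM Orders WHERE Transferred = 0"))
>>             {
>>                 if (headOffice.Send(row))  // acknowledged by head office
>>                     Execute("UPDATE Orders SET Transferred = 1" +
>>                             " WHERE Id = " + row.Id);
>>             }
>>         }
>>         Thread.Sleep(TimeSpan.FromMinutes(5));  // try again later
>>     }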
>>
>>
>> File extracts: once an hour, produce a text file (with a checksum) of all
>> data entered in the last hour; a background job copies the file to the head
>> office server, which then applies the updates.
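>>
>> For the checksum part, something like this would do (a sketch; SHA-256 over
>> the extract file, with the file names invented):
>>
>>     using System;
>>     using System.IO;
>>     using System.Security.Cryptography;
>>
>>     // Write <extract>.sha256 next to the extract so head office can
>>     // verify the file arrived intact before applying the updates.
>>     static void WriteChecksum(string extractPath)
>>     {
>>         using (var sha = SHA256.Create())
>>         using (var stream = File.OpenRead(extractPath))
>>         {
>>             byte[] hash = sha.ComputeHash(stream);
>>             File.WriteAllText(extractPath + ".sha256",
>>                 BitConverter.ToString(hash).Replace("-", ""));
>>         }
>>     }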
>>
>>
>> Please share with me and the group what design ideas and experiences you
>> have had that worked well and the ones you would avoid if faced with the
>> same design decision again today.
>>
>>
>> Many thanks
>>
>> Greg Harris
>>
>
>
