Hello good folks,

Our PHP application uses browser clients and magnetic card readers to 
collect workers' working times from a number of physical construction / 
project sites into the main database at the head office, which is 
online at a private IP address.

Now we have a situation where some construction site offices are not 
online at all, or have such poor / unstable connections that we can't 
rely on them. So we need to install a local server at those sites to 
collect the data.

What would be a proper way to upload the work time data to the main 
database when a local server is online again?

Also, all the workers and sites are created and maintained at the head 
office, so we need to get this data down to the local servers. There 
may / will also be corrections to the already uploaded work time data.
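
To make the download direction concrete, here is roughly what I have in 
mind: a plain refresh of the locally cached master tables. All host / 
user / table names below are invented, and this assumes no foreign keys 
on the local server point at the copied tables:

#!/bin/sh
# Crude refresh of the local copies of the master data.
# (pg_dump accepts multiple -t switches from 8.2 on.)
psql -U sync localdb -c "TRUNCATE workers; TRUNCATE sites"
pg_dump -h headoffice.example.com -U sync -a -t workers -t sites maindb \
    | psql -U sync localdb

Corrections to already uploaded work time rows could presumably come 
down the same way, keyed on a unique row identifier.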

I imagine all this could be done with unique row identifiers, temp 
tables and shell / SQL scripts, but why reinvent the wheel ...
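
For example, something along these lines for the upload step. Again, 
all table / column names are made up, and the 'uid' column is assumed 
to be globally unique across sites (e.g. site id plus a local 
sequence):

#!/bin/sh
# Push not-yet-uploaded work time rows to the head office and
# merge them there, so re-running after a dropped connection
# does not create duplicates.

HEAD="psql -h headoffice.example.com -U sync -v ON_ERROR_STOP=1 maindb"
LOCAL="psql -U sync localdb"

# 1. copy pending rows out of the local database
#    (COPY (SELECT ...) works from 8.2 on)
$LOCAL -c "COPY (SELECT uid, worker_id, site_id, started, ended
                 FROM worktime WHERE uploaded = false) TO STDOUT" > pending.dat

# 2. load them into a temp table at the head office and insert
#    only the rows whose uid is not there yet
$HEAD <<'EOF'
BEGIN;
CREATE TEMP TABLE incoming (LIKE worktime);
\copy incoming (uid, worker_id, site_id, started, ended) from 'pending.dat'
INSERT INTO worktime (uid, worker_id, site_id, started, ended)
    SELECT i.uid, i.worker_id, i.site_id, i.started, i.ended
    FROM incoming i
    WHERE NOT EXISTS (SELECT 1 FROM worktime w WHERE w.uid = i.uid);
COMMIT;
EOF

# 3. mark the rows as uploaded only if the merge succeeded
#    (in real use you would mark only the uids actually sent)
[ $? -eq 0 ] && $LOCAL -c "UPDATE worktime SET uploaded = true WHERE uploaded = false"

Since rows whose uid already exists at the head office are simply 
skipped, a re-run after a broken connection should be harmless.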

All servers run Linux with PostgreSQL 8.2.

I followed the recent thread about 'replication in Postgres', but any 
info on experience with similar setups, and pointers / comments / 
recommendations, are still more than welcome.

BR,
-- 
Aarni Ruuhimäki

