[BackupPC-users] Splitting large files - follow up

2010-02-12 Thread Nigel Kendrick
I got some good advice from you guys on the pros and cons of splitting large
files (MS-SQL database dumps) for backups, and in return here are my findings:

 

1)  Backing up one large file or lots of .part files takes about the
same amount of time, as expected, due to rsync only copying changes

2)  Many of the .part files are identical on a day-by-day basis so we
have significantly reduced the amount of disk space the backups take up due
to BackupPC's use of hard-linking

 

The tool we use to split, verify and join the .part files is called Swiss
File Knife (sfk). It's open source and available for Windows and Linux. sfk
has a lot of other useful features too and is well worth a look if you've
not seen it before:

http://stahlworks.com/dev/swiss-file-knife.html 
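
For readers who want to try the same idea without sfk, here is a minimal
sketch of the split/verify/join cycle using standard GNU coreutils; the chunk
size and file names are arbitrary examples, not the actual sfk invocations
used in this setup:

  # Split the dump into fixed-size chunks; chunks that are unchanged from
  # day to day pool as hard links on the BackupPC server.
  split -b 100M -d mydb.bak mydb.bak.part.
  # Record a checksum of the original so a rejoined copy can be verified.
  md5sum mydb.bak > mydb.bak.md5

  # After restoring the .part files: rejoin and verify.
  cat mydb.bak.part.* > mydb.bak
  md5sum -c mydb.bak.md5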

 

All the best

 

 



Re: [BackupPC-users] Thinking aloud about backup rotation

2010-01-27 Thread Nigel Kendrick


-Original Message-
From: Tino Schwarze [mailto:backuppc.li...@tisc.de] 
Sent: 27 January 2010 11:27
To: backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] Thinking aloud about backup rotation

Hi,

On Wed, Jan 27, 2010 at 10:33:48AM -, PD Support wrote:
 
There is an easier solution: just don't name the files by weekday - back up
into the same file every day, like xyz_db.bak. Then you are free to:
a) copy it somewhere else on the server with a weekday name (and it
doesn't need to be backed up there)
b) just rely on BackupPC for restores

Why would you want to keep one week's worth of backups on the server
itself if BackupPC keeps those backups anyway?

HTH,

Tino.




Hi Tino - I'm just really thinking out loud about options. Local, daily, ZIP
backups will be handy for minor issues as there will be 30+ sites spread
over much of the southern half of the UK, backed up to 5 regional
Linux-based servers. Restoring from a local file (via remote access) will be
quicker than getting the data back to site through ADSL or by car/courier.

We already create a generically-named .bak file and back it up via BackupPC,
but I can see circumstances where the rotation feature might be useful.






[BackupPC-users] Commas in filenames - a problem?

2009-11-11 Thread Nigel Kendrick
Hi,
 
I am trying to back up some documents associated with an old app that uses
the filename to associate the documents with customer records - for example,
here's the name of a Word document:
 
Treatment Doc 10185, 051107.DOC
 
In other words: this is a document associated with record 10185, written on
5th Nov 2007.
 
When I back them up from their linux-based host via rsyncd, every one of
these documents throws up a message in the rsyncd log like:
 
2009/11/11 14:52:14 [14349] file has vanished:
/data/common/users/JUPVET_Docs/Text/Treatment Doc 10185, 051107.DOC (in
server)
 
The backup is still running (there are thousands of documents to
check/backup), but is the comma in the name likely to be causing this? I
have only just added the users path to the backup and previous tests backing
up just /etc have worked fine. As the app is being taken out of service, I
may be able to rename all the docs if necessary.
 
Thanks
 
Nigel
 
 
 
 
 

Nigel Kendrick
IT Associate
Pet Doctors Ltd
Pet Doctors House
Drayton Lane, Merston
Chichester, West Sussex
PO20 1EL
Tel (direct): 01555 708 601
Fax: 01243 782 584

General IT support issues should be sent to supp...@petdoctors.co.uk


 


Re: [BackupPC-users] Commas in filenames - a problem? UPDATED

2009-11-11 Thread Nigel Kendrick
Bah - just as I was doing the tests, someone on site was moving all the docs
to another server without telling me, so they really had vanished! 
 
Now to see whether they will back up using rsyncd on the Windows server - if
the files stay in one place for long enough!
 
Nigel
 

From: Nigel Kendrick [mailto:support-li...@petdoctors.co.uk] 
Sent: Wednesday, November 11, 2009 3:26 PM
To: 'General list for user discussion,questions and support'
Subject: [BackupPC-users] Commas in filenames - a problem?

[snip - original message quoted in full above]
 


Re: [BackupPC-users] BackupPC on OpenSolaris

2009-09-23 Thread Nigel Kendrick
 

-Original Message-
From: Holger Parplies [mailto:wb...@parplies.de] 
Sent: Wednesday, September 23, 2009 9:22 AM
To: linker3...@googlemail.com
Cc: backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] BackupPC on OpenSolaris

Hi,

Pedro M. S. Oliveira wrote on 2009-09-23 07:48:50 +0100 [Re:
[BackupPC-users] BackupPC on OpenSolaris]:
 On Tuesday 22 September 2009 20:29:02 Linker3000 wrote:
  [...]
  Looks like things have moved on and this guide needs updating -

this guide being?

  I had to do a lot more work to get the package installer & perl installed
  and then the link between cgi-bin/index.cgi just didn't go anywhere

Have you got a file BackupPC_Admin anywhere? That is probably what it should
point to. But you'll also need a few images and a stylesheet to be in the
right place for things to work properly. If you need any more help, you
probably need to give *a lot* more details, such as which web server you are
trying to integrate BackupPC with.
And, of course,

 [...]
 What's failing?

;-). While you may want to use the web interface, it's not a component
involved in actually making backups.

Regards,
Holger



Hi Holger, 

My original post was a comment over at the Web-based forums under here:
http://www.backupcentral.com/phpBB2/two-way-mirrors-of-external-mailing-lists-3/backuppc-21/backuppc-on-opensolaris-97029/
so it seems out of context when it replicates to the list.

I followed the Solaris install guidelines but there seemed to be a lot more
to do to get the package installers & C compiler in place in order to load
up all the other required packages and, ultimately, the instructions:

  cd /opt/csw/apache2/share/cgi-bin/
  ln -s BackupPC_Admin index.cgi 

...are ineffective because, as you say, there's no BackupPC_Admin there. I
am sure that a lot of the hassle I have had is down to the fact that I have
not used OpenSolaris before and I am re-doing the installation. I have been
given some advice on IRC (#solaris) and am following it up...


Nigel




Re: [BackupPC-users] Anecdote about backup of changing zip files

2009-09-10 Thread Nigel Kendrick
Hi Folks,

Lively conversation!!

A while back, someone asked how the SQL backups are being done - here's my
take:

Two batch files, two stored procedures and a few command line tools.

* The stored procedures were written by an unknown person. One makes a dump
of the given database and verifies it; the other can reindex the specified
table (optional).

* Batch file 2 takes in command line variables that define what
database/table should be backed up and to where; it generates the SQL code
'on the fly' to do this and then calls the stored procedure using the
command-line tool SQLCMD.EXE to make a .bak file (which we sync off site).
The batch file then makes a dayofweek-stamped, zipped copy of the backup,
which we just leave as a local copy. If the batch file is called with a
specific parameter, it reindexes the specified table.

* Batch file 1 feeds the parameters into batch file 2 by calling it (like a
subroutine) with the required command line parameters for each table I want
to dump. This approach makes it easy to change what is backed up according
to site/need. This batch file is started nightly by a Windows scheduled
task.

The whole lot makes use of the command line version of 7zip and a public
domain DOS command called WHAT.COM that returns ERRORLEVEL set to 0-6
according to the day of week.
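
To make the moving parts concrete, here is a heavily simplified sketch of
what 'batch file 2' might look like; the stored procedure name, paths and
parameters are illustrative assumptions rather than the actual files
described above, and in the real setup the caller supplies the arguments
while WHAT.COM supplies the day of week:

  @echo off
  rem Usage: dumpdb.bat <database> <backupdir> <dayofweek, e.g. wed>
  set DB=%1
  set DEST=%2
  set DOW=%3

  rem Dump via the stored procedure; the generic filename means rsync sees
  rem the same target file every night.
  sqlcmd -S localhost -E -Q "EXEC dbo.usp_BackupDatabase N'%DB%', N'%DEST%\%DB%.bak'"

  rem Day-stamped local copy, zipped with command-line 7-Zip; the !B prefix
  rem keeps BackupPC from copying it off site.
  7za a -tzip "%DEST%\!B_%DB%_%DOW%.zip" "%DEST%\%DB%.bak"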

If anyone wants this 'package' let me know and I'll email it across or put
it up somewhere.

For info, with our new ADSL service in place, 500MB of database backups is
syncing in about 20 mins.

Nigel




Re: [BackupPC-users] Anecdote about backup of changing zip files

2009-09-07 Thread Nigel Kendrick
 
Thanks for the stats on ZIPped MS-SQL db files - that has saved me doing
some tests!

I will eventually have to back up MS-SQL servers on 31 sites to a number of
remote locations, so I am currently experimenting with a number of
strategies.

At the moment, I am backing up as follows:

1) A nightly scheduled batch file runs SQL scripts to dump the tables by
calling a stored procedure

Backups are called backup_tablename_dayofweek.bak (eg:
backup_testdb_wed.bak)

2) The .bak file is renamed to backup_tablename.bak to create a daily
generic backup file which is synced off-site by BackupPC

3) The .bak file is ZIPped to !B_tablename_dayofweek.zip (eg:
!B_testdb_Wed.zip) and this is left in the backup folder as a local copy.
BackupPC is set to ignore files whose names start with !B, so these files are
not copied offsite but are kept for a week until overwritten.
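
For reference, an exclude like that is normally expressed in the host's
config.pl; a minimal sketch, assuming the share name (adjust it to your own
share or rsyncd module):

  $Conf{BackupFilesExclude} = {
      '*' => ['!B*'],   # skip the local day-stamped zips on every share
  };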

It's not easy to determine actual backup speed and performance yet because
we have only rolled out the new app on two sites, and the devs are messing
around with the database schema so a lot of the data in the .bak file is
changing between backups. During initial testing (when things were not being
changed so much), a daily sync of a 700MB database was taking around 20-40
minutes, albeit across an old 512K ADSL (VPN) line with an upload speed of
288Kbit/sec. I have just upgraded this to an LLU service that's currently
running at 5.5Mbit down, 883Kbit up and will be seeing what improvement this
gives.

Our 700MB .bak files ZIP down to around 130MB and I was wondering whether it
would be worth taking this offsite, but syncing the raw dumps may turn out to
be quicker.

Nigel






[BackupPC-users] Rsyncd --sparse flag

2009-08-28 Thread Nigel Kendrick
Hi,
 
Does BackupPC support the --sparse flag for rsyncd remote backups? Searching
for answers led me to 'probably not' in an old post.
 
If it is supported, is there any benefit to using it with my famous database
backup dumps?
 
Thanks
 
Nigel


Re: [BackupPC-users] Running BackupPC on a Pentium III/900 MHz

2009-08-27 Thread Nigel Kendrick
 

-Original Message-
From: Koen Linders [mailto:koen.lind...@koca.be] 
Sent: Thursday, August 27, 2009 6:39 AM
To: 'General list for user discussion,questions and support'
Subject: Re: [BackupPC-users] Running BackupPC on a Pentium III/900 MHz

Hello,

For a small office with 2 PCs I would like to set up a Samba and
backup server (using BackupPC).

I'm running BackupPC on three other servers (Pentium 4, 2.4 GHz).

Since I have an unused Pentium III running at 900 MHz and 512 MB RAM, I
would like to know if this small machine can be used as a BackupPC server.

Why: It is very quiet and has a nice form factor.

Any feedback?

Kind regards

- Phil



I have a Dell PowerEdge 1300 (Dual PIII-450) with 640MB RAM running BackupPC
with no problems. The server backs up 3 others across ADSL VPNs.

Nigel





[BackupPC-users] A Question on full backups

2009-08-26 Thread Nigel Kendrick
Hi,
 
I have had a read, but I am none the wiser...
 
I understand that incremental backups are hard-linked delta copies based on
the last full backup, but when a new full backup is performed, is this done
as a complete new transfer, or are full backups also built from deltas of
the last full backup and/or incrementals?
 
Thanks



Re: [BackupPC-users] A Question on full backups

2009-08-26 Thread Nigel Kendrick

-Original Message-
From: Les Mikesell [mailto:lesmikes...@gmail.com] 
Sent: Wednesday, August 26, 2009 3:58 PM
To: General list for user discussion,questions and support
Subject: Re: [BackupPC-users] A Question on full backups

Nigel Kendrick wrote:
 Hi,
  
 I have had a read, but I am none the wiser...
  
 I understand that incremental backups are hard-linked delta copies based 
 on the last full backup, but when a new full backup is performed, is 
 this done as a complete new transfer, or are full backups also built 
 from deltas of the last full backup and/or incrementals?

You need to separate the xfer method from the storage to understand this.
With tar or smb methods, the entire contents of a full are transferred. With
rsync or rsyncd, this is a delta against the previous full (or incrementals
if using levels), but if any content has changed in a file, a complete new
copy is reconstructed on the backuppc server side. Then, regardless of the
method, all content that matches a file in the pool is discarded and replaced
with a hardlink - and all new/changed files have a new link added to the pool
(there are no per-file deltas stored).

-- 
   Les Mikesell
lesmikes...@gmail.com



Thanks Les, 

Yes, I meant this with reference to rsync/rsyncd.

Just going back to my previous query about SQL dumps and my daily backups
transferring the whole lot, I now realise that the last full backup was done
when the system was under test and I had not yet included the SQL dump
folder. The dump folder (700MB) was then picked up by the first incremental
and has been transferred ever since because, as I now know, same-level
incremental deltas are done against the last full backup and not between
incrementals.

If my understanding is correct, I presume I should then see a marked
reduction in the size of incrementals after the next full backup?

Nigel




[BackupPC-users] Anyone with experience of MS-SQL database dumps?

2009-08-25 Thread Nigel Kendrick
Hi,
 
This may also be applicable to MySQL etc..
 
I have an MS-SQL database that dumps out a backup that's around 700MB in
size. BackupPC brought this over to its server via ADSL/VPN/rsyncd (422
minutes!) and when I did another data dump about 4 hours later, around 16MB
of changes to the file were transferred very quickly. However, every single
backup since then has gone back to 422 minutes. I could imagine that the
database and dump will change over time, but it seems strange that the
entire file needs copying every time - as if its structure is always 100%
different from the previous one. 
 
I wondered if I was missing anything obvious (being a BackupPC noob) like
file time stamping causing a complete file transfer every day, or are
database dumps likely to be completely different every time?
 
I will keep some copies of the dumps and do some comparisons but in the
meantime any tips, thoughts etc.?
 
Thanks
 
Nigel Kendrick



[BackupPC-users] FW: Centos Install not working - PROGRESS!

2009-08-22 Thread Nigel Kendrick
I am forwarding the message below to the list as the original reply just
went to Holger's personal address (sorry!)

BUT

I have now done some more investigation and found that this is not an suexec
issue but a 'perl taint' one. 

To cut a long story short, I found that the backuppc service start/stop code
in /etc/init.d has a hard-coded reference to the 'backuppc' user, but I am
running all my code as 'backuppc2' - a user I created with a high uid:gid to
satisfy suexec. 

Now that I have changed this reference to 'backuppc2', I am 22GB into a
backup so things are looking hopeful!
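
For anyone hitting the same thing, the change amounts to editing the user
named in /etc/init.d/backuppc; a hypothetical before/after (the wording of
the actual line differs per distribution and package) might look like:

  # /etc/init.d/backuppc - hypothetical excerpt, not the real script
  # before:  su -s /bin/sh backuppc  -c "/usr/bin/BackupPC -d"
  # after:   su -s /bin/sh backuppc2 -c "/usr/bin/BackupPC -d"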

This also explained why I was having problems stopping/restarting backuppc -
I was getting errors reading logs and other files in /var/log/BackupPC and
had noticed the files were created with a uid:gid of backuppc2:backuppc

Would it be simple to change the start/stop script for the next release so
it picked up the right user from config.pl?

Nigel

**

Hi,

Nigel Kendrick wrote on 2009-08-22 01:08:23 +0100 [[BackupPC-users] Centos
Install not working,]:
 [...]
 I have slogged through various errors to get Backuppc working with suexec
 and although I am close, all I get when I try and run the app is a listing
 of the cgi code rather than it being executed. [...]
  
 NameVirtualHost 192.168.200.11:80
  
 <VirtualHost 192.168.200.11:80>
    ServerName [snipped]
    ServerAlias [snipped]
    DocumentRoot /var/www/html
    SuexecUserGroup backuppc2 backuppc2
    ScriptAlias /backuppc/cgi-bin/ /var/www/html/backuppc/cgi-bin/
 </VirtualHost>
  
 <Directory /var/www/html/backuppc/cgi-bin>
    Options +ExecCGI

I'm no Apache expert, but the Debian package contains the lines

 AddHandler cgi-script .cgi
 DirectoryIndex index.cgi

and names the executable index.cgi. As Apache obviously does not recognize
the script as an executable, that might be something that could help (the
second line is only so that the directory name without the index.cgi will
also lead to execution of the script, I believe. That makes sense especially
if the cgi-bin directory is actually the DocumentRoot of the VirtualHost ...).
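
Applied to the Directory block from the original post, that hint amounts to
something like the following sketch (the directives are real Apache ones, but
the exact layout here is an illustration, not a quote from either setup):

  <Directory /var/www/html/backuppc/cgi-bin>
     Options +ExecCGI
     AddHandler cgi-script .cgi
     DirectoryIndex index.cgi
     # ... auth directives as in the original post ...
  </Directory>

  # and name (or symlink) the script so the .cgi handler matches it:
  #   ln -s BackupPC_Admin index.cgi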

[Snip]




Hi Holger,

Many thanks. The cgi lines put me on the right track, together with changing
some group permissions in a folder I'd missed.

I can now get the Web interface running and I can configure things, but jobs
don't run:

2009-08-22 12:32:48 User admin requested backup of accounts (accounts)
Insecure dependency in exec while running setuid at /usr/bin/BackupPC line
709.

I think this is a problem with suexec and the fact that some of the code is
in /usr/lib/BackupPC rather than under the Web document root, but looking at
config.pl, changing this needs a reinstall or lots of references in other
modules changed:

#   InstallDir - where the bin, lib and doc installation dirs reside.
#Note: you cannot change this value since all the
#perl scripts include this path.  You must reinstall
#with configure.pl to change InstallDir.

I am going to take a break from this as it's eating into my weekend, but any
help from anyone appreciated.

Nigel




[BackupPC-users] Centos Install not working,

2009-08-21 Thread Nigel Kendrick
Hi,
 
I have a server running Centos 5.3 and SugarCRM, a Web-based app. As
suspected, if I run the Web server as the backuppc user SugarCRM stops
working.
 
I have slogged through various errors to get Backuppc working with suexec
and although I am close, all I get when I try and run the app is a listing
of the cgi code rather than it being executed. I suspect this an httpd
config issue but having gone round in circles for several hours I am either
missing the obvious or have a fundamental error somewhere.
 
I have put the Web code in /var/www/html/backuppc and pointed config.pl to
the cgi-bin folder below there, and here's my httpd conf file, based on some
notes about running BackupPC in a virtual host to make suexec work. If I run
the backup admin Perl code at the command line as either root or backuppc2, I
do get HTML output.
 
NameVirtualHost 192.168.200.11:80
 
<VirtualHost 192.168.200.11:80>
   ServerName [snipped]
   ServerAlias [snipped]
   DocumentRoot /var/www/html
   SuexecUserGroup backuppc2 backuppc2
   ScriptAlias /backuppc/cgi-bin/ /var/www/html/backuppc/cgi-bin/
</VirtualHost>
 
<Directory /var/www/html/backuppc/cgi-bin>
   Options +ExecCGI
   Order deny,allow
   Deny from all
   Allow from 192.168.0
   Allow from 127.0.0.1
   Allow from all
   AuthName "Backup Admin"
   AuthType Basic
   AuthUserFile /var/lib/backuppc/passwd/htpasswd
   Require valid-user
</Directory>

Any ideas?
 
Thanks
 

Nigel Kendrick


 


[BackupPC-users] spot-checking backups and rsync on cygwin

2009-08-19 Thread Nigel Kendrick
Morning,
 
Happy to report that the first MS-SQL 700MB database dump across ADSL took
433 mins to complete and resulted in a 130MB compressed file, but the night's
first full backup took 22 mins to sync the changes!
 
A few things, then I'll leave you all in peace for a while (maybe!):
 
1) Spot-checking backups:
 
  Is there any in-built or documented way to generate or view the MD5 hash of
a backed-up file so that files in a backup set can be spot-checked against the
MD5 of the original?
   -- if not, it would be a nice feature to be able to click a button or link
to ask for the hash of a file - I presume this would have to trigger a
temporary un-compress and MD5 generation procedure (a hand-rolled way of
doing this is sketched below, after question 3).
 
2) The FAQ states "Rsync running on Cygwin is limited to either 2GB or 4GB
file sizes."

Does this still hold true (the page was last modified in 2006)? I am concerned
because our database dumps are going to grow!
 
3) I have a very sturdy Dell PowerEdge 1300 - a dual PIII-450 box - that
*could* be a backup server, but am I expecting too much of the horsepower?
This box would sit on a remote site and back up around 10 servers via ADSL
VPNs.
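
Following up on question 1, one way to spot-check a file by hand - a sketch
only, since the paths depend on the install's TopDir/InstallDir and on
BackupPC's 'f'-mangled names under the pc/ tree (host, backup number and file
path below are placeholders):

  # hash the stored (compressed) copy on the BackupPC server...
  sudo -u backuppc /usr/share/BackupPC/bin/BackupPC_zcat \
      /var/lib/BackupPC/pc/myhost/123/fdata/fmydb.bak | md5sum
  # ...and compare it with 'md5sum mydb.bak' run on the client.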

Thanks again - have to say that based on feedback so far you are a friendly
bunch! Hope I can contribute to the project in some way.

Nigel




Re: [BackupPC-users] spot-checking backups and rsync on cygwin

2009-08-19 Thread Nigel Kendrick
Cheers Adam,

All our sites use hardware VPNs between Draytek routers so that load's taken
care of.

Looks like scheduling's the key here so as you say it's a case of try it and
see.

With reference to the possible cygwin/rsync file size limit, I am looking at
a command-line file splitter that would be useful if needed - I found this
one and thought I'd share because it seems to have a ton of useful other
functions in a single executable:

Swiss File Knife

Windows+Linux file tree processor, binary grep, tree size list, instant ftp
server, line filter, text replace, dupfind, join files, md5 lists, run
command on all files, extract strings, detab, patch, tail, hexdump. no
installation, ideal for usb stick.

http://sourceforge.net/projects/swissfileknife/  

Nigel





[BackupPC-users] New user- loads of questions

2009-08-18 Thread Nigel Kendrick
Morning,
 
I have just started to play with backuppc and am making good strides - local
(SMB) backups are working fine and I am just about to have a look at
rsync-based backups from a couple of local Linux servers before moving on to
SMB/rsync via SSH and some VPNs.
 
I am diligently RTFM-ing, supplemented with the stuff found via Google -
which is a bit overwhelming, so I'd appreciate some short cuts from anyone
with a bit more real-world experience if possible:
 
1) I presume(?) SMB-based backups cannot do block-difference-level copies
like rsync? We have a number of remote (over VPN) Windows servers and I'd
like to back up their MS-SQL database dumps - they are around 700MB at the
moment and I presume via SMB the whole lot will get transferred every time?
 
2) I have seen a number of guides for cwrsync on Windows-based PCs. Any
votes on the best one and the best place to read up on this? I presume that
since we'd be backing up via VPN, we could run rsync directly rather than
via an SSH tunnel?
 
3) As the remote sites are linked via VPN, I could mount the remote shares
to the local backup server and use rsync 'directly' - any pros/cons doing
things this way (speed, reliability etc?), or is an rsync server on the
remote servers a better approach?
 
4) I am running the backup server on CentOS 5.3 and installed backuppc from
the Centos RPM. Ideally I'd like to run the app as the normal 'apache' user
- I read up on a few generic notes about doing this and got to a point where
backuppc wouldn't start properly as it couldn't create the LOG file. I then
went round in circles looking at file permissions before putting things back
the way they were in order to do some more learning. Is there a
simple-to-follow guide for setting up backuppc to not use mod_perl - I have
read the docs but am still not getting there.
 
Many thanks
 

Nigel Kendrick


 


Re: [BackupPC-users] New user- loads of questions

2009-08-18 Thread Nigel Kendrick
 
Holger - thanks for the quick feedback - a few comments and answers below:

-Original Message-
From: Holger Parplies [mailto:wb...@parplies.de] 
Sent: Tuesday, August 18, 2009 2:49 PM
To: Nigel Kendrick
Cc: backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC-users] New user- loads of questions

Hi,

Nigel Kendrick wrote on 2009-08-18 12:04:16 +0100 [[BackupPC-users] New
user- loads of questions]:
 I have just started to play with backuppc and am making good strides - local
 (SMB) backups are working fine

I hope you mean SMB backups of local Windoze machines, not of the BackupPC
server ;-). In any case, welcome to BackupPC.

  -- Yes, backing up Windows machines on the LAN via SMB


 [...]
 1) I presume(?) SMB-based backups cannot do block-difference-level copies
 like rsync? We have a number of remote (over VPN) Windows servers and I'd
 like to back up their MS-SQL database dumps - they are around 700MB at the
 moment and I presume via SMB the whole lot will get transferred every time?

Correct. I'm not sure how well rsync will handle database dumps, though. You
should try that out manually (if you haven't done so already). Please also
remember that BackupPC will store each version independently, though possibly
compressed (i.e. BackupPC only does file-level deduplication, not
block-level). You only save bandwidth with rsync on transfer, not on storage.

  -- Thanks, it's as I thought with SMB (all or nothing transfers). 
  -- Got 2TB of RAID 1 to play with so storage not an issue!

 2) I have seen a number of guides for cwrsync on Windows-based PCs. Any
 votes on the best one and the best place to read up on this? I presume that
 since we'd be backing up via VPN, we could run rsync directly rather than
 via an SSH tunnel?

As far as I know, rsync doesn't work correctly on Windoze (rsyncd does,
though). With a VPN, I'd definitely recommend plain rsyncd. I don't back up
Windoze myself, but Deltacopy is mentioned often on the list - there's a
thread from today [rsyncd on Vista 64-bit cygwin vs SUA] which you might
want to check out.

  -- Already started working with cwrsync/rsyncd and grabbed some files
from a local Win2K machine.
  -- Going to try across the VPN later. Looking at 700MB MS-SQL database
dumps - hoping to be pleased!
  -- Just subscribed to the list so am only seeing posts from around mid-day
onwards, but will check the archives.
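
For reference, a minimal rsyncd.conf module on the Windows side (cwRsync or
DeltaCopy) might look roughly like the sketch below; the module name, paths
and user are placeholders, not settings from this setup:

  [sqlbackups]
      path = /cygdrive/d/backups
      read only = yes
      auth users = backuppc
      secrets file = /cygdrive/c/rsyncd/rsyncd.secrets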

 3) As the remote sites are linked via VPN, I could mount the remote shares
 to the local backup server and use rsync 'directly' - any pros/cons doing
 things this way (speed, reliability etc?), or is an rsync server on the
 remote servers a better approach?

If you mount the remote shares locally, you lose the benefit of the rsync
protocol *completely*, because the remote rsync instance is running on the
local computer and will need to read each whole file over the network in
order to figure out which blocks don't need to be transferred (locally) ...

[snip]

  -- Thanks, seems like rsyncd over the VPN is the way to go. 
  -- Also looks like rsync is more tolerant of high VPN latency


 4) I am running the backup server on CentOS 5.3 and installed backuppc from
 the Centos RPM. Ideally I'd like to run the app as the normal 'apache' user
 - I read up on a few generic notes about doing this and got to a point where
 backuppc wouldn't start properly as it couldn't create the LOG file. I then
 went round in circles looking at file permissions before putting things back
 the way they were in order to do some more learning. Is there a
 simple-to-follow guide for setting up backuppc to not use mod_perl - I have
 read the docs but am still not getting there.

I believe the default *is* for BackupPC to *not* use mod_perl. Your RPM may
differ, but the upstream documentation will not reflect this.

The BackupPC CGI script needs to be run as backuppc user for various reasons
(access to the pool FS, access to the BackupPC server daemon, use of the
BackupPC Perl library), so you can either run the web server as backuppc
user or implement some form of changing UID (the CGI script - BackupPC_Admin
(or index.cgi on Debian, don't know about Centos) - is normally setuid
backuppc, but that can't work with mod_perl, I believe).

Do you have a reason for not wanting to run apache as the backuppc user?

  -- May not be an issue, but I have one server running SugarCRM in a 9-5
operation, and am planning to have the server do some overnight backups of
LAN-based machines; I am just pre-empting this upsetting SugarCRM - it may
not.

  -- I have another that's a small Asterisk (Trixbox) server (again, 9-5
only), where Apache has to be run as 'trixbox', and I am wondering how this
may all fit together!


Thanks again,

Nigel

