Re: [BackupPC-users] Recommended CYGWIN SSH Rsync guide

2013-12-03 Thread John Habermann
Just in case you haven't come across it: I have found that Michael's
Windows BackupPC client (http://www.michaelstowe.com/backuppc/) works
quite well for backing up Windows servers using rsync. It uses rsyncd
rather than rsync over SSH, so it might not be of use to you for that
reason.

On Tue, 3 Dec 2013 10:10:57 -0500
Henry Burroughs hburrou...@stjohnseagles.org wrote:

 I am in the process of trying to set up CYGWIN SSH & Rsync to back up my
 Windows Servers.  Right now my problem appears to be a privilege
 issue. The user I am using eagles\backuppc has all the correct
 permissions when using CYGWIN locally on the machine.  However when I
 use SSH and have privilege separation enabled, the user is unable to
 access certain folders with a Permission Denied message.
 
 I was using this as a guide:
 http://www.cs.umd.edu/~cdunne/projs/backuppc_guide.html#Cygwin Installation
 
 
 I am also attempting to script the whole installation using Wizard's
 Apprentice, Batch, and the 7-zip SFX installer.  That way it will
 walk you through the whole CYGWIN, SSH, and RSYNC configuration.  If
 I ever get it finished and tested on all my servers, I'll post it
 somewhere.
 


-- 
John Habermann|Senior IT Officer|Corporate Services

Cook Shire Council

Phone|07 4069 5444   Fax|07 4069 5423   

Email|jhaberm...@cook.qld.gov.au mailto:jhaberm...@cook.qld.gov.au
Website|www.cook.qld.gov.au http://www.cook.qld.gov.au/ 

Address|10 Furneaux Street (PO Box 3), Cooktown, Qld, 4895


___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Outlook files need to be backed up

2013-08-08 Thread John Habermann
Do you have roaming profiles, or archiving set up in Outlook, that
creates archive.pst files on your server shares? That might be what
BackupPC is picking up if you are using Samba as your backup method and
those .pst files are in use. Have a look through the error logs for that
server in BackupPC and you should see which .pst files are triggering
the warning.

On Thu, 08 Aug 2013 03:06:53 -0700
vano backuppc-fo...@backupcentral.com wrote:

 Hello. We have BackupPC 3.2.1 under Ubuntu 12.04. I get warning
 messages from our Windows servers: BackupPC: Outlook files
 on ..server.. need to be backed up. But these are just Windows servers
 with no use of e-mail at all. How can I disable these warnings?
 
 +--
 |This was sent by v...@qrz.com via Backup Central.
 |Forward SPAM to ab...@backupcentral.com.
 +--
 
 
 





Re: [BackupPC-users] cygwin-rsyncd outdated No recent version ?

2013-07-09 Thread John Habermann
Have you looked at http://www.michaelstowe.com/backuppc/ ?
I find this package that Michael made really handy when it comes to
backing up Windows clients via rsync.

On Mon, 08 Jul 2013 00:42:09 -0700
infosupport backuppc-fo...@backupcentral.com wrote:

 The files for cygwin-rsyncd are seven years old.
 
 The BuildDate of cygwin1.dll is 2006-07-23
 The cygrunsrv.exe version doesn't work on recent OS
 
 On readme :
 
 To build the application for cygwin, fetch the source from
 http://rsync.samba.org (also available with this distribution
 on the BackupPC SourceForge site) and the cygwin-rsync-2.6.8_0.diff
 patch from http://sourceforge.net/projects/backuppc.  Put them in
 the same directory:
 
 tar zxvf rsync-2.6.8.tar.gz
 cd rsync-2.6.8
 ./configure --with-included-popt
 make
 strip rsync.exe
 make install
 
 But i can't see cygwin-rsync-2.6.8_0.diff file on backuppc Files
 
 Is there any new package for cygwin-rsyncd ??
 
 Thanks
 
 
 
 





Re: [BackupPC-users] cygwin-rsyncd outdated No recent version ?

2013-07-09 Thread John Habermann
I would be interested in seeing how you do this as well, so thank you.

On Tue, 9 Jul 2013 15:51:37 -0600
Ray Frush ray.fr...@avagotech.com wrote:

 I have added this task to my list of things to do.  I'll try to get
 something posted to this list within a week that other folks can use.
 
 
 On Tue, Jul 9, 2013 at 1:21 PM, Richard Zimmerman 
 rzimmer...@riverbendhose.com wrote:
 
  Yes, there is interest from this corner of the world…
 
 
  Many thanks,
 
 
 
 
 




Re: [BackupPC-users] Incremental backups, rsync and the exchange mailbox store

2013-04-23 Thread John Habermann
Hi Adam

On Tue, 23 Apr 2013 01:18:07 +1000
Adam Goryachev mailingli...@websitemanagers.com.au wrote:

 On 22/04/13 06:46, John Habermann wrote:
  Hi 
  I have an exchange 2003 server which I have set up in backuppc using
  Michael's backuppc rsync based client scripts
  http://www.michaelstowe.com/backuppc/ and it appears to be working
  fine with the exception that the mail store .edb and .stm files are
  only being backed up when a Full Backup Job runs. 
 
  I am assuming this is because the date modified and file size of these
  files is not changing, as I have about 15 GB or so of space
  available in the priv1 mailbox store so its size is not
  increasing with current day-to-day usage. I have checksum caching
  enabled so was wondering if disabling that would result in changes
  in these files getting detected.
  Does anyone know if it is possible to get rsync to back up these sorts
  of files in an incremental backup or do I need to set this server to
  have a full backup run every time (or create a specific backuppc
  client alias just for the exchange store that runs full backups
  only)?
 Add --checksum to your RsyncArgs which tells rsync to compare using
 the checksum instead of modification time and file size.

I have --checksum-seed=32761 added to my list of rsync arguments:

Sending args: --server --sender --numeric-ids --perms --owner --group
-D --links --hard-links --times --block-size=2048 --recursive
--checksum-seed=32761 . .
Checksum caching enabled (checksumSeed = 32761)

Do you need to set the --checksum option as well as --checksum-seed?
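For anyone reading this in the archives, the relevant distinction is that --checksum-seed only enables BackupPC's checksum caching, while --checksum is what switches rsync to comparison by checksum rather than mtime and size. A sketch of how Adam's suggestion would look in config.pl, with the argument list taken from the log output above:

```perl
# Sketch only: append --checksum to the existing rsync argument list.
# --checksum-seed enables checksum caching; --checksum forces rsync to
# compare files by checksum instead of modification time + size.
$Conf{RsyncArgs} = [
    '--numeric-ids', '--perms', '--owner', '--group', '-D',
    '--links', '--hard-links', '--times', '--block-size=2048',
    '--recursive',
    '--checksum-seed=32761',   # checksum caching (already present)
    '--checksum',              # checksum comparison (Adam's suggestion)
];
```

The same override can go in the per-host .pl file instead of config.pl if checksum comparison is only wanted for this one client.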

Thank you





Re: [BackupPC-users] Incremental backups, rsync and the exchange mailbox store

2013-04-23 Thread John Habermann
Thanks for your suggestion, Igor. I wasn't aware of the UnxUtils
software; it looks quite handy.

On Mon, 22 Apr 2013 17:25:27 +0200
Igor Sverkos igor.sver...@googlemail.com wrote:

 
 Hi,
 
 another option:
 
 Grab a copy of touch win32 and write a script which will touch the
 needed files. Execute it before backuppc backups (can be done with
 backuppc).
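Igor's idea could be sketched roughly like this under Cygwin; the function name, store path and file names here are illustrative only, and the script could be hooked into $Conf{DumpPreUserCmd} so it runs just before each backup:

```shell
#!/bin/sh
# Sketch of the touch-before-backup idea: bump the mtime of the Exchange
# store files so an rsync incremental (which compares mtime + size)
# sees them as changed and transfers them.
freshen_store() {
    dir="$1"
    for f in "$dir"/priv1.edb "$dir"/priv1.stm; do
        # Touch only files that actually exist; skip the rest quietly.
        if [ -e "$f" ]; then
            touch "$f"
        fi
    done
}

# Illustrative Cygwin path to the mailbox store; adjust to suit.
freshen_store "/cygdrive/c/Program Files/Exchsrvr/mdbdata"
```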
 
 



[BackupPC-users] Incremental backups, rsync and the exchange mailbox store

2013-04-21 Thread John Habermann
Hi 
I have an exchange 2003 server which I have set up in backuppc using
Michael's backuppc rsync based client scripts
http://www.michaelstowe.com/backuppc/ and it appears to be working fine
with the exception that the mail store .edb and .stm files are only
being backed up when a Full Backup Job runs. 

I am assuming this is because the date modified and file size of these
files is not changing, as I have about 15 GB or so of space available
in the priv1 mailbox store so its size is not increasing with current
day-to-day usage. I have checksum caching enabled so was wondering
if disabling that would result in changes in these files getting
detected.
Does anyone know if it is possible to get rsync to back up these sorts
of files in an incremental backup or do I need to set this server to
have a full backup run every time (or create a specific backuppc client
alias just for the exchange store that runs full backups only)?

Thank you




Re: [BackupPC-users] Backup Data Transfer Speed

2013-04-07 Thread John Habermann
I have one client that shows a speed of 51.24 MB/s for its last full
backup, but this is definitely not an indication of the wire speed of
the link between it and the server, as that is a PTP wireless link with
a maximum speed of 150 Mb/s.
The client is a Linux file server with an i5 processor, 4 GB of RAM and
a 4-disk RAID 10 array using 1.5 TB RE4 drives. The backup server has a
2.5 GHz Xeon X3320 processor, 4 GB of RAM and a 6-disk RAID 10 array
largely consisting of 2 TB RE4 disks; the last remaining 2 TB Green
disk actually failed on me on Friday, so I have to go and swap it out
for an RE4 drive today.

I think the key reason the last full backup achieved that speed is
that, out of a 1.3 TB backup, there were only 588 new files (1.6 GB in
total), so most of the performance comes down to how quickly the client
and the server can do the calculations to determine the changes on the
client, and that is limited more by CPU, memory and disk speed. I have
another file server in the same room as the BackupPC server, so it has
a gigabit link to the backup server, and it has hit 66 MB/s for a full
backup; that one is a 4-disk RAID 5 array with 15k RPM SCSI disks, and
the file server has a Xeon CPU. The full backup speed still varies from
18 MB/s to 66 MB/s depending on the amount of new files in the backup,
so again, with rsync-based backups the network speed is not the
limiting factor.

On Sun, 07 Apr 2013 18:33:39 -0400
Phil K. phillip.kenn...@yankeeairmuseum.org wrote:

 Another consideration is file system. Once you get beyond basic
 hardware, there are a number of tweaks to look at on the software
 side.
 
 Your setup isn't terrible, but there's a lot of room for improvement.
 As mentioned upstream, RAM would help. It won't be a cure-all, but
 it's a start. WD Cav Greens are insanely slow. Black or Red are a big
 step in the right direction, but ideally I'd get that pool into a
 RAID 1 (again as mentioned by others.)
 
 Last thing to consider; Just because the switch can move data at a
 gigabit per second doesn't mean that it's going to be getting a gig
 of data per second from the host, nor is the server going to be able
 to write a gig of data per second. BackupPC isn't always moving data,
 especially with Rsync. There's a great deal of time spent listing
 directory and partition contents, hashing to compare existing data to
 potentially new. The best speeds I've personally seen are in the
 25-30 Mbps range.
 
 Bottom line, I'd bet you're getting 4 or 6 mbps. There's some room
 for you to improve, but you're not going to see backups in the 50-60
 Mbps range.
 
 Gary Roach gary719_li...@verizon.net wrote:
 
 Hi,
 
 I am modifying my whole backup system for greater volume and speed.
 The
 
 new system is as follows:
 
 Software: BackupPC using rsync (without SSH)

 Server:
   D865PERL motherboard
   Pentium 4, 2.4 GHz processor
   IDE system hard drive
   SATA 1 TB WD Green backup data storage disk (should be 1.5 Gb/s)
   1 GB system memory
   PCI bus
   Intel PWLA8391GT PRO/1000 GT PCI network adapter
   Cat 6 cable

 Clients (2):
   Intel DP55KG motherboard
   Intel i5-750 processor, 4 cores, 2.66 GHz
   Western Digital WD5000AADS hard drive, 500 GB, 32 MB cache, SATA II (3 Gb/s)
   PCIX bus
   2 GB system memory
   Intel Gigabit CT PCI-E network adapter (EXPI9301CTBLK)
   Cat 6 cable
 
 I have read all sorts of complaints about this type of setup and need
 some advice. It would appear that I should get around 1 Gbit/s
 transfer rate for continuous data transfer. If this is not true, why
 not, and what can I realistically expect?
 
 Gary R
 
 
 
 
 
 
 
 




Re: [BackupPC-users] multi platform backup system

2013-02-27 Thread John Habermann
I did something similar a number of years ago comparing Bacula and
BackupPC and found that BackupPC was much easier to set up. You don't
need to install any client software on the Windows machines, as you can
just make file shares available to the BackupPC server. I do recommend
that you look at http://www.michaelstowe.com/backuppc/ if you need a
good way of backing up Windows clients via rsync, including any open
files on the Windows PC. While BackupPC works very well for backing up
the data on Windows computers, it is not able to do bare-metal restores
of a Windows machine as far as I am aware. Installing it on your Ubuntu
server is easy, as it is available through apt-get. I would recommend
mounting /var/lib/backuppc on a separate partition or LVM volume,
though, as that makes it a lot easier if you need to add more disk
space or move it to another machine.

On Wed, 27 Feb 2013 11:26:58 -0800
stuckeyneila1982 backuppc-fo...@backupcentral.com wrote:

 I have a small business and am looking for an easy multi-platform
 backup system. I know how to get around the CLI on Windows and
 Ubuntu/Debian. I have fewer than 12 computers, all running Win 7 Pro
 except for the RAID 1 file server running Ubuntu 12.04 LTS Server. I'm
 looking for a solution to back up all PCs to the server. I have tried
 getting Amanda to work; it was a no-go. I tried Bacula and couldn't
 get that to work either. I was thinking about rsync and DeltaCopy, but
 I would like to make snapshots of all PCs if possible. I guess if I
 have to I could run Clonezilla or ddrescue on each computer, but I
 would prefer a daily snapshot of all PCs in the office. I can't seem
 to get it into my crew's heads not to surf the internet too much and
 pick up malware. I am willing to pay for a solution if it's worth it.
 
 
 
 





Re: [BackupPC-users] No more backup after last partial backup?

2012-12-06 Thread John Habermann
Hi Matias

I haven't had any issues with declare myself, running the script on
CentOS 5. Looking at the script, I see it suggests using typeset if
you have problems with declare, so that might help. I su to the
backuppc user to run this script:

su -s /bin/bash backuppc

cheers

On Wed, 05 Dec 2012 04:41:32 -0800
arcticeye backuppc-fo...@backupcentral.com wrote:

 Hi John,
 Thanks for your answer. I have copied the script, but neither sh nor
 bash recognizes the declare command. Is there a way to do it manually?
 Should I try to remove the last partial backup manually from
 /var/lib/backuppc/pc/server/? Or do I have to remove some other
 record as well? Thank you again!
 
 Kind regards,
 
 Matias
 
 
 
 





Re: [BackupPC-users] No more backup after last partial backup?

2012-12-04 Thread John Habermann


On Tue, 04 Dec 2012 04:27:14 -0800
arcticeye backuppc-fo...@backupcentral.com wrote:

 Hi everyone,

 Before anything, thanks for allowing me to subscribe and access this
 beautiful community. I have a Debian 6 server with Samba. All user
 files are under 3 directories shared using SMB. I have 2 BackupPC
 servers which make backups of these shared directories every night.
 Last week I had to reinstall the Linux server because of a failed
 hard disk. Once I did, BackupPC tried to back up the files. The
 behaviour of the 2 servers is strange. Both of them did a last
 partial backup, but it took a lot of time and they never managed to
 complete a full or incremental. Since then, each time I try to
 manually run a backup it can run for 24 hours and never finish. There
 is no error log, which seems very strange, but the backups are never
 completed. I could delete the host from BackupPC, re-create it and
 reconfigure, but last time I did that, all older backups and their
 incrementals were automatically deleted as well.
 
 Could you help me on this in some way? Do you need some extra
 information? Thank you very much for you time

I have had issues like this before and have resolved them by removing
the partial backup using the script from
http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=How_to_delete_backups

Copy the script to your local bin directory and then run it like so:
/usr/local/bin/remove_old_backups.sh -c server_name -d 326 (or whatever
the number of your partial backup is)

Once the partial backup has been deleted try running a full backup
again and see if that completes successfully.

 
 Kind regards,
 
 Matias
 
 
 
 





Re: [BackupPC-users] Performance reference (linux --(rsync)- linux)

2012-11-06 Thread John Habermann
Hi Cassiano

On Tue, 6 Nov 2012 10:03:44 +
Cassiano Surek c...@surek.co.uk wrote:

 Of course, how could I have missed that! I did find it now, thanks
 Michał.
 
 Last full backup (of 100 odd Gb) took slightly north of 10 days to
 complete. Incremental, just over 5 days.

That seems rather slow. I have BackupPC 3.2.1 running on CentOS 5.8
(rsync 3.0.6) that is backing up 18 hosts, most of which are backed up
over a 300 Mbps wireless link. The largest of these hosts is a Linux
file server (Ubuntu 10.04, rsync 3.0.7) with about 1 TB of files that
are backed up. The last full backup for this server had a speed of
22.28 MB/s and a duration of 772 minutes; the new files were 61 GB in
size, and probably about 60 GB of that would have been a single
Exchange ntbackup file, with about 1.3 GB worth of other files from the
various user shares. As a result of the large ntbackup files, the
incrementals can take nearly as long as a full backup. The last one,
for example, had a duration of 757 minutes, a speed of 1.58 MB/s and a
new files size of 71 GB.

The specs for the backup server are: 2.5 GHz Xeon X3320, 4 GB RAM,
rsync 3.0.6; the file system is ext3, with /var/lib/BackupPC on its own
partition, a 6 TB LVM volume made up of 3 x 2 TB RAID 1 pairs.

The specs for the client are: 2.93 GHz Intel i3, 4 GB RAM, ext3
filesystem, with the main data partition being a 4-disk 3 TB RAID 10
array.

Have you tried doing a normal rsync, or even just an scp copy, of the
data from the client to the backup server and seen how long that takes?
Perhaps upgrading the rsync on the client, which I see you said is
2.6.3, might help.

cheers

 
 On 6 Nov 2012, at 09:58, Michał Sawicz wrote:
 
  On 06.11.2012 at 10:43, Cassiano Surek wrote:
  That indeed makes sense.  On the Host Summary screen I cannot find
  that info (how long the backup took, e.g. (end-start) date/time.
  Is there a place where these get saved or should I just use my own
  stop watch? :)
  
  For each backup, on the host page, there's an elapsed time column
  with the time in minutes it took to make the backup.
  
  -- 
  Michał (Saviq) Sawicz mic...@sawicz.net
 
 





Re: [BackupPC-users] How to submit changes

2012-05-22 Thread John Habermann
Hi Brad

This is something that I would certainly find useful, so +1 for it
being added to BackupPC.

cheers
John

On Tue, 22 May 2012 15:48:46 -0600
Brad Morgan b-mor...@concentric.net wrote:

 I've added a BlackoutFulls config option to my local copy of
 BackupPC. This functions in a similar fashion to BlackoutPeriods but
 is only checked when BackupPC has determined that a full backup needs
 to be done (and was not requested by the user). I've configured my
 system so that incrementals are blacked out during day on the
 weekdays (Mon-Fri) and fulls are blacked out except on the weekends. 
 
  
 
 I have not completed the localization (string is added to all files,
 but not translated from english) and I'm not sure how to add
 documentation so I could use some help.
 
  
 
 If this is something that the community would like added to BackupPC,
 I need to know how to submit my changes.
 
  
 
 Regards,
 
  
 
 Brad
 





Re: [BackupPC-users] trying to improve the speed at which backuppc rsync back up processes a large binary file in incremental backups

2012-01-10 Thread John Habermann
Thank you for your reply Les

On Sun, 8 Jan 2012 16:00:02 -0600
Les Mikesell lesmikes...@gmail.com wrote:

 On Sun, Jan 8, 2012 at 4:48 AM, John Habermann
 jhaberm...@cook.qld.gov.au wrote:
 
  You can see that the backup of the /opt share takes nearly the total
  time of the incremental taking about 8 and half hours to complete
  while the backup of the /opt rsync share in the full backup takes
  about 3 and half hours. The full backup is slightly longer than
  what it takes if I just do a rsync over ssh copy of the file from
  the client server to the backup server.
 
  I have found that rsync seems to always transfer the whole file when
  copying this file from the client server to the backup server:
 
  # rsync -avzh --progress -e ssh
  administrator@isabella:ExchangeDailyBackup.bkf
  ExchangeDailyBackup.bkf Password:
  receiving incremental file list
  ExchangeDailyBackup.bkf
       54.44G 100%   10.66MB/s    1:21:10 (xfer#1, to-check=0/1)
 
  sent 3.31M bytes  received 3.27G bytes  486.33K bytes/sec
  total size is 54.44G  speedup is 16.65.
 
 Note that you have used the -z option with native rsync, which
 backuppc doesn't support.  You can add the -C option to ssh to get
 compression at that layer when you run rsync over ssh, though.

I will try that. I imagine it won't make much of a difference with the
ntbackup file, if that file is already compressed, but it will be
interesting to see how it affects the transfer of other data.
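Concretely, the suggestion above would look something like this (host and
file name taken from the earlier run, shown for illustration only):

```shell
# Same transfer as the earlier run, but without rsync's -z (which
# BackupPC's rsync method doesn't support); compression is pushed down
# into the ssh layer with ssh's -C option instead.
rsync -avh --progress -e "ssh -C" \
    administrator@isabella:ExchangeDailyBackup.bkf ExchangeDailyBackup.bkf
```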

 
  My questions for the list are:
  1. Is it reasonable for rsync to transfer the whole file when
  copying a large ntbackup file?
 
 Yes, those files may have little or nothing in common with the
 previous copy.   If compression or encryption are used they will
 ensure that no blocks match and even if they aren't, the common blocks
 may be skewed enough that rsync can't match them up.
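Les's point about compression can be illustrated with a small sketch
(synthetic data; zlib stands in for whatever compression the backup tool
applies, and the block matching is a crude stand-in for rsync's
rolling-checksum search):

```python
# Sketch: why rsync's block matching finds little to reuse once data is
# compressed. A tiny edit near the start of the input changes almost the
# entire compressed stream, so no blocks line up any more.
import zlib

def block_overlap(old: bytes, new: bytes, block: int = 256) -> float:
    """Fraction of new's fixed-size blocks found anywhere in old
    (a crude stand-in for rsync's rolling-checksum matching)."""
    old_blocks = {old[i:i + block] for i in range(len(old) - block + 1)}
    offsets = range(0, len(new) - block + 1, block)
    hits = sum(new[i:i + block] in old_blocks for i in offsets)
    return hits / max(1, len(offsets))

payload = b"".join(b"line %d some text\n" % i for i in range(1000))
changed = payload.replace(b"line 0 ", b"LINE 0 ", 1)   # one small edit

raw = block_overlap(payload, changed)
gz = block_overlap(zlib.compress(payload), zlib.compress(changed))
print(f"matching blocks, raw:        {raw:.0%}")   # nearly everything reusable
print(f"matching blocks, compressed: {gz:.0%}")    # almost nothing reusable
```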

OK, I wonder if it might be better to have that file backed up by tar
rather than rsync. In my searching I can't see any mention of using
multiple transfer methods within a single client config. Is that an
option, or is the only way to do this to create a second client .pc
file that uses tar as the transfer method and backs up just the
ntbackup share?
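For illustration, the second-host approach can be sketched like this
(hypothetical file name; BackupPC's $Conf{ClientNameAlias} lets a second
host entry point at the same machine):

```perl
# /etc/BackupPC/pc/isabella-ntbackup.pl -- hypothetical second host
# entry that backs up only the ntbackup share, using tar instead of
# rsync.
$Conf{ClientNameAlias} = 'isabella';               # real client behind the alias
$Conf{XferMethod}      = 'tar';
$Conf{TarShareName}    = ['/opt/samba/ntbackups'];
# The original rsync host would then exclude /opt/samba/ntbackups
# via $Conf{BackupFilesExclude} so the file is only backed up once.
```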

 
  2. Why does an incremental backup of this file take so much longer
  than a full backup of it or a plain rsync of this file?
 
 That doesn't make sense to me either.  Are you sure that is consistent
 and not related to something else that might have been using the link
 concurrently?
 

It appears pretty consistent that the incremental backups of this
client take longer than the full backups, though there is a fair
amount of variability in how long an incremental takes. I am going to
set up a separate BackupPC server on the same local area network as
the client and see how the full compares to the incremental when the
bandwidth of the connection is not the limiting factor.


-- 
John Habermann
IT Officer
Cook Shire Council
10 Furneaux St
Cooktown 4895
ph 40820577

--
Ridiculously easy VDI. With Citrix VDI-in-a-Box, you don't need a complex
infrastructure or vast IT resources to deliver seamless, secure access to
virtual desktops. With this all-in-one solution, easily deploy virtual 
desktops for less than the cost of PCs and save 60% on VDI infrastructure 
costs. Try it free! http://p.sf.net/sfu/Citrix-VDIinabox


Re: [BackupPC-users] trying to improve the speed at which backuppc rsync back up processes a large binary file in incremental backups

2012-01-10 Thread John Habermann
Hi Pedro

On Wed, 11 Jan 2012 00:53:33 +
Pedro M. S. Oliveira pmsolive...@gmail.com wrote:

 Are you sure that the wireless connection is capable of 11 MB, isn't
 it 11Mb?

No, it is 11 MB/s. The connection is a 5.8 GHz point-to-point wireless
link which, according to the manufacturer, has a capacity of 150 Mbps.
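As a quick unit check on the figures above:

```python
# Sketch: sanity-check the units. A rated capacity of 150 Mbps
# (megabits/s) puts a ceiling of 150/8 MB/s (megabytes/s) on the link,
# so a measured 11 MB/s with plain rsync is plausible.
rated_mbps = 150
ceiling_MB_s = rated_mbps / 8
print(ceiling_MB_s)  # 18.75
```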

 
 Sent from my galaxy tab 10.1.
 On Jan 8, 2012 10:05 PM, Les Mikesell lesmikes...@gmail.com wrote:
 
  On Sun, Jan 8, 2012 at 4:48 AM, John Habermann
  jhaberm...@cook.qld.gov.au wrote:
   [earlier quoted discussion trimmed; it duplicates the previous
   messages in this thread]


-- 
John Habermann
IT Officer
Cook Shire Council
10 Furneaux St
Cooktown 4895
ph 40820577



[BackupPC-users] trying to improve the speed at which backuppc rsync back up processes a large binary file in incremental backups

2012-01-08 Thread John Habermann
Hi 

I have a problem with the rsync incremental backups of a 50 GB binary
file. My setup consists of:
- BackupPC 3.2.1 running on CentOS 5.7 with rsync 3.0.6. The server has
  2 GB of RAM and a single Xeon processor.
- The client being backed up is an Ubuntu Lucid server.
- The backup runs over a wireless link between 2 buildings which can
  provide 11 MB/s when tested with a plain rsync of a file between the
  2 servers.
- The client receives a nightly copy of an ntbackup of an Exchange 2003
  database in the /opt/samba/ntbackups folder. The file is currently
  around 51 GB in size.

The problem I am having is that the incremental backups take around
double the time of the full backups. These are the BackupPC logs for a
full backup followed by an incremental on the next night:

Previous night's full:
2012-01-06 14:47:42 full backup started for directory / (baseline
backup #377)
2012-01-06 14:47:44 full backup started for directory /etc (baseline
backup #377)
2012-01-06 14:47:57 full backup started for directory /var (baseline
backup #377)
2012-01-06 14:50:49 full backup started for directory /opt (baseline
backup #377)
2012-01-06 18:38:38 full backup started for directory /root (baseline
backup #377)
2012-01-06 18:38:42 full backup started for directory /home (baseline
backup #377)
2012-01-06 19:02:59 full backup started for directory /usr (baseline
backup #377)
2012-01-06 19:07:07 full backup started for directory /bin (baseline
backup #377)
2012-01-06 19:07:10 full backup started for directory /sbin (baseline
backup #377)
2012-01-06 19:07:12 full backup started for directory /lib (baseline
backup #377)
2012-01-06 19:08:32 full backup 378 complete, 202077 files,
641375636764 bytes, 1 xferErrs (0 bad files, 0 bad shares, 1 other)

Last night's incremental:
2012-01-07 15:00:10 incr backup started back to 2012-01-06 14:47:42
(backup #378) for directory /
2012-01-07 15:00:14 incr backup started back to 2012-01-06 14:47:42
(backup #378) for directory /etc
2012-01-07 15:00:21 incr backup started back to 2012-01-06 14:47:42
(backup #378) for directory /var
2012-01-07 15:03:39 incr backup started back to 2012-01-06 14:47:42
(backup #378) for directory /opt
2012-01-07 23:56:40 incr backup started back to 2012-01-06 14:47:42
(backup #378) for directory /root
2012-01-07 23:56:45 incr backup started back to 2012-01-06 14:47:42
(backup #378) for directory /home
2012-01-07 23:57:33 incr backup started back to 2012-01-06 14:47:42
(backup #378) for directory /usr
2012-01-07 23:59:42 incr backup started back to 2012-01-06 14:47:42
(backup #378) for directory /bin
2012-01-07 23:59:43 incr backup started back to 2012-01-06 14:47:42
(backup #378) for directory /sbin
2012-01-07 23:59:45 incr backup started back to 2012-01-06 14:47:42
(backup #378) for directory /lib
2012-01-08 00:00:26 incr backup 379 complete, 126 files, 55831250555
bytes, 0 xferErrs (0 bad files, 0 bad shares, 0 other)

You can see that the backup of the /opt share takes nearly the total
time of the incremental, about 8 and a half hours to complete, while
the backup of the /opt rsync share in the full backup takes about 3
and a half hours. The full backup is only slightly longer than a plain
rsync-over-ssh copy of the file from the client server to the backup
server.
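For comparison, the effective /opt throughputs implied by the log
timestamps and byte counts quoted below work out roughly as follows (a
back-of-the-envelope sketch; the incremental byte count is the run
total, which is dominated by this one file):

```python
# Sketch: effective transfer rates implied by the quoted logs.
from datetime import datetime

def mb_per_s(start: str, end: str, nbytes: int) -> float:
    """Average rate in MB/s between two log timestamps."""
    fmt = "%Y-%m-%d %H:%M:%S"
    secs = (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds()
    return nbytes / secs / 1e6

# incremental /opt share: started 15:03:39, next share began 23:56:40
incr = mb_per_s("2012-01-07 15:03:39", "2012-01-07 23:56:40", 55_831_250_555)
# plain rsync of the same file: 54.44 GB completed in 1:21:10
plain = 54.44e9 / (1 * 3600 + 21 * 60 + 10) / 1e6

print(f"incremental /opt: ~{incr:.1f} MB/s")
print(f"plain rsync:      ~{plain:.1f} MB/s")
```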

I have found that rsync seems to always transfer the whole file when
copying this file from the client server to the backup server:

# rsync -avzh --progress -e ssh
administrator@isabella:ExchangeDailyBackup.bkf ExchangeDailyBackup.bkf
Password:
receiving incremental file list
ExchangeDailyBackup.bkf
      54.44G 100%   10.66MB/s    1:21:10 (xfer#1, to-check=0/1)

sent 3.31M bytes  received 3.27G bytes  486.33K bytes/sec
total size is 54.44G  speedup is 16.65.

My questions for the list are:
1. Is it reasonable for rsync to transfer the whole file when copying a
large ntbackup file? 
2. Why does an incremental backup of this file take so much longer than
a full backup of it or a plain rsync of this file?

Thank you

-- 
John Habermann
IT Officer
Cook Shire Council
10 Furneaux St
Cooktown 4895
ph 40820577
