[BackupPC-users] Restore from version 3.x disk in 4.1.3 server

2017-11-26 Thread Erik Hjertén
Hello

I have been running BackupPC for many years backing up my private and
small business files. It has "saved the day" on a number of occasions
but now I need some help. Here's the situation:

After a lightning strike, one disk in the server X was destroyed. At
the same time, the BackupPC server suffered hardware failures. Its disks
are still readable, but contain errors. I have run ddrescue on the disk
containing the backup files and it reported 99% rescued data. This
backup server was running BackupPC version 3.x. I have set up a new
backup server, now running version 4.1.3 with new disks, backing up a
small number of machines. It's been running fine for a couple of weeks.

Now I would like to restore the files from the version 3.x disk to the
server X using my new backup server that runs BackupPC 4.1.3. How do I
go about accessing the backup files from this disk in BackupPC?

If I remember correctly I changed the "$Conf{TopDir}" setting to place
the backup files on a separate disk in the old backup server. I also
have access to the old system disk from that server, if that helps.
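
To make the question concrete, here is a minimal sketch of the first
step I have in mind, with hypothetical device names and paths (my real
layout differs):

    # mount the ddrescue'd 3.x pool disk read-only
    mount -o ro /dev/sdb1 /mnt/v3pool
    ls /mnt/v3pool/pc/serverX          # per-host backups live under pc/
    # the v3 tools from the old system disk can stream a backup as tar,
    # once the old config's $Conf{TopDir} points at /mnt/v3pool:
    /usr/share/backuppc/bin/BackupPC_tarCreate -h serverX -n -1 -s /data . \
        > /tmp/serverX.tar

Is that the right direction, or can the 4.1.3 server read the old pool
directly?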

Any pointers greatly appreciated as this data is critical.

Kind regards
/Erik




Re: [BackupPC-users] daemon not starting on Ubuntu start-up

2014-02-20 Thread Erik Hjertén


On 2014-02-20 13:00, Adam Hardy wrote:
> Every time I reboot my Ubuntu server, I have to restart backuppc
> manually.
>
> I figured this must be an Ubuntu packaging problem, something to do
> with the sysvinit config, but I've checked those out and from my
> limited knowledge of sysvinit, backuppc appears to have good settings
> in the appropriate run-levels:
>
> 0 - -
> 1 - K20
> 2 - S20
> 3 - S20
> 4 - S20
> 5 - S20
> 6 - -
>
> S20 seems a bit early though. Apache for instance has S91.
>
> However it could also be an error in the start-up script rather than
> it not executing at all - for instance I have the backups on a
> removable USB harddrive,

I had similar problems earlier when I was storing backups on an external
NAS. As the boot process was trying to start BackupPC before the NAS was
mounted, it failed. I don't have that setup anymore so I can't help you
with a solution, but perhaps it's a starting point to investigate.
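
If I were to attack it again, I'd probably start the service only once
the volume is actually mounted. A rough sketch (mount point, timeout and
service name are assumptions, and rc.local is just one place to put it):

    #!/bin/sh
    # wait up to ~5 minutes for the backup volume before starting BackupPC
    i=0
    until mountpoint -q /mnt/backuppc || [ $i -ge 60 ]; do
        sleep 5
        i=$((i+1))
    done
    service backuppc start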


Regards
/Erik






Re: [BackupPC-users] New backup server

2013-05-24 Thread Erik Hjertén

  
  
Thanks for your answer.

According to HP's web site (using the serial number) I have the SC40GE
controller.

Ray Frush wrote 2013-05-24 18:50:
> I have found that the limiting speed for backups with lots of files is
> how fast the backup server can walk the BackupPC pool, so going with
> faster disks will help some. You will probably find that 4x 500GB
> drives in a RAID 10 will give you more throughput, but at more cost.
>
> Your ML150 has 8 slots, so you've got a lot of options. Do you know
> which controller you have?
>
> On Fri, May 24, 2013 at 10:15 AM, Erik Hjertén
> erik.hjer...@companion.se wrote:
>> Hi all
>>
>> I have invested in a used HP Proliant ML150 G5 server as a new backup
>> server. I have about 500 GB of data in 40 000 files spread over 8
>> clients to backup. Data doesn't grow fast so I'm aiming at two 1TB
>> disks in a raid 1 configuration.
>>
>> Do I go with more expensive, but faster (and more reliable?), SAS
>> disks? Or are cheaper, but slower, S-ATA disks sufficient? I'm
>> guessing that disk speed will be the bottleneck in performance?
>>
>> Your thoughts on this would be appreciated.
>> /Erik
  
  
-- 
Erik Hjertén
erik.hjer...@companion.se
0708-90 55 30
Näsvägen 40, 139 33 Värmdö
08-403 750 50
www.companion.se



Re: [BackupPC-users] New backup server

2013-05-24 Thread Erik Hjertén

  
  
Thanks Les.

The majority of the files are actually on our main server, so I'm
worrying a little bit.

Les Mikesell wrote 2013-05-24 19:39:
> On Fri, May 24, 2013 at 11:15 AM, Erik Hjertén
> erik.hjer...@companion.se wrote:
>> Hi all
>>
>> I have invested in a used HP Proliant ML150 G5 server as a new backup
>> server. I have about 500 GB of data in 40 000 files spread over 8
>> clients to backup. Data doesn't grow fast so I'm aiming at two 1TB
>> disks in a raid 1 configuration.
>>
>> Do I go with more expensive, but faster (and more reliable?), SAS
>> disks? Or are cheaper, but slower, S-ATA disks sufficient? I'm
>> guessing that disk speed will be the bottleneck in performance?
>>
>> Your thoughts on this would be appreciated.
>
> I wouldn't expect that scale to be a problem for sata - just be sure
> they are suitable for RAID use (the 'green' kind that spin down when
> idle can cause problems). If the 40,000 files are mostly in one
> filesystem you might need to worry about speed, but not if they are
> distributed and you can skew the days when full runs happen.
>
> --
> Les Mikesell
> lesmikes...@gmail.com



Re: [BackupPC-users] New backup server

2013-05-24 Thread Erik Hjertén

  
  
Thanks for your thorough reply, Timothy.

I will look into the DVR type drives, that's a good point.

And I read you, loud and clear: there are a number of questions to be
answered before one can say that a certain configuration will suffice
in any given situation. Your reasoning will help me here.

About 8,000 of the files are photos between 5 and 15 MB each, in total
around 100 GB. That perhaps explains the slightly high size-per-file
numbers. There are also a few ISO files around 7 GB; the rest are
smaller files.

And as you guessed, my setup is straightforward, no surprises among
your questions below.

I think SATA drives will do the job. I'll look into costs on different
drives.
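
Doing the window arithmetic from your question below on my own numbers
(rounded, and assuming the whole 500 GB is read in one window):

    500 GB / 12 h  =  500,000 MB / 43,200 s  ≈  11.6 MB/s sustained

which indeed looks manageable for a single 7200 RPM SATA spindle.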
   
Timothy J Massey wrote 2013-05-24 20:27:
> Erik Hjertén erik.hjer...@companion.se wrote on 05/24/2013 12:15:22 PM:
>
>> Hi all
>>
>> I have invested in a used HP Proliant ML150 G5 server as a new
>> backup server. I have about 500 GB of data in 40 000 files spread
>> over 8 clients to backup. Data doesn't grow fast so I'm aiming at
>> two 1TB disks in a raid 1 configuration.
>>
>> Do I go with more expensive, but faster (and more reliable?), SAS
>> disks? Or are cheaper, but slower, S-ATA disks sufficient? I'm
>> guessing that disk speed will be the bottleneck in performance?
>>
>> Your thoughts on this would be appreciated.
>
> What is your backup window? 12 hours? You could do that with a
> *single* 7200 RPM SATA drive. 8 hours? Probably still, but you'd have
> to do some testing to see. Less than that? You're going to need to
> intimately understand both your circumstances and the various
> technologies inside of BackupPC to be able to answer that better.
>
> Frankly, a mirrored array isn't going to buy you much performance
> increase. It won't help write performance at *all*, and I'm not sure
> you'll need enough read performance to matter: the high amount of
> seeking that BackupPC requires doesn't really help for getting
> sustained high read performance.
>
> I will take Les' advice (don't use the Green drives) one step
> farther: I recommend the drives designed for DVR/video use. "Normal"
> drives (the not-Green drives) are warrantied only for 8x5 usage; the
> DVR drives are rated for 24x7 usage. They're a little more expensive,
> but not much.
>
> There are other questions you will need to ask that will make as much
> (if not more) difference than the speed of the drives you'll be using:
>
> * Would more drives (even if they're slower) give you better
>   performance?
>
> * How fast can the *clients* push the data? If you're limited by
>   them, improving the server won't help.
>
> * What is the speed of the network involved? Are you talking 100Mb/s
>   or slower? That will severely limit your performance. Do you have
>   Gigabit everywhere between them? Are there points in between that
>   might cause problems (like if the clients and server are on
>   *different* switches)? Can you do bonded Ethernet on the server?
>
> * What technique are you using to back up the files? Rsync over ssh
>   (with encryption overhead)? Rsyncd? tar/SMB (which are much less
>   intelligent in transferring files, but maybe less disk-intensive)?
>   Will you use compression on the server, and what level of CPU do
>   you have?
>
> * What do your files look like? 40,000 files for 500GB of data is a
>   pretty high size-per-file. (Contrast with one of my servers, which
>   is 800GB, but 400,000 files: 1/5 the data per file.) Are your files
>   mostly small (say, under 10kB), or mostly average (10k to 10M)? Do
>   you have any massive files (1GB or larger) to deal with? Backing up
>   a database server and backing up a mail server require *noticeably*
>   different approaches.
>
> (Believe it or not, even all of *this* is not all that's involved!)
>
> Your question is along the lines of "Which is faster, a bicycle or a
> dump truck?" The answer is: it depends. Need to move a

Re: [BackupPC-users] Netgear readyNAS to backuppc

2011-11-10 Thread Erik Hjertén
On 2011-11-10 14:18, draccusfly wrote:
> Hi
>
> I am running backuppc on a linux box and am trying to connect my
> ReadyNas 4200 to backuppc in order to back up an iSCSI data source on
> there. The trouble is that I cannot seem to get the readynas to
> connect. I have tried all permutations of username, password, host and
> path but still get "cannot connect". I just get "error connecting to
> odyssey" (our backuppc server).
>
> Any ideas would be appreciated
>
> Drac

I have a Netgear Readynas Duo which I have mounted on an Ubuntu server,
which in turn runs BackupPC. In /etc/fstab I have the following entry:

192.168.1.21:/backup /mnt/backuppc nfs rsize=32768,wsize=32768,hard,intr

If I remember correctly I also had to set up the share /backup on the
NAS, as well as a user with the right permissions.

This connects nicely and I do all my backups to the NAS. Hope this helps.
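
If it helps anyone later: a quick sanity check of an NFS mount like
this, since the BackupPC pool relies on hardlinks (paths hypothetical):

    sudo mount /mnt/backuppc
    # verify the share supports hardlinks before pointing BackupPC at it
    touch /mnt/backuppc/.lnktest && \
        ln /mnt/backuppc/.lnktest /mnt/backuppc/.lnktest2 && \
        echo "hardlinks OK"
    rm -f /mnt/backuppc/.lnktest /mnt/backuppc/.lnktest2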
/Erik







Re: [BackupPC-users] Best NAS device to run BackupPC ?

2011-05-17 Thread Erik Hjertén


  
  
lmirg...@microworld.org wrote 2011-05-17 16:23:
> Hello backuppc-users,
>
> I would like to back up all the machines of my company (12 laptops,
> Windows/Mac/Linux) in a centralized way on a NAS device.
> I like BackupPC a lot and if possible I would like to use it to run
> the backups.
>
> Now comes the choice of the NAS... What NAS device would you recommend
> with a good ratio of "performance / ease of installing BackupPC on it"?
>
> The ideal situation would be a NAS with BackupPC pre-installed - or a
> NAS with some available BackupPC packages ready to deploy.
> I looked at Synology / QNap / WD Sharespace but in each case the
> install of BackupPC seems tedious, and I'm not sure of the performance
> I will get on such devices...
>
> Thanks for your advice and your help
> ---
> Laurent

I use the small and simple Netgear Readynas Duo with BackupPC running
on my Ubuntu server. I went with this solution as I thought it would be
easy and problem free. If I were to do it today I would have gone with
a dedicated Linux server with BackupPC on it and no NAS at all. With a
simple NAS like mine you get tied down by limitations on different
things. (One irritation with my NAS is that it won't power up after a
power failure.) Maybe a larger and more robust NAS will have fewer
limitations, but it will also cost more.

So, why the NAS? Have you considered a dedicated server instead? It
will perhaps be harder to set up, but you will get all the help you
need from the Linux/Unix community and here.

Kind regards
/Erik

  







Re: [BackupPC-users] NTFS symbolic links [Solved]

2011-01-13 Thread Erik Hjertén


  
  
Just reporting back my solution, for reference.

Denis Jedig wrote 2011-01-02 00:32:
> Unfortunately, saving NTFS metadata with rsync has always been a pain
> in the ass, as the protocol simply was not designed for most of it.
> While symbolic links and junctions would fit in rather easily, to my
> best knowledge they are not implemented.
>
> You could build some kludge using "dir /S /AL" (which will give you
> symbolic links as well as junctions in the output), but unfortunately
> it will come in a way which is poorly suitable for scripting it back
> into shape. Alternatively, you could write some powershell lines using
> Get-ChildItem, checking for the ReparsePoint attribute and using
> Get-ReparsePoint for retrieving the destination. The people from the
> Windows PowerShell newsgroups/lists should be able to help you here.



Thanks Denis for your input!

I ended up writing two small powershell scripts, getLinkInfo.ps1 and
restoreLinks.ps1 (sources below).

getLinkInfo.ps1 actually calls "dir /al /s" and is therefore dependent
on the language (in this case Swedish) of the output from "dir". The
script takes a directory path as argument and creates two text files,
dirs.txt and links.txt. They contain the directory structure, including
all subdirectories, and all the links. I let BackupPC back these files
up.

I didn't use powershell itself to get the directory and link info, as I
reached the end of my know-how there. I bet someone else could do this
properly.

restoreLinks.ps1 reads these two text files and recreates the directory
structure and all the links. It needs admin privileges.

I guess one could run getLinkInfo.ps1 from BackupPC using
DumpPreUserCmd or similar, as sketched below. I haven't tried it though.
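
Untested, but the wiring in config.pl could look something like this
(host, user and script path are made up, and a working ssh and
powershell on the client are assumed):

    # hypothetical, untested: refresh dirs.txt/links.txt before each dump
    $Conf{DumpPreUserCmd} = '$sshPath -q -x backup@$host'
        . ' powershell -File C:/scripts/getLinkInfo.ps1 D:/data';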

It's kludgy as h*ll, but it works. There is no error handling, so use
this at your own risk. :)

Kind regards
/Erik

- getLinkInfo.ps1 -
# Parse the output of a localized "dir /al /s" for directories and
# symlinks. "Inneh*ll" wildcard-matches the Swedish directory header
# line ("Innehåll i katalogen ..."); adjust for other languages.
clear-host
$list = cmd /c dir $args[0] /al /s
$newlinklist = ""
$newdirlist = ""
for ($row = 3; $row -le ($list.length-5); $row++) {
    # directory header line: remember the current directory
    if ($list[$row].split(" ")[1] -like "Inneh*ll") {
        $curdir = $list[$row].substring(22)
        $newdirlist = $newdirlist + $curdir + "`r`n"
        $curdir
    }
    # SYMLINK entries: record as "dir\name%target", one per line
    if ($list[$row].split(" ")[6] -like "symlink") {
        $curfile = $list[$row].substring(36, $list[$row].indexof("[")-36)
        $curtarget = $list[$row].substring($list[$row].indexof("[")+1,
            $list[$row].length-$list[$row].indexof("[")-2)
        $newlinklist = $newlinklist + $curdir + "\" + $curfile + "%" +
            $curtarget + "`r`n"
    }
}
$result = new-item links.txt -type file -force -value $newlinklist
$result = new-item dirs.txt -type file -force -value $newdirlist
-

- restoreLinks.ps1 -
# Recreate the directory tree first, then each "link%target" pair.
# Needs admin privileges (mklink requires them).
$linklist = get-content links.txt -encoding UTF8
$dirlist = get-content dirs.txt -encoding UTF8
for ($row = 0; $row -le ($dirlist.length-1); $row++) {
    md $dirlist[$row].split("%")[0]
}
for ($row = 0; $row -le ($linklist.length-1); $row++) {
    cmd /c mklink $linklist[$row].split("%")[0] $linklist[$row].split("%")[1]
}
-



[BackupPC-users] NTFS symbolic links

2011-01-01 Thread Erik Hjertén


  
  
Hello

I'm using BackupPC to back up a Windows Vista computer. I use some NTFS
symbolic links on it, and it seems that BackupPC transfers the file the
link points to rather than just the link itself. With this behaviour
I'm backing up some files twice. What I need is for BackupPC to just
transfer the link.

I guess this is a breeze when it comes to UNIX/Linux clients, but is it
possible on Windows clients as well?

I'm using rsyncd as the transfer method. BackupPC is running on an
Ubuntu server.

Any help on this would be appreciated, thanks.

Kind regards
/Erik

PS Happy new year to all! DS
  



Re: [BackupPC-users] NTFS symbolic links

2011-01-01 Thread Erik Hjertén

Denis Jedig wrote 2011-01-01 21:37:
> On 01.01.2011 20:46, Erik Hjertén wrote:
>
>> I'm using BackupPC to back up a windows Vista computer. I use some
>> NTFS symbolic links on it
>
> What you mean is probably not symbolic links but NTFS junctions, which
> behave more like UNIX hard links. I believe none of the current rsyncd
> versions will support junctions. The simple solution so far has been
> to exclude junctions from backup:
>
> http://www.cs.umd.edu/~cdunne/projs/backuppc_guide.html#7Vista

Thanks Denis, but I actually mean symbolic links under NTFS, created
with e.g. mklink [[/D] | [/H] | [/J]] link target; see here:
http://technet.microsoft.com/en-us/library/cc753194%28WS.10%29.aspx

Just excluding the links is not the best solution, as I really want to
back them up. I have a few thousand of them, so it would be hard work
to re-create them manually in case of horror/disaster. If I could
perhaps somehow save the links in a file, then in case of a needed
restore I could issue a command using the file? I'm at the end of my
knowledge here...

Or will some other transfer method in BackupPC take care of things?

Kind regards
/Erik








Re: [BackupPC-users] Outsourcing backup

2010-05-04 Thread Erik Hjertén

Les Mikesell wrote 2010-05-04 14:50:
> Inno wrote:
>> I have not a decent internet connectivity and I have more than 500 GB.
>
> With rsync the size doesn't matter much after you get the initial copy
> (which you might take on-site and carry over or let it run over a
> weekend). The bandwidth/time you need nightly would depend on the rate
> of change in the content.

"The size doesn't matter much" - is this true even if he runs fulls
every week? Or is it only true for the incremental runs?

/Erik








Re: [BackupPC-users] NAS

2010-04-07 Thread Erik Hjertén




David,

I think it depends on, as you say, the flavour. Here in Sweden these
NASes come with and without disks. I bought mine without disks, as I
had two Western Digital Caviar Green (WD10EACS) drives from my previous
setup. These disks are 5400 rpm.

Regards
/Erik

David Williams wrote 2010-04-07 14:03:
> Erik,
>
> Thanks for the feedback. That's helpful.
> What type of drives come with the NAS? Are they 7200rpm or 5400rpm?
> Or does that depend on which flavour of the Netgear NAS one buys?
>
> ___
> Dave Williams
> Check out our WebOS mobile phone app for the Palm Pre and Pixi:
> Golf Caddie | Golf Caddie Forum | Golf Caddie FAQ by DTW-Consulting, Inc.
>
> From: Erik Hjertén [mailto:erik.hjer...@companion.se]
> Sent: Tuesday, April 06, 2010 5:53 PM
> To: backuppc-users@lists.sourceforge.net
> Subject: Re: [BackupPC-users] NAS
>
> Hi David
>
> I have a ReadyNAS Duo [X-RAID] with two 1TB disks in RAID 1
> (mirrored). I believe it's close to what you are looking at. It's
> mounted as NFS in my Ubuntu server. I use this set-up in my home for
> personal as well as business use. It has worked flawlessly since I
> installed it, I think it was in October last year. I back up several
> clients to it, as well as the Ubuntu server itself. A full backup
> typically reaches a speed of 8 MB/s, but I haven't made any effort to
> tune the setup, I just plug and pray ;-) . I also store some larger
> files on it and I stream video from it as well. Works great.
>
> What do you want to know?
>
> Regards
> /Erik
>
> David Williams wrote 2010-04-06 23:11:
>> I currently have a Western Digital 1TB My Book World device which
>> seems like it's starting to play up, which is a shame because I've
>> hacked it (it's basically a Linux server) so that I could mount the
>> drive as NFS under Linux and perform backups using Backuppc. I
>> haven't looked into the problem with the drive too much, but after a
>> few days of it being mounted I can't cd into the mount partition of
>> the drive :-(
>>
>> I'm looking to get a replacement which will mainly be used to back up
>> various mount points on my home Linux server. I haven't done too much
>> research yet but need something that is on the cheap side, is
>> compatible with Linux and preferably will let me format the drives to
>> ext3 too, or at least mount the drives such that I can perform
>> backups using Backuppc, which requires hardlinks.
>>
>> I took a very quick look around and saw the Netgear RND2210-100NAS
>> which looks pretty decent (states that it's compatible with Linux)
>> and has 2 drives. Looks like it already comes with 2 x 1TB drives (no
>> idea what type of drives though, if it does indeed come with 2 HDs)
>> and this would be great as I could use the 1TB drive for my backups
>> and the other to hold my photos, music and videos, which would free
>> up some space on my Linux server.
>>
>> Has anyone ever used (or is currently using) the above Netgear device
>> and could provide feedback?
>>
>> If I can't find a decent NAS then I'll have to buy several HDs and
>> re-config my Linux server with much larger drives.
>>
>> ___
>> Dave Williams
>> Check out our WebOS mobile phone app for the Palm Pre and Pixi:
>> Golf Caddie | Golf Caddie Forum | Golf Caddie FAQ by DTW-Consulting, Inc.


Re: [BackupPC-users] NAS

2010-04-06 Thread Erik Hjertén




Hi David

I have a ReadyNAS Duo [X-RAID] with two 1TB disks in RAID 1 (mirrored).
I believe it's close to what you are looking at. It's mounted as NFS in
my Ubuntu server. I use this set-up in my home for personal as well as
business use. It has worked flawlessly since I installed it, I think it
was in October last year. I back up several clients to it, as well as
the Ubuntu server itself. A full backup typically reaches a speed of 8
MB/s, but I haven't made any effort to tune the setup, I just plug and
pray ;-) . I also store some larger files on it and I stream video from
it as well. Works great.

What do you want to know?

Regards
/Erik

David Williams wrote 2010-04-06 23:11:
> I currently have a Western Digital 1TB My Book World device which
> seems like it's starting to play up, which is a shame because I've
> hacked it (it's basically a Linux server) so that I could mount the
> drive as NFS under Linux and perform backups using Backuppc. I haven't
> looked into the problem with the drive too much, but after a few days
> of it being mounted I can't cd into the mount partition of the drive
> :-(
>
> I'm looking to get a replacement which will mainly be used to back up
> various mount points on my home Linux server. I haven't done too much
> research yet but need something that is on the cheap side, is
> compatible with Linux and preferably will let me format the drives to
> ext3 too, or at least mount the drives such that I can perform backups
> using Backuppc, which requires hardlinks.
>
> I took a very quick look around and saw the Netgear RND2210-100NAS
> which looks pretty decent (states that it's compatible with Linux) and
> has 2 drives. Looks like it already comes with 2 x 1TB drives (no idea
> what type of drives though, if it does indeed come with 2 HDs) and
> this would be great as I could use the 1TB drive for my backups and
> the other to hold my photos, music and videos, which would free up
> some space on my Linux server.
>
> Has anyone ever used (or is currently using) the above Netgear device
> and could provide feedback?
>
> If I can't find a decent NAS then I'll have to buy several HDs and
> re-config my Linux server with much larger drives.
>
> ___
> Dave Williams
> Check out our WebOS mobile phone app for the Palm Pre and Pixi:
> Golf Caddie | Golf Caddie Forum | Golf Caddie FAQ by DTW-Consulting, Inc.



Re: [BackupPC-users] rsyncd on vista 64 bit

2009-12-09 Thread Erik Hjertén
Kameleon wrote:
> I am trying to set up the standalone rsyncd from the backuppc
> downloads page on a 64-bit Vista machine. I have done it already on
> about 5 32-bit machines. Only this one fails to start the service. I
> see no error other than it tries to run and then nothing. Has anyone
> else run into this issue and found a workaround? I don't want to use
> smb if I can help it. Thanks in advance.

I'm running the cygwin bundled with Deltacopy on Vista 64. I'm not sure
if the Deltacopy team altered the cygwin DLLs in some way, but it works
very well. I'm doing daily backups to a Linux-based server running
BackupPC via rsyncd. Deltacopy can be found here:
http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp

Kind regards
/Erik





[BackupPC-users] Restore via rsync on Vista client

2009-10-07 Thread Erik Hjertén

Hello all!

I have had a disk crash and now I need to restore my files. Restoring
as a ZIP file works fine, but I'm also interested in restoring directly
with rsync, and I can't get that to work. All that is created is an
empty folder where I chose to restore the files. The log says all is
fine:

2009-10-07 20:21:51 restore started below directory erikdata to host eddie
http://kakrafoon.dyndns.org/backuppc/index.cgi?host=eddie
2009-10-07 20:22:11 restore 1 complete (80 files, 10326140 bytes, 0 dirs,
0 xferErrs)

My setup:
BackupPC 3.0.0 on Ubuntu 8.04
Client is Windows Vista with Deltacopy 1.3 for rsync functions.

I have looked into the rights for the Deltacopy service running on the
client. It's running as the local system, and the user SYSTEM has all
rights on the created folder. My guess is that this should be
sufficient.
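
For concreteness, these are the kinds of checks I mean, from an admin
cmd prompt (the service name and folder here are guesses, not from my
machine):

    rem show which account the service runs as (service name is a guess):
    sc qc "DeltaCopy Server"
    rem show who has rights on the restore target folder:
    icacls C:\restore\target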


Any thoughts, hints, solutions?

Thanks and kind regards
/Erik


Re: [BackupPC-users] Backup via rsyncd is very slow

2009-09-21 Thread Erik Hjertén




Palmer, David W. wrote:
> Really???
>
> What version are you using?

If it helps: I'm running Deltacopy 1.3, which bundles rsync for
Windows, under Vista and XP on different clients. I'm running backups
with the rsyncd method, and a typical full backup lasts about 145
minutes for about 52 GiB and approx 13 000 files. A typical incremental
is under 10 minutes, 2 GiB and say 10 to 500 files. Works beautifully.
You'll find Deltacopy here:
http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp It installs as a
service under Windows if you want it to.

Hope this helps. Kind regards
/Erik


> David
>
> -----Original Message-----
> From: Michael Stowe [mailto:mst...@chicago.us.mensa.org]
> Sent: Monday, September 21, 2009 11:21 AM
> To: General list for user discussion, questions and support
> Subject: Re: [BackupPC-users] Backup via rsyncd is very slow
>
> FWIW, I've found that different builds of rsync/d and Cygwin differ
> *dramatically* in their performance on Windows.
>
> By dramatically, I mean after switching out my rsync binaries, a
> representative full backup went from 220 minutes to 95, and a
> representative incremental backup went from 200 minutes to 7. Yes,
> seven minutes, and the only thing that changed was the rsync binary on
> the Windows system.
>
>> Hello Everyone,
>>
>> I am currently trying to back up a Windows 2008 file server via
>> rsyncd. I am having an issue with performance. The data on the server
>> is around 1 TB, which is all user files. I have been trying to run a
>> full backup of the server for several days without success. The
>> backup ran for over 72 hours and only backed up around 200 GB.
>>
>> * The server is on a gigabit LAN with teamed NICs.
>> * The data is being written to a SAN with 12 spindles.
>> * I have already added another NIC and more memory to the file
>>   server.
>>
>> I have left the compression at 3 but I have plenty of processing
>> power in both machines. Any ideas?
>>
>> David


Re: [BackupPC-users] rsyncd on Vista 64-bit cygwin vs SUA

2009-08-18 Thread Erik Hjertén
Koen Linders wrote:
> I don't know what you mean with the SUA environment, but I use
> Deltacopy on Vista 64-bit via rsyncd.
>
> http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp
>
> Works without a problem atm. Easy to use, and you can copy the files
> to other computers and easily register the service.

I second that. Works like a charm.

Cheers
/Erik



Re: [BackupPC-users] Incremental backup takes all files

2009-06-14 Thread Erik Hjertén

Matthias Meyer wrote:
>> -----
>> incr backup started back to 2009-06-13 10:59:37 (backup #341) for
>> directory data
>> Connected to eddie:873, remote version 29
>> Negotiated protocol version 28
>> Connected to module data
>> Sending args: --server --sender --numeric-ids --perms --owner --group
>> -D --links --times --block-size=2048 --recursive -D . .
>> Checksum seed is 1244886594
>> Got checksumSeed 0x4a337642
>> Got file list: 11348 entries
>> Child PID is 13925
>> Xfer PIDs are now 13925
>> Sending csums, cnt = 11348, phase = 0
>>   create d 700 4294967295/4294967295    4096 .
>>   create d   0 4294967295/4294967295       0 2c8
>>   create d 700 4294967295/4294967295       0 2c8/Bliwa
>>   create d 700 4294967295/4294967295       0 2c8/Bliwa/data
>>   create d 700 4294967295/4294967295       0 2c8/Bliwa/data/log
>>   pool     700 4294967295/4294967295      48 2c8/Bliwa/data/log/log.ctrl
>>   pool     700 4294967295/4294967295 1048576 2c8/Bliwa/data/log/log3.dat
>>   pool     700 4294967295/4294967295      48 2c8/Bliwa/data/log/logmirror.ctrl
>>   create d 700 4294967295/4294967295   81920 2c8/Bliwa/data/seg0
>>   pool     700 4294967295/4294967295    8192 2c8/Bliwa/data/seg0/c10.dat
>> ...
>> -----
>>
>> Also noticeable is the line "incr backup started back to 2009-06-13
>> 10:59:37 (backup #341) for directory data" above. Does this mean this
>> run, 342, is compared to 341?
>
> Yes
>
>> And if so, would that explain this behaviour?
>
> What do you mean?

My thought here was that the files not backed up in run 341 were
instead backed up in run 342 just because they weren't in 341, and not
because any attribute had changed. My intuition told me that an
incremental run should be compared to the latest full, and not to the
latest incremental as it seems to be in this case. That's why I reacted
to this line in the log. Files skipped in 341 are marked "pool" in 342.

> You are sure that nothing was changed at this file(s) (timestamps,
> attributes, owner)?
> e.g.   pool 700 4294967295/4294967295  48 2c8/Bliwa/data/log/log.ctrl
> has access-rights 700
> owner 4294967295
> group 4294967295
> size 48 bytes

Yes. The same file from 341:

  skip 700 4294967295/4294967295  48 2c8/Bliwa/data/log/log.ctrl

And I also think it's not very likely that exactly the files that
weren't backed up in the last run have all changed.

> Do you use --time as rsyncd parameter?

Yes, if you mean --times; see the head of the log example above.

As I said before, $Conf{IncrLevels} is set to blank. Having a null
value for this parameter is not described in the documentation, afaik,
so I'm a bit suspicious about that. I tried setting the IncrLevels to 1
and now I see perfect behaviour in my tests. Also, in the summary the
level for the incremental runs has the value ARRAY(0x8948138), which
doesn't make any sense to me. Here is the full table:

Backup#  Type  Filled  Level             Start Date  Duration/mins  Age/days  Server Backup Path
336      full  yes     0                 7/6 11:30   223.6          7.0       /var/lib/backuppc/pc/eddie/336
337      incr  no      ARRAY(0x8947f0c)  10/6 11:32  8.0            4.0       /var/lib/backuppc/pc/eddie/337
338      incr  no      ARRAY(0x8947f0c)  11/6 11:32  173.9          3.0       /var/lib/backuppc/pc/eddie/338
339      incr  no      ARRAY(0x8947ee8)  12/6 11:35  12.9           2.0       /var/lib/backuppc/pc/eddie/339
340      incr  no      ARRAY(0x89480e4)  12/6 21:07  223.2          1.6       /var/lib/backuppc/pc/eddie/340
341      incr  no      ARRAY(0x8948138)  13/6 10:59  7.3            1.0       /var/lib/backuppc/pc/eddie/341
342      incr  no      ARRAY(0x8948138)  13/6 11:16  175.2          1.0       /var/lib/backuppc/pc/eddie/342

So it seems the IncrLevels parameter solves this for me, but I have to
run some real backups to verify that.
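
For the archives, the one-line change in config.pl that made the
difference for me (blank/empty is what I had before, and what showed up
as the ARRAY(...) levels above):

    $Conf{IncrLevels} = [1];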

Thanks
/Erik


Re: [BackupPC-users] Incremental backup takes all files

2009-06-14 Thread Erik Hjertén
Matthias Meyer wrote:
> I'm with you. But your incrementals will back up all files changed
> since the last full. Do you want that?

Not necessarily, but now it behaves as expected.

> I make a full backup every 7 days and use $Conf{IncrLevels} = [1, 2,
> 3, 4, 5, 6];
> So the first incremental saves all files changed since the last full.
> All following incrementals will save changed files since the last
> incremental backup.
> The famous backuppc GUI will fill each incremental backup with the
> files not backed up since the last full (sorry for my bad english;-).
> So each incremental appears in the GUI like a full backup.

Sounds like a good plan. I'll read up on IncrLevels this evening. That
and a glass of single malt will do the trick ;-) Thanks for your help.
/Erik



Re: [BackupPC-users] Incremental backup takes all files

2009-06-13 Thread Erik Hjertén

> Please set $Conf{XferLogLevel} = 2;
> Then check the XferLOG after a backup.
> Look for:
>   create = new for this backup
>   pool   = found a match in the pool
>   same   = file is identical to previous backup
>   skip   = file skipped in incremental because attributes are the same
>
> But at first follow the hint from Les.
>
> br
> Matthias
>
> Now it is time for the hint above.
> You also should check and compare the permission-mask and owner/group,
> reported by the XferLOG.

I've set the log level to 2 and made a manual incremental run, #342.
This is a small portion of the log:

-----
incr backup started back to 2009-06-13 10:59:37 (backup #341) for
directory data
Connected to eddie:873, remote version 29
Negotiated protocol version 28
Connected to module data
Sending args: --server --sender --numeric-ids --perms --owner --group
-D --links --times --block-size=2048 --recursive -D . .
Checksum seed is 1244886594
Got checksumSeed 0x4a337642
Got file list: 11348 entries
Child PID is 13925
Xfer PIDs are now 13925
Sending csums, cnt = 11348, phase = 0
  create d 700 4294967295/4294967295    4096 .
  create d   0 4294967295/4294967295       0 2c8
  create d 700 4294967295/4294967295       0 2c8/Bliwa
  create d 700 4294967295/4294967295       0 2c8/Bliwa/data
  create d 700 4294967295/4294967295       0 2c8/Bliwa/data/log
  pool     700 4294967295/4294967295      48 2c8/Bliwa/data/log/log.ctrl
  pool     700 4294967295/4294967295 1048576 2c8/Bliwa/data/log/log3.dat
  pool     700 4294967295/4294967295      48 2c8/Bliwa/data/log/logmirror.ctrl
  create d 700 4294967295/4294967295   81920 2c8/Bliwa/data/seg0
  pool     700 4294967295/4294967295    8192 2c8/Bliwa/data/seg0/c10.dat
...
-----

Almost all files are marked "pool" and all directories are marked
"create". There are files marked "skip" and some "same", but only a
few. The duration of this run was 175 minutes.

Before this I made another manual incremental run (with log level 1),
#341. That run has almost all files marked "skip" and only a few files
marked "pool". All directories are also marked "create". That run
lasted for 7 minutes.

Also noticeable is the line "incr backup started back to 2009-06-13
10:59:37 (backup #341) for directory data" above. Does this mean this
run, 342, is compared to 341? And if so, would that explain this
behaviour? It seems that files in run 341 which are marked "create" or
"pool" are skipped in 342, and files skipped in 341 are marked "pool"
in 342.

Also a bit peculiar is the fact that all directories are marked
"create", but this is perhaps normal?

/Erik - Now confused



[BackupPC-users] Incremental backup takes all files

2009-06-12 Thread Erik Hjertén
Hi all

I'm using Backuppc on an Ubuntu server and Deltacopy on Windows
clients. The incremental backup runs include all files, but only every
other day. The backups on the local host (the Ubuntu machine) seem fine
and only include all files on full runs. What can cause this, and how
can I fix it?

Thanks!
/Erik

Example from the Backuppc summary:

Host eddie, Windows Vista
Backup#  Type  #Files  Size/MB
330      full  10688   46450.7
333      incr  83      989.3
334      incr  10693   46653.7
335      incr  1787    1442.2
336      full  12435   47538.2
337      incr  107     1319.4
338      incr  12429   47477.1
339      incr  106     1319.6

Host hotblack, Windows XP
Backup#  Type  #Files  Size/MB
246      full  47339   26320.9
248      incr  46928   25978.3
249      incr  835     1790.9
250      incr  47289   26272.6
251      full  47648   26367.8
252      incr  655     1436.9
253      incr  47814   26361.2
254      incr  1139    1941.8

Local host, Ubuntu
Backup#  Type  #Files  Size/MB
498      full  81566   50782.2
500      incr  14      1.4
501      incr  0       0.0
502      incr  0       0.0
503      incr  57      16.6
504      incr  16      0.7
505      full  81672   50872.0
506      incr  337     6.7





Re: [BackupPC-users] Incremental backup takes all files

2009-06-12 Thread Erik Hjertén

Les Mikesell wrote:
>> I'm using Backuppc on an Ubuntu-server and Deltacopy on windows
>> clients.
>
> How do you call DeltaCopy from the Backuppc Server?

With rsync: XferMethod = rsyncd.

>> The incremental backup runs include all files, but only every other
>> day.
>
> What did you mean with "but only every other day"?

Well, not every incremental run backs up all files. For example on host
eddie (see below), incremental backup #333 is OK but #334 is not: it
touches all files, and I believe it shouldn't? I mean, I know all of
these files haven't changed.

> Unless you are doing different incremental levels, the two runs should
> be exactly the same from the server's perspective, doing the rsync
> comparison against the same full run as the base. Can anything have
> changed on the files (timestamps, owner, etc.)? I'm not sure what
> metadata it actually checks on a windows client or whether there is
> anything different about DeltaCopy in this respect, but rsync will
> show those differences as an update without spending any time copying
> the unchanged content. But even then it doesn't make sense that #335
> is normal again, unless you have set incremental levels, since it
> should still be checking against the last full.

I find it highly unlikely that all of the files have changed, as this
is a recurring pattern which is the same on these two clients. The only
thing I can think of is that Deltacopy (or rsync) misinterprets
something, perhaps after a reboot or sleep on the client, but that
seems also unlikely.

$Conf{IncrLevels} is left blank in the config file. The default is 1, I
believe. Could that cause the problem?

Also, the setting for XferMethod on localhost is tar, not rsyncd, which
might explain why there is no problem with localhost. So it seems that
rsync is the lowest common denominator.

Regards
/Erik