Evren Yurtesen wrote:
I am saying that it is slow. I am not complaining that it is crap. I
think when something is really slow, I should have the right to say so, right?
There is such a thing as tact. Many capable and friendly people have
been patient with you, and you fail to show any form of
nilesh vaghela wrote:
> The other 25% of PC backups are dead slow. The data transfer runs at 20 kbps.
> I found a few things that might cause the problem.
>
> 1. Spaces within directory names. (I do not know for sure, but it seems so)
> 2. Tree structure
> 3. A single quote (') in a directory name causes problems.
>
> Presently we
Evren Yurtesen wrote:
                 Totals                    Existing Files    New Files
Backup#  Type   #Files   Size/MB  MB/sec   #Files  Size/MB   #Files  Size/MB
245      full   152228   2095.2   0.06     152177  2076.9    108     18.3
246      incr   118      17.3     0.00     76      0.
Jesse Proudman wrote:
> I've got one customer whose server has taken 3600 minutes to
> back up. 77 Gigs of Data. 1,972,859 small files. Would tar be
> better or make this faster? It's directly connected via 100 Mbit to
> the backup box.
>
>
First, determine your bottleneck. Is it dis
Evren Yurtesen wrote:
> Jason Hughes wrote:
>> That drive should be more than adequate. Mine is a 5400rpm 2mb
>> buffer clunker. Works fine.
>> Are you running anything else on the backup server, besides
>> BackupPC? What OS? What filesystem? How many fil
Use Rsyncd. It runs as a service on each client box as root (or some
other user with appropriate disk privileges), and the backuppc client
gains no user privileges on the client box, rather it communicates to
retrieve data. There is no real client push model for BackupPC, only
protocols with l
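For reference, a minimal rsyncd.conf on the client might look something
like this (a sketch; the module name, user, and paths are invented for
illustration, not taken from the original post):

    uid = root
    use chroot = no

    [wholedrive]
        path = /
        auth users = backuppc
        secrets file = /etc/rsyncd.secrets
        read only = yes

BackupPC then authenticates as the 'auth users' account and pulls data
through the module, without ever getting a shell login on the client.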
Evren Yurtesen wrote:
>> And, you could consider buying a faster drive, or one with a larger
>> buffer. Some IDE drives have pathetically small buffers and slow
>> rotation rates. That makes for a greater need for seeking, and worse
>> seek performance.
>
> Well this is a Seagate Barracuda 720
Evren Yurtesen wrote:
> I know that the bottleneck is the disk. I am using a single IDE disk to
> take the backups, only 4 machines and 2 backups running at a time (if I
> remember correctly).
>
> I see that it is possible to use raid to solve this problem to some
> extent but the real solu
This has been discussed before, several times. Most of the
recommendations say, in no particular order:
  * drop your compression level,
  * don't use RAID5,
  * try using something other than rsync,
  * reduce the number of simultaneous backups,
  * increase memory on the server,
  * etc...
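In config.pl terms, the first few of those boil down to something like
this (a sketch; the values are illustrative, not recommendations for
any particular setup):

    $Conf{CompressLevel} = 1;      # or 0 to turn compression off
    $Conf{MaxBackups}    = 1;      # fewer simultaneous backups
    $Conf{XferMethod}    = 'tar';  # something other than rsync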
For what it's
Michael Mansour wrote:
I'm wondering why the full backups numbered 2 are not going back
down to 1 to free up some space on the server?
In the global Schedule, I have the following:
FullPeriod: 6.97
FullKeepCnt: 1
FullKeepCntMin: 1
FullAgeMax: 7
and it's my understanding that backuppc should cy
Frej Eriksson wrote:
I sent an e-mail to the list last week and got good answers, so now I
have tested BackupPC for a short time; the result has been satisfying.
But as always, some new questions have popped up. Let's presume that the
server that runs BackupPC and stores all backed up data crashes.
You cannot create hardlinks on an FTP site. BackupPC won't really do
what you want without that.
JH
Henrik Genssen wrote:
> Hi,
>
> I have some FTP storage at my provider as backup space.
> On my server I have backuppc installed to backup some VMs.
>
> Is it possible to use that ftp-device as
Rick,
This may actually be something like what I'd run into a while back, but
with read permissions and on XP Pro. For whatever reason, using Samba,
I could not get some directories to back up properly unless I set a
password on the account I was using to access the shares remotely. I
verifi
Michael Mansour wrote:
> I'm wondering why the full backups numbered 2 are not going back down to 1 to
> free up some space on the server?
>
> In the global Schedule, I have the following:
>
> FullPeriod: 6.97
> FullKeepCnt: 1
> FullKeepCntMin: 1
> FullAgeMax: 7
>
> and it's my understanding that ba
Peter,
For testing purposes, you may reduce the alarm period, but under
practical circumstances, it must be large enough that it doesn't cut off
backups that would finish, had they been given the time to collect
enough file information. The behavior also depends on the transport
mechanism yo
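The setting in question is the client timeout; for instance (if I
remember right, 72000 seconds is the 3.x default, and you raise it when
large file trees need longer to enumerate):

    $Conf{ClientTimeout} = 72000;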
John,
IMO, the point behind BackupPC is to use cheap, easily upgradeable disk
media to make backups available and easy. That kind of steers me in the
direction of several low-end backup servers, either with separate
storage or all sharing a big fat fiber channel NAS. Buying a high end
machin
You might not get a response that helps you here. This list is
specifically for supporting backuppc users, and your question is
regarding a 3rd party plugin for some other system entirely. Check any
readme files provided with the plugin to locate the author. I couldn't
easily find the contac
If the whole share is empty, that is considered indistinguishable from a
general failure. You can control that with
$Conf{BackupZeroFilesIsFatal}. Check the docs for more details.
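For example, in that host's per-PC config:

    $Conf{BackupZeroFilesIsFatal} = 0;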
JH
Brendan Simon wrote:
> I'm getting a fatal error when backing up an empty directory.
> BackupPC server is runn
OverlordQ wrote:
> The Unicode versions of several functions permit a maximum path length
> of approximately 32,000 characters composed of components up to 255
> characters in length. To specify that kind of path, use the "\\?\" prefix.
>
> http://msdn2.microsoft.com/en-us/library/aa365247.aspx
>
>
All versions of Windows have a limit of roughly 260 characters
(MAX_PATH) for a full path, including the filename and extension,
regardless of file
system. I'm not aware of a lower limit imposed by the file system or
OS, but it's likely related. Are you running Unicode-16 character set
or UTF-16 on
Jim,
Here is a snippet from my exclude list, which works using rsyncd on a
Win2k box:
/Documents and Settings/*/Local Settings/Temporary Internet Files/*
The spaces are not a problem, for me at least. But, I did have
considerable difficulty getting rsyncd.conf to behave when I placed the
sh
Tareq,
That error means you logged in, but for some reason (usually permissions
problems), your logged in user cannot see those files.
Have you tried to log into that share manually on your Linux box, using
a similar command line? You can leave off a few flags and just log in
to poke around.
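Something along these lines, substituting your own host, share, and
user (just an illustration, not your actual values):

    smbclient //yourhost/yourshare -U youruser

If that fails, or logs in but shows an empty listing, the problem is
permissions on the Windows side rather than anything in BackupPC.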
Jeff Schmidt wrote:
On Fri, 2007-02-23 at 16:47 -0600, Jason Hughes wrote:
Basically, I'm looking for a quick way to find the filenames and/or
backup numbers I should use to get at all versions of a particular file.
how 'bout a commandline browser?
something like:
li
Hey all,
I suddenly had an urge to do some work on a particular configuration
file and wanted to determine all the changes that had occurred to it
over the lifetime of its backups. Is there a simple command line tool
that shows all the revisions that have been transferred to my pool,
either p
Jason B wrote:
> However, the transfer always times out with signal=ALRM.
>
[...]
> Somewhat unrelated, but of all these attempts, it hasn't ever kept a
> partial - so it transfers the files, fails, and removes them. I have
> one partial from 3 weeks ago that was miraculously kept, so it keeps
>
Jim McNamara wrote:
I am having a problem with rsyncd between backuppc and a remote
windows box running the rsync package provided on the backuppc page at
sourceforge. I installed a clean version of backuppc 3.0.0, moving all
the old backups from 2.1.1 to an alternate machine should I need them
Nils Breunese (Lemonbit) wrote:
> Any ideas on how we can reduce the load? More/less nightly jobs? Less
> concurrent backups? Other tips? We used to backup 15 servers onto one
> BackupPC server, but now almost all of our backups are failing and the
> load is through the roof. Can we just go and
Perl likes this:

$string = 'Hello ' . `executethis` . " test\n";

Quote the literal pieces (single quotes, or double quotes where you
need \n interpolated) and use dot-concatenation to splice the command
output in between. Backticks are not executed inside a quoted string.
I didn't try what you have below, but I did notice that backticks
weren't being executed if t
Rob Shepherd wrote:
> Thanks for the reply.
>
> Forgive my ignorance, but if the files are not in a "direct access"
> format, then how does rsync work?
> rsync compares local and remote file trees before sending deltas etc.
>
> Does the rsync perl module do some translation magic or somesuch?
>
You might want to check that your perl is in /bin/perl. It's probably
in /usr/local/bin/perl or /usr/bin/perl. Simply change the line at the
top of the script to whatever this command tells you:
which perl
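For example, if 'which perl' prints /usr/bin/perl, the first line of
the script should read:

    #!/usr/bin/perl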
Hope that helps,
JH
Dienelt Václav wrote:
>> Hello,
>> I have a problem with startin
Hi Craig,
First and foremost, love the software. Thanks so much.
I noticed that in 3.0.0, if you try to edit any of the config from the
web interface, it refuses to save for me on the basis that ParPath
points to a utility that I do not have installed. Funny thing is, I
don't recall par bein
Could be many things:
  - Make sure you have added the tags in your httpd.conf that point to
    BackupPC.
  - Make sure you have restarted httpd so it reads the config.
  - Check that your htpasswd file has been created for authorization
    purposes. This file will contain all the 'users' that can use
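For that last point, something like this creates the file and the first
user (adjust the path to wherever your Apache config's AuthUserFile
points; use -c only the first time, since it truncates an existing
file):

    htpasswd -c /etc/BackupPC/htpasswd someuser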
Dave Fancella wrote:
> I went ahead and did this for now, but it's still not quite the right
> solution. The laptop is dhcp because it periodically goes wardriving, so a
> solution where I can have it dhcp is best. :) Still, it'll be some months
> again before I might need it to leave the hou
When I say 3 different locations, I don't mean 3 different floors of
the same building. I mean three different client sites, miles apart,
with completely different *everything*, including network hardware
brand. Some of them are HP ProCurve switches (our preferred brand)
but nowhere near al
Timothy J. Massey wrote:
> [EMAIL PROTECTED] wrote on 02/01/2007
> 12:22:18 AM:
>
> > Timothy J. Massey wrote:
> >
> > > rsync: read error: No route to host
>
This one would concern me most. I thought there was a note somewhere in
the docs that says clients should have reverse DNS set up f
James Ward wrote:
> it looks like they're going to all get started at the same time again
> due to waiting on the nightly process to complete after the longest
> of these backups.
>
> Does version 3 get me away from this scenario?
Yes. Version 3 doesn't need nightly processing to be mutually
Willem Viljoen wrote:
> I have inserted the username and password required to make backups and
> it works, full and incremental. When turning "Use simple File Sharing"
> off, incremental backups fail with the message: "backup failed (session
> setup failed: NT_STATUS_LOGON_FAILURE)". My printer
[EMAIL PROTECTED] wrote:
> Maybe I shouldn't chime in, because I've only been half following this
> thread, but I can't help wondering if you've looked into all the
> firewall/timeout possibilities? Sometimes those settings get hosed during
> an upgrade too.
>
>
Not a bad thing to look into. I
Simon Köstlin wrote:
> I think TCP is a safer connection, or does that not play a role?
> Also, when I click on a PC in the web interface it takes around 20-30 seconds
> until the web page appears with the backups that were made. I thought that
> would be better with another connection. But that time i
All of my excludes look like this:
$Conf{BackupFilesExclude} = ['/proc', '/var/named/chroot/proc', '/mnt',
'/sys', '/media'];
$Conf{XferMethod} = 'rsyncd';
$Conf{RsyncShareName} = 'wholedrive';
They seem to work fine. I'm using 3.0.0beta3. Is your rsync share name
correct? Shouldn't your ex
Clemens von Musil wrote:
when version 3 turns from beta to stable, will it be possible to
migrate an existing system, with file pool etc., to the newer
version?
Can you already outline what I will need to do for the migration?
You mostly just download and install over the existing
Byron Trimble wrote:
> All,
>
> All of a sudden, none of my backups (rsync) are working. I'm getting "Unable
> to read 4 bytes" for each backup. Any insight?
>
>
>
I had this happen to me when I had an old File::RsyncP version using
protocol 26 trying to connect to rsyncd that was at proto
As silly as it may sound, I have had some success using VirtualPC or
VMware or similar PC simulators rather than trying to restore a Windows
PC from scratch. The beauty of it is, you can have several sitting
around on the hard drive of the host OS, and when one crashes and burns
(as Windows inevi
Arlequín wrote:
> Hello, David.
>
> I use a stand alone rsync + cygrunsrv install.
> The service rsync.exe is reported as running by user SYSTEM.
>
> SYSTEM has all the perms activated on directory
> C:\Documents and Settings\jdoe\Desktop
>
> But I'm getting chdir failed when rsync'ing.
> rsync -av
Joe Casadonte wrote:
> Using 3.0.0beta3, backup client is WinXP Pro via rsyncd.
>
> I have an 80 GB USB hard drive that I'd like to back up if it's
> connected. If it's not, then I'd like the rest of the laptop backed
> up. I have 'BackupZeroFilesIsFatal' unchecked. Here's what I get in
> the lo
The BackupPC system is a server-pull model. There is no such thing as a
missed backup because the server keeps the schedule. If the server is
down, the backups will run as soon as they are allowed to run (taking
into account blackout periods and minimum uptime requirements). Making
two or mo
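Blackout periods, for reference, are configured per server or per PC;
the stock config ships with something like this (a sketch of the
structure, not a recommendation):

    $Conf{BlackoutPeriods} = [
        { hourBegin => 7.0, hourEnd => 19.5, weekDays => [1, 2, 3, 4, 5] },
    ];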
This was happening to me when I was using rsyncd and File::RsyncP on the
server that ran protocol version 26. Upgrading it to run protocol 28
with CPAN fixed my problem. You said ssh+rsync, not rsyncd tunneled
through SSH right? So maybe this doesn't apply to you.
JH
Randy Barlow wrote:
> H
From the documentation:
Other installation topics
*Removing a client*
If there is a machine that no longer needs to be backed up (eg: a
retired machine) you have two choices. First, you can keep the
backups accessible and browsable, but disable all new backups.
Alternatively, yo
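If memory serves, the "disable all new backups" part boils down to one
line in that host's per-PC config (negative FullPeriod values disable
scheduling):

    $Conf{FullPeriod} = -2;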
Unfortunately, yes.
What you might want to do is put some of the larger directories in the
BackupFilesExclude list for that client. Then, do a full backup.
After that backup succeeds, remove one of the excluded folders and
trigger another backup. Rinse, repeat.
This way you will populate
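As a sketch (the directory names are invented for illustration):

    # First full: keep the big directories out
    $Conf{BackupFilesExclude} = ['/bigdir1', '/bigdir2', '/bigdir3'];
    # After it succeeds, drop one exclude and back up again:
    $Conf{BackupFilesExclude} = ['/bigdir2', '/bigdir3'];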
Timothy J. Massey wrote:
> The C3 is slow. I get it. I already *knew* that. However, the
> performance numbers I posted demonstrate pretty clearly that the
> failure is not in a simple lack of CPU power, but in truly how *much*
> CPU power rsync demands. I get triple the performance in swit
Sorry, I'm not great at deciphering linux diagnostics (I'm relatively
new to it--a year or two), but I did a little poking around to see what
might be causing trouble. Wikipedia had these choice bits to say about
the C3 chip design:
C3
* Because memory performance is the limiting fa
You might consider doing a little Perl script rather than shell for the
formatting script. At least that way, you can launch the format command
as a pipe, read its output (the 11/25000 followed by a bunch of ^H
characters to back up over itself), parse it, then output something more
meaningful
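Something along these lines, as a rough sketch ('format_cmd' stands in
for whatever the real format command is):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Launch the command as a pipe and read its progress output.
    open(my $fh, '-|', 'format_cmd') or die "can't run format_cmd: $!";
    local $/ = "\010";                  # split the stream on ^H backspaces
    while (my $chunk = <$fh>) {
        $chunk =~ tr/\010//d;           # strip the backspaces themselves
        if ($chunk =~ m{(\d+)/(\d+)}) { # e.g. "11/25000"
            printf "%5.1f%% done\n", 100 * $1 / $2;
        }
    }
    close $fh;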
[EMAIL PROTECTED] wrote:
I routinely hit 100% CPU utilization on the Via C3 1GHz Mini-ITX
systems I use as backup servers. I will grant you that the C3 is not
the most efficient processor, but I am definitely CPU-limited. I too
have 512MB RAM, but the machines are not swapping. And that's
Holger Parplies wrote:
> Paul Harmor wrote on 01.01.2007 at 20:51:43 [[BackupPC-users] OK, how about
> changing the server's backuppc process niceness?]:
>
>> I have only 2 machines (at the moment) being backed up, but every time
>> the backups start, the server system slows to an UNUSEABLE cra
> I'm wondering now how to exclude things like "/proc" globally and per-PC.
>
You cannot exclude something globally, then exclude more per-PC. The
per-PC settings simply override whatever was set globally, since it's
just setting a perl variable. I suppose you could actually write perl
cod
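For instance, in the per-PC config file (a sketch that assumes the
global value is a plain array ref, not a per-share hash):

    # Keep the global exclude list and tack extra entries on
    push @{$Conf{BackupFilesExclude}}, '/proc', '/tmp';

That extends the global list instead of replacing it wholesale.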
is current enough?
>
> Thanks again,
> Jim
>
>
> On 12/14/06, Jason Hughes <[EMAIL PROTECTED]> wrote:
>> You may need to stop/restart the rsyncd service to make it read the
>> rsyncd.conf file on windows.
>>
>> I wanted to mention that there was some bug
You may need to stop/restart the rsyncd service to make it read the
rsyncd.conf file on windows.
I wanted to mention that there was some bug that I ran into (you're not
seeing it yet) when BackupPC was using protocol version 26 and
Windows was running rsyncd. You might want to update the serve
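If rsyncd is installed as a normal Windows service (the cygwin package
registers it under the name 'rsyncd', if I recall correctly), a restart
is just:

    net stop rsyncd
    net start rsyncd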
I had this happen to me. In my case, I had an old version of
File::RsyncP. If you go to cpan and type 'install File::RsyncP', it
will tell you if you are up to date or not. The older protocol (v.66 I
think) had a bug in it. I recall the new version is v.68 or v.69.
Adjusting the timeout wil
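A quick way to see what you currently have, before firing up cpan:

    perl -MFile::RsyncP -e 'print $File::RsyncP::VERSION, "\n"'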
For what it's worth, I started with 2.1.2pl2 stable and had per-machine
configs for each machine working fine. When I installed the 3.0.0beta3
as an upgrade OVER the existing install, it worked fine. Maybe there's
something different about the install script that differs between
upgrade and a
scartomail wrote:
> Hi Everyone,
>
> Let's say I've got this in my rsyncd.conf on my Windows box:
> [Foto]
> path = E:/FOTO
>
> Is there any way to add more directories to the path variable?
It wouldn't make much sense to do that, because multiple paths would
then need to be "merged" as a single
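What does work is defining one module per directory and listing them
all on the BackupPC side (the second module name and path are invented
for illustration):

    [Foto]
        path = E:/FOTO
    [Foto2]
        path = F:/MOREFOTO

and then, in that host's config:

    $Conf{RsyncShareName} = ['Foto', 'Foto2'];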
I have a Win2k box running the rsyncd package. It is over an 802.11g
link (about 1MB/s throughput when copying via windows shares manually,
but over rsync it's getting closer to 350k). Thus it takes about 40 or
so hours to backup the system. I've taken to excluding tons of stuff
just to get
elow and get
"NT_STATUS_ACCESS_DENIED". Again, If I change the command from
"/usr/bin/smbclient gandolf\\C$ -U backup" to "/usr/bin/smbclient
gandolf\\Eric -U backup" it connects to the Eric folder just fine.
I don't know that much about sharing and
Are you missing a double-quote on the AuthName line? That might confuse
the parser, causing who knows what problems.
JH
Krsnendu dasa wrote:
> AuthName "BackupPC
>
>
>
I had no luck with getting rsyncd on Windows to work with spaces in
filenames through the config file. I resorted to using the 8.3
filenames instead, ie:
secrets file = c:/Progra~1/rsyncd/rsyncd.secrets
The easy way to find them is to do a "dir /x" and you get both the long
and short names.
H
Eric,
You may need to do two things... Are you creating a user on the Windows
box that is a member of the Backup Operators group? You can either do
that, or use the administrator account. In either case, you will
probably want to use a username that exists on the Windows machine that
*requi
You can do this:
cpan
install Compress::Zlib
It should either fetch and compile the perl module, or tell you that it
is already up to date.
JH
Ariyanto Dewo wrote:
> Hi all,
> Thanks for the response to my last message 'backup
> backuppc'; I was able to
> figure out how to make it work. But now I have a pro
Fabio,
Usually, when it complains that Apache can't connect to BackupPC, it's
because BackupPC isn't running. You can log in as root, then do
'service backuppc restart' and see what it does. It should shut down
first, then start. I expect it to say 'failed' when trying to shut
down, becaus
Craig Barratt wrote:
> Jason writes:
>
>
>> Since I finally got 2.1.2pl2 working, I decided to upgrade to 3.0.0beta2
>> (a glutton for punishment, I am). Everything went swimmingly until I
>> tried to look at any logs or view the config files either for clients or
>> the general system, via
Since I finally got 2.1.2pl2 working, I decided to upgrade to 3.0.0beta2
(a glutton for punishment, I am). Everything went swimmingly until I
tried to look at any logs or view the config files either for clients or
the general system, via the CGI interface.
Here's what I get from the web serve
scartomail wrote:
> I cannot log in with a normal user.
> The only user I can log in with at http://backuppc-server/backuppc/ is
> "backuppc"?
> [...]
> Still unable to log on to backuppc.
> I did notice the file /etc/backuppc/htpasswd, in which only the user
> "backuppc" is mentioned.
> Here mi
I was using rsync and was consistently getting BackupPC_dump to
segfault, leaving an orphaned child process doing the rsync that would
never terminate. So I switched to rsyncd. Same business, except now I
have an rsyncd log on the client machine. Segfaults.
So I upgraded my File::RsyncP on a
Using 2.1.2pl2:
I recently switched from using rsync (because it was getting a few
hundred megs into a backup then giving me an (Unable to read 4 bytes)
error, and not even keeping a partial), to using rsyncd. I got this
working right earlier today and kicked off a manual full backup, ie.
Ba
Make sure you have set an admin user to be the user name that should
have complete access to BackupPC from the CGI:
$Conf{CgiAdminUsers} = 'panther';
Without this set, anyone you log in as is only a user, and can only see
the machines that the hosts file declares to be associated with that
Craig Barratt wrote:
Jason writes:
This took 40 hours to run, and backed up a lot, but when it got to a 12gb file, it choked.
Here's the XferLog Errors:
Error reading file \video\2005\video2005raw.avi : Call timed out: server did not respond after 20000 milliseconds
Didn't get
I did get it to transfer 10gb of the 12gb file manually using
smbclient. For whatever reason, I guess there was a 20 second gap in
the transfer there and it timed out. I had to shut down smbclient,
then open it again to establish a good connection to the server, and
I'm using 'reget' to get t
Les Stott wrote:
I thought maybe excluding that particular file would help, but exclusions aren't
working well for me. I tried to exclude like this:
$Conf{BackupFilesExclude} =
['Documents and Settings/Administrator/Local Settings/Temporary Internet Files/*'];
And it backed up
I'm using it to download it right now. It's over a slower connection
(475k/s sustained rate) so it will take at least 7 hours to download,
if it runs at max speed the whole time. I'll let it run and see what
happens. Good suggestion.
Thanks,
JH
[EMAIL PROTECTED] wrote:
Can you manually
Hi all. Nobody has responded to my other messages requesting help, so
I'm trying again. I'm using the 2.1.2 version.
I have one Windows machine that is backing up flawlessly (other than
NT_SHARING_VIOLATIONs that are unavoidable). I have another that is
failing when it gets to a very large
Yes. I did:
[EMAIL PROTECTED] ~]$ ssh [EMAIL PROTECTED] echo \$USER
root
JH
Les Mikesell wrote:
On Thu, 2006-11-02 at 15:24, Byron Trimble wrote:
I'm using 'rsync' and I have setup all the ssh keys and tested them.
Did you test them running as the backuppc user on the
I asked about
below?
Thanks,
JH
Jason Hughes wrote:
I'm having trouble getting backups to work with rsync. I have two
hosts using smb that are working (sort of), and two with rsync that are
not. Here's the log file I get (the machine name is 'sol'):
[...]
The
I'm having trouble getting backups to work with rsync. I have two
hosts using smb that are working (sort of), and two with rsync that are
not. Here's the log file I get (the machine name is 'sol'):
Contents of file /var/backuproot/pc/sol/LOG, modified
2006-11-01 12:21:42
2006-11-01 12:21:42
Hi all,
I've just got backuppc set up for the first time on a Centos 4.4 box
using the provided RPMs, with mod_perl. It was a real challenge because
it seems somehow to be using mod_perl2, whereas Centos only has
1.999xxx. Very confusing. At any rate, it's working with Apache
running as use