Hi,
I'm quite sure this will not work, because s3fs does not support hard links
(which are BackupPC's way of deduplication).
Kind regards,
Tom
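For anyone wondering what BackupPC actually requires from the filesystem: pooling stores each unique file once and links every backup to it with a hard link, which is exactly the operation s3fs lacks. A minimal illustration (paths are just for the demo):

```shell
# Two directory entries, one inode: the core of BackupPC v3 pooling.
rm -rf /tmp/hl-demo && mkdir -p /tmp/hl-demo
echo "file contents" > /tmp/hl-demo/pool_file
ln /tmp/hl-demo/pool_file /tmp/hl-demo/backup_copy   # hard link, no extra data stored
stat -c %h /tmp/hl-demo/pool_file                    # link count is now 2
```

On a filesystem without hard-link support, the `ln` call fails and BackupPC cannot deduplicate.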
> On 19.01.2025 at 17:31, Jamie Burchell via BackupPC-users
> wrote:
>
>
> Does anyone have any experience with using s3fs-fuse to mount an S3
> c
ost.pl file that will allow you to easily and
> reliably create (and later remove) VSS shadow copies on any Windows host
> so that all files (including locked ones) can be backed up.
--
Sincerely,
Thomas Trueten http://www.trueten.de
PGP Key Id: 0xD96D6E68 ava
n" networks? Is there any possibility to deal with
the multiple ip addresses a client can have?
tia, Thomas
--
Open source business process management suite built on Java and Eclipse
Turn processes into business applicat
r Apache configuration.
Thomas
On 20.06.2014 at 18:57, Francisco Suarez wrote:
> Thomas,
>
> This is my apache config file http://pastebin.com/krhVYj3h
>
> Not sure what I'm missing here and any help will be super
> appreciated.
>
> Looks like fast cgi is enabled in
believe is a cgi.
>
> Using Ubuntu 14 & Xampp lamp distro.
>
> Could you help?
>
> Thanks Much, Francisco
>
Do you use the Apache web server? Then you probably need ExecCGI
activated. See
https://httpd
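A minimal sketch of what that typically looks like in the Apache configuration (directory path and handler extensions are illustrative; adjust to your install):

```apache
<Directory /usr/share/backuppc/cgi-bin>
    Options +ExecCGI
    AddHandler cgi-script .cgi .pl
    Require all granted
</Directory>
```

On Apache 2.2 the last line would be `Order allow,deny` / `Allow from all` instead.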
Thanks, that was exactly the problem. After increasing the memory on the
server, the backup runs fine again.
Thomas
-Original Message-
From: Bowie Bailey [mailto:bowie_bai...@buc.com]
Sent: Monday, 27 January 2014 17:07
To: backuppc-users@lists.sourceforge.net
Subject: Re: [BackupPC
: about to system /bin/ping -c 1 ...
Can't fork: Cannot allocate memory at /usr/share/backuppc/lib/BackupPC/Lib.pm
Line 1341.
This line is "if ( $host ne " " ) {".
Any ideas?
Thank you in advance, Thomas
-
I followed
http://sourceforge.net/apps/mediawiki/backuppc/index.php?title=Change_archive_directory,
http://perlwannabe.typepad.com/blog/2011/04/backuppc-to-nas-over-nfs.html to
use a nas.
Best, Thomas
-Original Message-
From: SimonUtter [mailto:backuppc-fo...@backupcentral.com]
Sent
Hello Stefan,
I've just noticed that the DNS entry for 82.130.88.130 is also missing in that
subnet. Could you please set the DNS name makian.ethz.ch for this IP address?
Thanks, Thomas
From: Marco [mailto:lista...@gmail.com]
Sent: Friday, 30 August 2013 21:46
To: General list for user
Sorry for this mistake (wrong reply).
From: Löffler Thomas J.
Sent: Monday, 2 September 2013 10:42
To: 'General list for user discussion, questions and support'
Subject: RE: [BackupPC-users] A few questions
Hello Stefan,
I've just noticed that in that subnet also the D
nce it has fewer files than the prior one
(got 0 and 0 files versus 0)
Dump failed: fileListReceive failed
--
Any idea on that kind of output?
Thanks in advance, Thomas
-Original Message-
From: Bo
Hi Sven,
Thanks for your input. I checked, and rsync is installed on the client and
RsyncClientPath points to it. But it's version 2.6.9, so I additionally
installed the latest version, rsync 3.0.9. The result remains the same, though.
Thank you, Thomas
-Original Message-
Fr
Hi Marco,
The following works for me:
$Conf{RsyncShareName} = ['/Users'];, i.e. back up where the user folders
live, and exclude the Admin and Guest folders with
$Conf{BackupFilesExclude} = {'*' => ['Admin/','Guest/']};.
Hope it helps.
From: Marco [mailto:lista...@gmail.com]
Sent: Thursday, 29.
Hi,
Thank you for your suggestion. I logged on once, accepting the remote key
fingerprint. In the meantime, I excluded the user's home folder, and the backup
now runs for the remaining folders. Can a file in a user's folder cause rsync to
get "Killed"?
Thank you in advance, T
nd "Aborting backup up after signal PIPE".
This machine is set up the same as the others with OS X 10.8.4, and the binding
to the BackupPC server has been done with the same script I wrote to exchange
SSH keys. Does anyone have any idea what could be wrong?
Thank you in advance, Thomas
-
with
Xfer PIDs are now 17860,17865
xferPIDs 17860,17865
Killed
I searched for it and read earlier responses to such error messages, but could
not find a solution. Any ideas?
Thank you in advance, Thomas
--
Introducing P
nd" upgrades, I was wondering if
there is a way to determine if my installation has been performed using apt-get
or by hand?
Thomas Nilsson, CTO, Agile Mentor
Responsive Development Technologies AB
Web: http://www.responsive.se
Email: thomas.nils...@responsive.se
Phone: +46 70
Hi,
we are using BackupPC on a quad-core system. Our backup process uses only one
core for pool compression. Is there a way to get Compress::Zlib working
multithreaded?
regards
Thomas Scholz
--
Netzbewegung GmbH | Pforzheimer Straße 132 | 76275 Ettlingen |
Geschäftsführer: Alwin Roppert
g. does that with option "-l").
A somehow "lazy" solution would be to just copy the "pool"-Files (hashes
as file names) by "rsync" and create a "tar" archive of the "pc"
directory. The time consuming process of link creation is then defe
On Thu, Apr 16, 2009 at 3:55 PM, Odhiambo Washington wrote:
> On Thu, Apr 16, 2009 at 4:34 PM, Thomas von Eyben
> wrote:
>>
>> Hi List,
>>
>> Just once more my question
>> "Is anyone on the list actually running BackupPC on OS X?"
>>
>> A
e BackupPC server and
- hopefully - get some good advice.
TIA TvE
On Sat, Apr 11, 2009 at 11:12 PM, Thomas von Eyben
wrote:
> On Sat, Apr 11, 2009 at 9:25 AM, Thomas von Eyben
> wrote:
>> On Sat, Apr 11, 2009 at 2:01 AM, Les Mikesell wrote:
>>> Thomas von Eyben wrote:
>
On Sat, Apr 11, 2009 at 9:25 AM, Thomas von Eyben
wrote:
> On Sat, Apr 11, 2009 at 2:01 AM, Les Mikesell wrote:
>> Thomas von Eyben wrote:
>>> Hi list,
>>>
>>> Does anyone have a nice guide describing how to set up a Mac OS X's
>>> Apache (Se
On Sat, Apr 11, 2009 at 2:01 AM, Les Mikesell wrote:
> Thomas von Eyben wrote:
>> Hi list,
>>
>> Does anyone have a nice guide describing how to set up a Mac OS X's
>> Apache (Server or Client) to host BackupPC?
>>
>> I am now facing the proble
Hi list,
Does anyone have a nice guide describing how to set up a Mac OS X's
Apache (Server or Client) to host BackupPC?
I am now facing the problem:
"Error: Wrong user: my userid is 70, instead of 502(backuppc)" when
communicating with the CGI.
As I understand it I need (for performance) a se
On Thu, Apr 9, 2009 at 5:46 PM, Mirco Piccin wrote:
> Gmail is the best friend for mailing lists (threaded mail + enough space)!
> Regards
I use my Gmail account primarily for that specific "feature" -
mailing-list subscriptions, so far 35 of them!
My Gmail account is then growing into one large
ng for newer file version
etc.) is already a bad disk usage pattern by itself. And NFS didn't
handle it better either ...
Anyway, nice to know that a specialized NAS device could help there.
Thomas
signature.asc
Descr
but I
didn't look much into it. This way, you could "export" your USB disk
from a windows machine and "import" = mount it from any nbd-capable *nix
machine on the same network.
Thomas
--
ype
and can be accessed over the network, since you don't necessarily have
to plug the USB disk into the VM's host machine.
Performance is of course not really great but surprisingly good. I
evaluated (very shortly) against mounting via NFS, and NFS just sucks
with backuppc.
Thomas
than that, the cpool
entry could be missing ... e. g. when the file is in more than one
backup but the cpool entry got lost. Is it possible that this is an
issue here?
Thank you!
Thomas
But how do I do that? BackupPC_link relies on the NewFiles
(similar name) file, so I cannot simply run it manually, can I?
Any other ideas?
Thank you,
Thomas
-
> > is it possible to initiate a backup for a specific
> > host from command line?
>
>
> http://backuppc.wiki.sourceforge.net/Schedule_Backups_With_Cron
>
> But is there some reason you think the normal scheduler won't do it?
> If you have some other problem it would be better to fix it.
the timef
Hello List,
is it possible to initiate a backup for a specific host from command line?
I have one very special host where I have to make sure that backups
are definitely running daily.
I do not need special options, etc. It would be enough if there were the
same functionality as clicking "start
Another idea how I could solve this? It's all about this space in
the --filter argument ...
Thanks,
Thomas
no "\" in my config.pl or .pl in the
--filter statement! It is as I described. (The output comes from manual
command line backup.)
But arguments are handled differently depending on whether I use rsync
locally, rsync via ssh or rsyncd. So carefully applying the "+" does the
trick after al
file-system
--filter=:-\\\ /nobackup.txt . /
Adding backslashes or quotes just worsens the situation and I get even
more escaping backslashes ... any ideas?
Thank you,
Thomas
epending on machine, memory, network and so on. So there is no real answer
to that; we have to investigate it further. If we find a reason for the
problem, I will post it.
Again, answers from all of you were very helpful!
Thomas Birnthaler
--
OSTC Open Source Training and Consulting GmbH / HRB Nuernberg
;
> What transfer method are you using?
rsync over GBit networks between Linux machines and also between MacOS
machines. In both cases that effect happens.
Thomas Birnthaler
--
OSTC Open Source Training and Consulting GmbH / HRB Nuernberg 20032
tel +49 911-3474544 / fax +49 911-1806277 / htt
incremental backups compared to full backups?
In both cases only changed/new files use disk space, thanks to the hard-link
concept of BackupPC.
We have also noticed that in some cases incremental backups need much more
time than full backups (a factor of 3-5). This seems odd to us.
Thanks
Thomas
d the
chance I would switch back, since under the rest of the load
conditions of backuppc, ext3 clearly performs better for me.
Unfortunately, it takes 2 or 3 days to do this switch, so it might not
happen for a while.
Thanks again!
-Thomas
On Sat, Dec 20, 2008 at 2:47 PM, dan wrote:
> true
is is getting more reasonable.
-Thomas
On Thu, Dec 18, 2008 at 10:15 PM, Thomas Smith wrote:
> Hi everyone, thanks for the help!
>
> Today around noon I remounted the backup disk with noatime, and then
> it only took another three hours, rather than another 10, which is
> exciting. I
ageable time, I'll probably just split
the nightly clean across several days, as you say.
Thank you again!
-Thomas
On Thu, Dec 18, 2008 at 8:33 PM, Holger Parplies wrote:
> Hi,
>
> Adam Goryachev wrote on 2008-12-19 10:56:44 +1100 [Re: [BackupPC-users]
> backuppc 3.0.0: another x
Hi,
No, it continues to take 22 hours or so each day.
-Thomas
On Thu, Dec 18, 2008 at 1:19 PM, Paul Mantz wrote:
> Hello Thomas,
>
> Did the BackupPC_nightly jobs take 22 hours on the 17th as well? If
> they didn't, I would suspect that since you restored the TopDir from a
&g
stall the patch for
the Dirent problem? Or is it maybe not an XFS bug, and I did
something wrong when I restored the BackupPC filesystem? Something
else?
Thank you for your help,
-Thomas
--
http://resc.smugmug.com/
--
on both servers: identical file system
and size etc.
- if not on a cluster file system, only one side gets to read/modify the
data
Thomas
On Thu, Nov 27, 2008 at 8:22 PM, Ski Kacoroski <[EMAIL PROTECTED]> wrote:
> As far as I am concerned, BackupPC scales very well. I run 200 - 250
> clients onto 3 year old single proc, 1GB ram, and (4) 250GB sata disks
> in a raid 5. The limiting factor is the I/O throughput.
How do you configure
uld export an image file via NBD to your
Linux machine [1]. But you won't be able to access the data from your
Windows machine.
Thomas
[1] http://www.vanheusden.com/Loose/nbdsrvr/
at /usr/lib/perl5/site_perl/5.8.8/IO/Compress/Base/Common.pm line 567.
2008-09-19 02:23:29 Backup failed on main (Child exited prematurely)
Thanks,
Thomas
-
level 0 - but that's not really what I want ...
The same BackupPC installation is also backing up two Windows machines
via smb, one with about 1 GB and the other with about 18 GB of data, and
it does this just fine.
Any help about these apparently random aborts is very much
appreciated ... thank you!
Thomas
-
Hi Les,
Thanks for making this clearer to me. We'll go for the second BackupPC
install, and just overwrite its pool if we need to restore from tape (making
sure we tar the whole pool to tape every now and again).
Cheer
/Thomas
L
Dear community,
We're backing up our pool to tape every now and again, but we would like to
extract this data into the pool at a later stage.. in case we need to recover.
Here's what I'd like to achieve:
1. Back up pool to tape using tar (works)
2. Extract backup from tape and import it into b
Could not reliably determine the server's fully qualified domain
name, using 127.0.0.1 for ServerName
Thomas Mederer
-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Nils
Breunese (Lemonbit)
Sent: Friday, 15 August 2008 11:12
A
rsync perl libfile-rsyncp-perl par2 bzip2
I use Ubuntu 8.04
I hope someone can tell me what it is, but unfortunately I'm not really
proficient in Linux.
Greetings,
Thomas
-
ter the backup disk space has been filled up?
Any help is much appreciated.
Thanks,
Thomas
Kind regards,
Kassenärztliche Vereinigung Thüringen
Zum Hospitalgraben 8
D-99425 Weimar
Tel: +49 (3643) 559 0
Fax: +49 (
Hi Nicholas,
sorry .. that one should go to you :)
> I've updated the patch with a new DS to hold and graph
> the value of the pool size prior to pooling and compression.
> Attached is the patch and a sample image. You will need
> to delete the existing RRD file ($LogDir/pool.rrd)
> as the scrip
Hello Ludovic,
> I've updated the patch with a new DS to hold and graph
> the value of the pool size prior to pooling and compression.
> Attached is the patch and a sample image. You will need
> to delete the existing RRD file ($LogDir/pool.rrd)
> as the script creates a new one with an additiona
Nils wrote:
| I like to keep things separated. BackupPC does backups,
| my monitoring software does the monitoring.
| Also keeps upgrades simple.
| But yeah, I can see others might like this.
I agree, but as a compromise - it would be a nice option
to ./configure --with-graphs :)
So, every
Hello,
| Could somebody give me (total newbie) a short howto for
| installing the Nagios-Plugin "check_backuppc"
| (http://n-backuppc.sourceforge.net/)
|
| Running System: Debian-Etch; Apache2; Nagios 1.4.
not tested (yet), but just take a look at README & INSTALL provided with
the pack
erent needs and preferences. I have several other
partitions to monitor too, so I will have to check my monitoring tool
anyway. Let BackupPC do what BackupPC does best, and leave monitoring
to a monitoring app.
--
Thomas Nygreen
---
Hello,
Is there any way to execute trashClean manually? I deleted some directories
and wanted to run trashClean to regain disk space.
Tom
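For context: the BackupPC daemon's BackupPC_trashClean process continuously empties $TopDir/trash, so deletion is a two-step pattern - an instant rename into the trash directory, then slow removal in the background. A minimal sketch of that pattern in plain shell (paths are illustrative, this is not BackupPC's actual code):

```shell
TOPDIR=/tmp/trash-demo
rm -rf "$TOPDIR" && mkdir -p "$TOPDIR/pc/oldhost" "$TOPDIR/trash"
echo data > "$TOPDIR/pc/oldhost/file"

# Step 1: instant rename into the trash directory (cheap, frees no space yet)
mv "$TOPDIR/pc/oldhost" "$TOPDIR/trash/"

# Step 2: the actual slow removal (normally done by BackupPC_trashClean)
rm -rf "$TOPDIR/trash/oldhost"
```

So directories you move into $TopDir/trash yourself should be picked up by the running cleaner without any manual invocation.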
-
This SF.net email is sponsored by: Splunk Inc.
Still grepping through log files to
if ( !close($t->{pipeTar}) && $? != 256 ) {
That works, thanks a lot! :)
[...]
> Regards,
> Holger
regards,
Thomas
Craig Barratt schrieb:
> Thomas writes:
>
>> I am using BackupPC 2.1.2 on a Debian (Etch) server. BackupPC is
>> configured to use tar via ssh to backup a /home-dir (around 80 GB).
>>
>> During or after (I don't know) executing a job I got the following message
>
0 (76GiB, 7,8MiB/s)
link 192.168.0.5
--->
How can I find out the reason for the "dump failed" problem?
regards,
Thomas Guenther
filled
unless you ask for it in the config settings.
http://backuppc.sourceforge.net/faq/BackupPC.html#item__conf_incrfill_
> I've configured backuppc with tar and compression 9.
There's not much point in setting compression level 9.
http://backu
2007/7/1, Matthias Meyer <[EMAIL PROTECTED]>:
> 0full160701000.10.924950.000.0
> 1incr20562.90.4700.000.0
> 2incr28165.10.6640.00.0
> 3full917241914.90.534370.00.0
> 4
PC will remove last
week's backup, and every other second week it will keep it.
If what you want is to keep the last three backups, set
$Conf{FullKeepCnt} = 3; You can also set it to [2,1,1] to be sure you
have the last two full backups.
-thomas nygreen
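In config.pl those settings would look roughly like this (a sketch of the documented options; pick one):

```perl
# Keep the three most recent full backups:
$Conf{FullKeepCnt} = 3;

# Or, with exponential aging: two fulls at FullPeriod spacing,
# plus one at roughly twice that age and one at roughly four times:
# $Conf{FullKeepCnt} = [2, 1, 1];
```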
-
y in the pool, but backed
up from some other host or another path on the same host, it will not be
transferred, only mapped/remapped?
thomas nygreen
David Rees wrote:
> On 6/13/07, Francis Lessard <[EMAIL PROTECTED]> wrote:
>> I currently use BackupPC 3.0.0 to backup 2 www servers. As bandwidth cost a
>> lot, I would like to use BackupPC similar to a commercial online backup
>> service we use. This service does a full backup only once, then do
esn't.
That means '/path/*.mp3' will match '/path/music.mp3' but not
'/path/folder/music.mp3'. '/path/**.mp3' will match both
(and BackupPC just passes the arguments to rsync so there's no difference)
thomas nygreen
---
Klaas Vantournhout wrote:
> Hi,
>
> I was wondering if you can dequeue a backup for an unknown time?
>
> Thanks
> Klaas
You can set $Conf{FullPeriod} = -1; in the config file for the host you
don't want to b
tree, I
traced the problem to /sys/bus/pci_express. I don't see why. So I
excluded '/sys/bus/pci_express' and now it works just fine.
Thanks for leading me to the right track!
Thomas Nygreen
;/pub','/tmp','/proc','/media','/sys/bus/pci/drivers','/var','/root','/usr','/bin','/lib'];
I hope someone can help.
Regards,
Thomas Nygreen
--
can be backed up
- and after the 'BackupPC_nightly' thread has run - the PC should
shut down automatically
--
best regards
thomas
Thanks Victor,
I will set up custom config.pl files for notebooks/desktops to restrict the
backup periods.
Tom Maguire
[EMAIL PROTECTED]
- Original Message -
From: "Víctor A. Rodríguez" <[EMAIL PROTECTED]>
To: "BackupPC Userlist"
Sent: Wednesday, June 14, 2006 8:30 AM
Subject: Re: [Back
I want most systems on my network to back up
overnight but I have several notebooks that are only attached to the network
during business hours. I reviewed the blackout settings in config.pl and wanted
to know if I was interpreting them correctly.
It seems that if I use the following setti
Thanks Les. I will change the setup and see if it resolves the problem.
Tom Maguire
[EMAIL PROTECTED]
- Original Message -
From: "Les Mikesell" <[EMAIL PROTECTED]>
To: "Thomas Maguire" <[EMAIL PROTECTED]>
Cc: "BackupPC Userlist"
Sent: We
I recently installed BackupPC 2.1.2 on Linux
Mandrake 10.0. I edited the configure.pl and hosts files for one PC to test
the operation of the software.
The LOG and backuppc file system show that the PC
is being backed up.
My problem is with the BackupPC_Admin script.
I changed the http
archives and the Docs and I didn't
come up with a definitive answer.
--
MOORE,JUSTIN THOMAS