ld always install it manually if you can't get macports
> to work.
>
>> Wow, guess this is uncharted territory.
>>
>> On Fri, Nov 13, 2009 at 7:24 AM, Nick Smith
>> wrote:
>>> On Fri, Apr 17, 2009 at 4:39 AM, Thomas von Eyben
>>> wrote:
>>
Wow, guess this is uncharted territory.
On Fri, Nov 13, 2009 at 7:24 AM, Nick Smith wrote:
> On Fri, Apr 17, 2009 at 4:39 AM, Thomas von Eyben
> wrote:
>>>>
>>>> Hi List,
>>>>
>>>> Just once more my question
>>>> "Is anyone
On Fri, Apr 17, 2009 at 4:39 AM, Thomas von Eyben
wrote:
>>>
>>> Hi List,
>>>
>>> Just once more my question
>>> "Is anyone on the list actually running BackupPC on OS X?"
>>>
>>> As described below I am having some difficulties in getting the perl
>>> environment satisfactory for using BackupPC a
Why do I keep getting these messages sent to me directly as a reply
from the list?
On Wed, Jun 24, 2009 at 2:55 PM, wrote:
> The original message was received at Wed, 24 Jun 2009 18:55:00 GMT
> from j...@localhost
>
> - The following addresses had permanent fatal errors -
> bpc.20.hypa.
> On Wed, Jun 24, 2009 at 10:32 AM, Nick Smith wrote:
>>
>> OK, I admit it's mainly for me. I'd like to check my email in the morning
an email that their system hasn't
been backed up in X amount of days.
Some people want to know; others don't care as long as it's working.
On Wed, Jun 24, 2009 at 10:52 AM, Les Mikesell wrote:
> Nick Smith wrote:
>> I would like to know if backuppc has the ability to not only send an
>
>> >
>
> It should not change anything, since the BackupPC dæmon is launched by root.
> However, most installations intentionally do not create a password for the
> backuppc user for security reasons. From root, running su -s /bin/bash
> backuppc should switch to that user without having to chang
> but switched to sata in a trayless hot-swap carrier
> when I went to larger drives.
Do you have any links to the hardware you used? I too am looking for
such a solution.
Thanks.
I would like to know if backuppc has the ability to not only send an
email if a backup hasn't completed in a set amount of time, but also
when a backup is completed?
If not, is it possible to submit a feature request?
I would really like to be able to email clients that their computers
were success
Does anyone know if I can change the password of the backuppc user in Linux
and not have any adverse effects on the backuppc system?
On Tue, Jun 23, 2009 at 3:17 PM, Nick Smith wrote:
> On Tue, Jun 23, 2009 at 2:46 PM, Matthias Meyer wrote:
>> Nick Smith wrote:
>>>
On Tue, Jun 23, 2009 at 2:46 PM, Matthias Meyer wrote:
> Nick Smith wrote:
>>
>> OK, new problem: I can't get the backuppc user to work with shared keys.
>> I've added the /var/lib/backuppc/.ssh/id_rsa.pub contents to the other
>> Linux box's
> in /root/.ssh I as
> The root setup on the backuppc server side is irrelevant. You may have
> set it up in addition to the correct backuppc setup. You don't need it.
>
>> The command that backuppc is running in the config is:
>> $sshPath -q -x -p 22200 -l root $host $rsyncPath $argList+
>> So I thought I had to hav
>> Communication working both directions.
>> r...@backup-server:/root/.ssh# ssh -p 22200 -l root webserver whoami
>> root
>
> You are supposed to be the backuppc user on the backuppc server when you
> do this test. You are executing it as root.
>
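The corrected check can be sketched as follows. The port (22200) and the host name "webserver" come from the poster's own setup; the sketch only builds and prints the command, since the point is *who* runs it, not the ssh call itself:

```shell
# Build the test that the backuppc user (not root) should run.
# Port and host name are taken from the poster's config in this thread.
port=22200
client=webserver
cmd="ssh -q -x -p $port -l root $client whoami"
# From a root shell on the BackupPC server you would then run:
echo "su -s /bin/bash backuppc -c '$cmd'"
```

If the keys are set up for the backuppc user, the inner ssh prints "root" without a password prompt; a prompt means the key for the backuppc user is not in the client's /root/.ssh/authorized_keys.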
What's strange is that's how I set up the first Linux
I'm in the process of adding some Linux clients to the backuppc server.
I've successfully set up one server on the backup server.
The second one is giving me problems.
I set it up the same way; I can ssh from one box to the other as root
with no password.
The shared keys seem to be working correctl
telling me it's backed up.
Thanks for the posts; any other input would be much appreciated.
On Sat, Feb 7, 2009 at 10:53 AM, David Morton wrote:
>
> Nick Smith wrote:
>> OK, I use volume shadow so I don't need to dump.
>> What d
e before running BackupPC.
>
> Quoting "Nick Smith" :
>
>> On Fri, Feb 6, 2009 at 9:09 PM, Nick Smith wrote:
>>> I have a couple clients backup nightly over the internet.
>>> I seem to be doubting if the backups are actually being completed
>>>
I have a couple clients that back up nightly over the internet.
I seem to be doubting if the backups are actually being completed successfully.
Backuppc tells me they are, but the reason I'm questioning is because
it is supposed
to back up 10 gigs a night (their SQL DB changes daily) and the backup
completes in
On Fri, Feb 6, 2009 at 9:09 PM, Nick Smith wrote:
> I have a couple clients that back up nightly over the internet.
> I seem to be doubting if the backups are actually being completed
> successfully.
> Backuppc tells me they are, but the reason I'm questioning is because
> it is supp
7:53 PM, Rob Owens wrote:
> In bash, you can make your script accept arguments. See this link:
> http://www.ibm.com/developerworks/library/l-bash2.html
>
> I'm not sure if this works or not for the backuppc user, because his shell is
> sh. Maybe someone else can advise.
picked
up for some reason.
Any ideas?
On Mon, Feb 2, 2009 at 7:16 PM, Rob Owens wrote:
> On Mon, Feb 02, 2009 at 06:47:35PM -0500, Nick Smith wrote:
>> Is it possible to use more than one pre/post command per host?
>> Would they need to be separated by a semi-colon or comma or simila
Is it possible to use more than one pre/post command per host?
Would they need to be separated by a semi-colon or comma or similar?
Thanks for any help; I couldn't find the info in the docs or from Google.
Wow, that looks pretty impressive! I would like to try something like
that out at my site.
I had a couple questions.
Where are the variables defined? aka $user, $host, $3, etc.?
Are they internal backuppc variables already defined, or do I need to
do that somewhere?
Thanks for the help, and thanks for
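If memory serves, $host and $user in the *UserCmd strings are substituted by BackupPC itself (the available variables are documented in the stock config.pl), while a $3 would come from the called script's own positional parameters, not from BackupPC. A hedged sketch (the script path is made up for illustration):

```perl
# $host and $user are expanded by BackupPC before the command runs;
# /usr/local/bin/pre_backup.sh is a hypothetical script of your own,
# which would see them as its $1 and $2.
$Conf{DumpPreUserCmd} = '/usr/local/bin/pre_backup.sh $host $user';
```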
On Thu, Jan 22, 2009 at 12:53 PM, Nick Smith wrote:
> On Thu, Jan 22, 2009 at 11:52 AM, Les Mikesell wrote:
>> Nick Smith wrote:
>>> >
>>> I'm using volume shadow to back up the databases, so they are not live.
>>> (or in use)
>>
>> Is there s
On Thu, Jan 22, 2009 at 11:52 AM, Les Mikesell wrote:
> Nick Smith wrote:
>> >
>> I'm using volume shadow to back up the databases, so they are not live.
>> (or in use)
>
> Is there some reason to think the database is in a consistent state when
> the volume sha
I had that problem with a couple client servers that I back up. I
added an exclude in the
web interface that skips them.
For the key I added *,
and for the BackupFilesExclude I used /Documents and Settings/*/Local
Settings/Temporary Internet Files.
I also added another one for /Documents and Setting
On Wed, Jan 14, 2009 at 9:09 AM, Jeffrey J. Kosowsky
wrote:
> Kevin Kimani wrote at about 11:21:44 +0300 on Wednesday, January 14, 2009:
> > Hi all,
> >
> > Have been trying to use the script posted for copying open files in windows
> > but have not been able to fully make it run. Could someon
On Thu, Jan 22, 2009 at 9:12 AM, Les Mikesell wrote:
> Nick Smith wrote:
>>
>> in the host summary under that backup client, the "new files" says
>> size 4427.8, comp/mb 777.0, comp 82.5%.
>> That's a heck of a compression ratio. I know that SQL DBs can comp
I recently changed backup servers, so I'm starting from scratch on the
pool with a new machine.
I've configured it just like the old one, and I have the old one around
to compare against.
The problem I am having is that the backups on the new machine are
a LOT smaller than on the first.
The client has so
Is there a way to set the order in which backuppc backs up machines?
I need to make sure that 2 certain machines never start backups at the
same time.
Is this possible?
Thanks for the help.
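One blunt way to guarantee two hosts never overlap, assuming serializing all backups is acceptable for your pool size, is BackupPC's global concurrency limit:

```perl
# Real BackupPC setting: at most one backup runs at a time, which
# trivially keeps any two hosts from starting together.
$Conf{MaxBackups} = 1;
```

If full serialization is too slow, per-host $Conf{BlackoutPeriods} can instead keep just those two machines in non-overlapping windows, at the cost of more scheduling bookkeeping.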
>
> Other people have pretty much answered your questions already, but I'll
> pass on my experience with backuppc as well as an extra data point for you.
>
> I am using the older backuppc 2.1.2pl1 from Debian stable in a few places.
>
> Originally I was going to do a single full and forever increme
the pool, so it will only
> fail when this file has never been backed up by this particular
> instance on BackupPC.
>
> On 1/19/09, Nick Smith wrote:
>> I use backuppc to backup several windows servers across the internet.
>> Some of which are rather large and on slow int
I use backuppc to backup several windows servers across the internet.
Some of which are rather large and on slow internet connections.
With frequent disconnects or other random errors it has taken almost a
month to make a full backup of one client, but it finally happened
(40gigs of data, exchange
On Tue, Dec 9, 2008 at 11:01 AM, Jeffrey J. Kosowsky
<[EMAIL PROTECTED]> wrote:
> Jeffrey J. Kosowsky wrote at about 15:50:45 -0500 on Sunday, December 7, 2008:
> > Some of my WinXP backups occasionally fail with the generic LOG error
> > message:
> > 12:00:30 Started incr backup on
I have some clients that take 3 days or better to do a full backup. I
would like to do 1 full backup a month and 1 incr backup every day,
and keep 1 week worth of incrs and only 1 full backup. What happens
when there is still a full backup going on and I schedule an incr
every day? Will it wait u
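The schedule described above maps fairly directly onto BackupPC's period/keep settings. A sketch (the slightly-under-integer periods follow the convention from the stock config.pl, which keeps backups from drifting later each cycle):

```perl
$Conf{FullPeriod}  = 29.5;  # roughly one full per month
$Conf{IncrPeriod}  = 0.97;  # one incremental per day
$Conf{FullKeepCnt} = 1;     # keep a single full
$Conf{IncrKeepCnt} = 7;     # keep a week of incrementals
```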
Is there a way to clear out all backups on the system? I've been
testing with backuppc and want to start from scratch again. I want
to keep the hosts and scripts I've set up, but I want to clear out all
backed-up files and start over. If I remove the "pc" and
/var/lib/backuppc/cpool directories, wil
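A sketch of the clean-out, simulated here on a scratch directory. On a real Debian-style install the tree would be /var/lib/backuppc, and you would stop the BackupPC daemon first so nothing is mid-write; whether this is the officially supported reset procedure is an assumption, so check the docs for your version:

```shell
# Simulate the layout, then wipe the backed-up data while leaving
# everything else (hosts file, config, .ssh) untouched.
topdir=$(mktemp -d)
mkdir -p "$topdir/pc/hostA" "$topdir/cpool" "$topdir/pool"
touch "$topdir/pc/hostA/backups" "$topdir/cpool/0" "$topdir/hosts"
rm -rf "$topdir"/pc/* "$topdir"/cpool/* "$topdir"/pool/*
ls "$topdir"   # hosts (and the now-empty pc/cpool/pool dirs) remain
```

Emptying pc/, pool/, and cpool/ together matters: pc/ holds the per-host trees and pool/cpool hold the deduplicated file store, so clearing only one side leaves orphaned data behind.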
> Within the Key you can add the directories relativ to the RsyncShareName in
> Linux syntax.
> e.g.:
> $Conf{RsyncShareName} = [
> 'D',
> 'C'
> ];
> $Conf{BackupFilesExclude} = {
> 'C' => [
>'/WINDOWS/Downloaded Program Files',
>'/WINDOWS/Offline Web Pages',
>'/WINDOWS/Temp',
>'
>
> I declare the exclude list within the GUI of backuppc.
>
> Declare the "RsyncShareName" in the same manner as they are declared in your
> rsyncd.conf in your windows client.
> Define BackupFilesExclude:
> NewKey = "*" if it should be applicable to all RsyncShareName or the
> RsyncShareName to whic
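A hedged sketch of the layout just described (the share names must match the [section] names in the client's rsyncd.conf; the exclude paths are examples taken from elsewhere in this thread):

```perl
$Conf{RsyncShareName} = ['C', 'D'];
$Conf{BackupFilesExclude} = {
    # A '*' key would apply to every share; a share name scopes the list.
    'C' => [
        '/WINDOWS/Temp',
        '/Documents and Settings/*/Local Settings/Temporary Internet Files',
    ],
};
```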
I currently have several Windows server backup clients where I use
volume shadow to back up the data. I use a pre script that launches the
shadow and maps it to drive B on the Windows box, then backuppc
backs up B over rsync, then a post script kills the volume shadow.
What I would like to do is cr
On Thu, Dec 4, 2008 at 10:55 PM, Adam Goryachev
<[EMAIL PROTECTED]> wrote:
>
>>> Rsync added a TCP keep-alive option in protocol version 29
>>> (if I recall correctly) and is not currently supported in
>>> File::RsyncP that BackupPC uses.
>>
>> Is t
On Thu, Nov 20, 2008 at 8:28 PM, Craig Barratt
<[EMAIL PROTECTED]> wrote:
> James writes:
>
>> The problem we are seeing is that Backups are randomly failing.
>> The log file on BackupPC showing something like this:
>
> This is most likely a TCP timeout or other network problem.
>
> Rsync added a T
2008/11/18 James Sefton <[EMAIL PROTECTED]>:
> Hi,
>
>
>
> Please excuse me if I am using this wrong, in all my years in IT, it seems
> this is the first time I have used a mailing list for support. (I'm usually
> pretty good at the whole RTFM thing)
>
>
Did you ever get this resolved? I'm having
hu, Oct 9, 2008 at 11:06 AM, Nick Smith <[EMAIL PROTECTED]> wrote:
>> I am using the volume shadow copy to backup large (12gig+) sql db's.
>> After the first full backup, and things are changed/added to the DB,
>> is it going to pull down the entire DB again or will it jus
I am using the volume shadow copy to back up large (12 GB+) SQL DBs.
After the first full backup, when things are changed/added to the DB,
is it going to pull down the entire DB again, or will it just download
the changes
(if that's possible)?
Thanks for the input.
On Tue, Oct 7, 2008 at 12:54 PM, Nick Smith <[EMAIL PROTECTED]> wrote:
> One thing I use that I recently figured out is the lsof command in
> Linux; it lists all the open files that the system has open.
> So I run a command like lsof | grep /storage/backuppc and it will list
I'm running into the same thing on Ubuntu. From what I've read it could
be a timeout issue; I've set my timeout extremely high to see if I can
get it to work, trying to back up a 12 GB DB over the internet when I
get this error. (Actually trying to get the entire machine, which
includes the DB.) So a t
I've been running backuppc for a couple weeks now with mixed results,
mainly network speed issues, which really isn't backuppc's fault, but
I've come up with some questions along the way.
What happens when a backup hasn't finished by the blackout period of
the next day?
How come when I start a full ba
One thing I use that I recently figured out is the lsof command in
Linux; it lists all the open files that the system has open.
So I run a command like lsof | grep /storage/backuppc and it will list
all the files currently open and filter them by the path
that my backups go to, because from what iv
I've been playing with different options for using backuppc to back up
Windows servers remotely over the internet. I've found a very good
how-to on a blog that seems like a very nice solution, but I'm having
problems at every step of the way, and I'm hoping you guys can answer a
couple questions for me.