Mike,
Thanks for the follow-up.
Looking at the GNU tar source code, the default tar format can be set to
something other than gnu. For example, if you pass --pax-option, the
output format is switched to posix. You can run "tar --show-defaults"
to see the compile-time defaults.
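For example, on one Debian build it prints something like this (the exact
values are compile-time choices, so they vary between distributions):

    $ tar --show-defaults
    --format=gnu -f- -b20 --quoting-style=escape --rmt-command=/usr/sbin/rmt --rsh-command=/usr/bin/rsh

and you can force a format explicitly, e.g. "tar --format=posix -cf archive.tar dir".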
Craig
Craig,
It's been some time since my last report. For the record, I want to
let you know that the issue has recently been solved.
The error 36096 went away and came back once in a while. When it
reappeared some weeks ago, I did some investigation and testing.
Fi […]
> My tar version is 1.22 (2009); as far as I could test, it supports
> file and link names longer than 100 characters. It's part of
> Ubuntu Server 11.04.
GNU tar has half a dozen strategies for dealing with filenames over 100
characters. By default, the longfile parameter is set […]
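A quick way to see how the chosen format affects long names (assuming GNU
tar; the names below are made up):

    $ name=$(printf 'averylongdirectory%s/' 1 2 3 4 5 6)file    # ~124 characters
    $ mkdir -p "$(dirname "$name")" && touch "$name"
    $ tar --format=gnu   -cf gnu.tar   "$name"    # GNU @LongLink extension, works
    $ tar --format=posix -cf posix.tar "$name"    # pax extended header, works
    $ tar --format=v7    -cf v7.tar    "$name"    # old format: rejects names over 99 characters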
On 06/28/2013 07:43 AM, Craig Barratt wrote:
> Mike,
> Is there a way to increase the debug reporting level to get more clues?
> Les' suggestions are a good way to see if it is a problem on the source
> file system. […]
There were new files; I created one to force the system to back it up.
It did not crash on that particular file, but it gave the error message
(with the cut-off file name). I also tried making subdirectories on the
backup disk: no problem.
The filesystem is ext4.
On 06/27/2013 10:01 PM, Les Mikesell wrote: […]
Mike,
Is there a way to increase the debug reporting level to get more clues?
Les' suggestions are a good way to see if it is a problem on the source
file system.
For BackupPC, first try to limit the backup to the smallest case that shows
the problem (by excluding files or creating a simple test case).
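For instance, something along these lines on the client (all names
hypothetical):

    $ mkdir -p "/tmp/bpctest/$(printf 'testdirectoryname%s/' 1 2 3 4 5 6)"
    $ touch "/tmp/bpctest/$(printf 'testdirectoryname%s/' 1 2 3 4 5 6)testfile"

That gives a path well over 100 characters; point a backup at /tmp/bpctest
alone (or exclude everything else) and see whether the error reproduces.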
On Thu, Jun 27, 2013 at 2:48 PM, Mike Bosschaert wrote:
>
> For some reason I CAN make incremental backups (which do report errors on
> the long directory names). But the process does not crash.
That probably just means there aren't any new files to write there.
> One detail which may be important […]
Thanks Craig and Stephen for taking the time to dive into this, and
apologies for my late reply (I have been out of town and could not access
the backup server).
My tar version is 1.22 (2009); as far as I could test, it supports
file and link names longer than 100 characters. It's part of
Ubuntu Server 11.04.
>
> I find it interesting that the client file path is being cut off ~= 100
> characters.
That's a very good observation and an important clue. The name field in a
tar file header is limited to 100 characters, and there is a special extension
(basically another dummy file header with a payload containing the long file name).
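You can see that extension directly in a GNU-format archive: for a name over
100 characters the archive starts with a dummy member called ././@LongLink
whose data block carries the full name. A quick check (file name made up,
assuming GNU format output):

    $ name=$(printf 'x%.0s' $(seq 120))   # a 120-character file name
    $ touch "$name" && tar -cf long.tar "$name"
    $ head -c 13 long.tar
    ././@LongLink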
I think you've eliminated the remote and local filesystems and said you
have no problem backing up via smb or backing up local files.
I find it interesting that the client file path is being cut off ~= 100
characters.
echo /fhome/fmike/fwerk/fsystem/fTransformer/fandroid-sdk-linux/fadd-ons/faddon […]
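One way to measure how long the reported name actually is (the fragment
above is itself cut off, so the real path keeps going):

    $ echo -n '/fhome/fmike/fwerk/fsystem/fTransformer/fandroid-sdk-linux/fadd-ons/faddon' | wc -c
    74

so the full name would pass 100 characters only a component or two further down.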
> There are a few clues here that may help indicate what's wrong; I assume
> you've tried it from the command line? What were the results?
I haven't really tried it from the command line yet. The error appears
consistently after approximately 75 minutes of backing up.
>> Hi, for some time now I have been unable to make full backups […]
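One way to try it by hand is to run the same command BackupPC uses (taken
from the log below, limited to one share) and throw the archive away, so
only tar's messages show up:

    $ ssh -t -l backuppc xxx.xxx.xxx.xxx sudo LANGUAGE=EN /bin/tar \
        --ignore-failed-read -c -v -f - -C / --totals \
        --exclude=.Cache/ --exclude=.cache/ --exclude=Cache/ --exclude=cache/ \
        --exclude=lost+found ./home/www > /dev/null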
On Mon, Jun 24, 2013 at 8:58 AM, Michael Stowe wrote:
>
>>
>> Running: ssh -t -l backuppc xxx.xxx.xxx.xxx sudo LANGUAGE=EN /bin/tar
>> --ignore-failed-read -c -v -f - -C / --totals --exclude=.Cache/
>> --exclude=.cache/ --exclude=Cache/ --exclude=cache/
>> --exclude=lost+found ./home/www ./home/my […]
There are a few clues here that may help indicate what's wrong; I assume
you've tried it from the command line? What were the results?
> Hi, for some time now I have been unable to make full backups of one of my PCs
> using tar over ssh. I could not find any reference on the mailing lists to
> this type of error. […]
Hi, for some time now I have been unable to make full backups of one of my PCs
using tar over ssh. I could not find any reference on the mailing lists to
this type of error.
I'm running BackupPC 3.1.0 on a Debian system. The error appeared after a
long period of stable operation. Backing up another PC through […]
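For reference, the exact ssh+tar command BackupPC runs is assembled from
$Conf{TarClientCmd} (plus TarFullArgs/TarIncrArgs) in config.pl, so that's
the place to check what is actually executed; on a Debian install, for
example:

    $ grep -A2 'TarClientCmd' /etc/backuppc/config.pl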