Hi Arno,

Arno Lehmann wrote:
>> Investigating, it looks like the issue is hitting the integer limit on 
>> the fileid:
>>
>> bacula=> select max(fileid) from file;
>>      max
>> ------------
>>   2147483647
>> (1 row)
> 
> Looks like that.
> 
>> Have other people encountered this? We've been happy little bacula users 
>> for about a year now, and we are backing up a bit of data each night 
>> (~300GB), but I can't believe we're that big a site in the scheme of things?
> 
> Which catalog database do you run?

Postgresql.

>> More immediately: is altering fileid to a bigint to workaround this a 
>> sane thing to do? Do I have any other options?
> 
> I don't know, I don't have any system with such a high number of 
> File-table entries at hand :-)
> 
> But, as far as I know, Bacula itself can handle 64-bit integers, so 
> the change should be ok. Just make sure you have a valid catalog dump 
> before you try it :-)

Tried it, since backups weren't working otherwise. :-)

So far everything looks fine, and a couple of nightlies have now completed 
successfully. So, for the archives: this looks like a reasonable fix if 
you do hit this limit.
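For anyone finding this later, the change I made was along these lines 
(a sketch only -- it assumes the stock Bacula PostgreSQL schema, where 
file.fileid is a plain INTEGER; as Arno said, take a catalog dump first):

```sql
-- Widen the primary key column from 32-bit to 64-bit.
-- Do this with the Director stopped, and after a pg_dump of the catalog.
ALTER TABLE file ALTER COLUMN fileid TYPE bigint;
```

Note that the associated sequence shouldn't need touching, since 
PostgreSQL sequences are 64-bit internally anyway; it was only the 
column type that overflowed.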

I guess the reason the schema doesn't use bigints by default is just the 
extra storage cost, since most sites don't seem to need it?

Thanks for your help.

Cheers,
Gavin


_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
