Must check reply-to next time...

---------- Forwarded message ----------
Hi Scott,

> I think you should probably be looking not at TB/day but at the number of
> files (records) per unit time. Robinhood should report stats periodically in
> the log, and this can be useful to see how your scan is going in more detail.

Indeed - I see the scan averaging roughly 200-300 inserts/s (it varies
depending on the activity of the changelog parser on the other
machine).
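
For what it's worth, something as crude as the sketch below is enough to
watch that rate from the database side: it just polls the approximate row
count of the ENTRIES table and prints the delta. The database name,
credentials and polling interval are placeholders, and the table name is
the one I believe the robinhood schema uses - adjust for your own setup.

    #!/usr/bin/env python
    # Rough sketch: estimate the robinhood insert rate by polling the
    # (approximate) row count of the ENTRIES table.  The database name,
    # credentials and interval are illustrative assumptions only.
    import time
    import pymysql

    DB = "robinhood_lustre"      # assumed database name
    TABLE = "ENTRIES"            # main entries table in the robinhood schema
    INTERVAL = 60                # seconds between samples

    conn = pymysql.connect(host="localhost", user="robinhood",
                           password="secret", database="information_schema")

    def row_estimate():
        # TABLE_ROWS is only an estimate for InnoDB, but it is cheap to
        # read, unlike COUNT(*) on a table with >100M rows.
        with conn.cursor() as cur:
            cur.execute("SELECT TABLE_ROWS FROM TABLES "
                        "WHERE TABLE_SCHEMA = %s AND TABLE_NAME = %s",
                        (DB, TABLE))
            return cur.fetchone()[0]

    prev = row_estimate()
    while True:
        time.sleep(INTERVAL)
        now = row_estimate()
        print("~%.1f inserts/s" % ((now - prev) / float(INTERVAL)))
        prev = now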

Our principal offenders are OpenFOAM users; for example, one classic
'ZOT Offender' is:

user      ,     type,      count,   spc_used,   avg_size
redacted  ,      dir,   26983856,  102.95 GB,    4.00 KB
redacted  ,     file,  101917990,    7.31 TB,  174.48 KB

Total: 128901846 entries, 8144211940352 bytes used (7.41 TB)

... which is enough to make any metadata tracking cry :-)



Re MariaDB vs MySQL: I don't really have much of a comparison. We're
running MySQL for 'production' internal databases, with replication to
a second backup server. As I didn't want the robinhood databases
replicated, I just set up a third box as a dedicated test instance, with
the secondary role of letting us see what the differences were.
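
(If anyone wants to keep robinhood on an existing replicated pair instead,
the usual way is a replica-side filter in my.cnf; a minimal sketch,
assuming the database is called robinhood_lustre:

    [mysqld]
    # Skip replication for the robinhood database only
    # (replica-side filter; the database name is just an example)
    replicate-ignore-db = robinhood_lustre

with the usual caveat that under statement-based replication this filters
on the statement's default database, so cross-database statements can
still slip through.)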

