Hi everyone:

I know this is possibly something of a fool's errand, but I'm hoping
someone has come up with some magic tool or process for cleaning up file
storage more easily than going through 12 years of files one by one.

As part of our DAMS project, I've run some TreeSize Pro scans on three of
the 20-25 or so network storage directories. Just in those three, we found
66,467 duplicate files. We initially thought about creating hardlinks for
the duplicates, which would at least let the server store them more
efficiently, but it won't solve the problem of actually having files
scattered all over the place that the DAMS will ultimately ingest.

Another thought was to use symlinks, but as far as I know, there aren't easy
tools to automagically create these on Windows desktops or servers. Plus, it
might wreak havoc with all of the file permissions.

So does anyone have any other ideas that I might try? Or are we really just
stuck with all of this junk until someone manually goes in and cleans it up?

Thanks,

~Perian
_______________________________________________
You are currently subscribed to mcn-l, the listserv of the Museum Computer 
Network (http://www.mcn.edu)

To post to this list, send messages to: mcn-l@mcn.edu

To unsubscribe or change mcn-l delivery options visit:
http://mcn.edu/mailman/listinfo/mcn-l

The MCN-L archives can be found at:
http://www.mail-archive.com/mcn-l@mcn.edu/
