"Stewart, Louis (IS)" <louis.stew...@ngc.com> writes:

> Can GIT handle versioning of large 20+ GB files in a directory?

I think you can "git add" such files, push/fetch histories that
contain such files over the wire, and "git checkout" such files,
but naturally reading, processing and writing 20+GB would take some
time.  In order to run operations that need to look at the changes,
e.g. "git log -p", a real content-level merge, etc., you would also
need sufficient memory, because we do these things in-core.
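
For what it's worth, one way to sidestep the in-core diff/merge cost
is to mark such paths as binary via attributes (see gitattributes(5)).
A minimal sketch, with hypothetical paths and filenames:

    # Suppose the huge files live under data/.  Unsetting the
    # "diff" and "merge" attributes makes git treat them as
    # binary, so "git log -p" prints "Binary files differ"
    # instead of loading 20+GB blobs to compute a textual diff,
    # and merges do not attempt a content-level merge.
    $ echo 'data/*.img -diff -merge' >>.gitattributes
    $ git add .gitattributes data/big.img
    $ git commit -m 'track big images without content-level diff/merge'

You still pay the I/O cost to add, transfer and check out the blobs,
of course; the attributes only keep the diff/merge machinery from
trying to work on their contents.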

