Hi guys,
Any further interest on this scalability problem or should I move on?
Thanks,
Uri
On Thu, Nov 8, 2012 at 5:35 PM, Uri Moszkowicz u...@4refs.com wrote:
I tried on the local disk as well and it didn't help. I managed to
find a SUSE11 machine and tried it there but no luck so I think we
debug this problem?
On Thu, Nov 8, 2012 at 9:56 AM, Jeff King p...@peff.net wrote:
On Wed, Nov 07, 2012 at 11:32:37AM -0600, Uri Moszkowicz wrote:
#4 parse_object (sha1=0xb0ee98
\017C\205Wj\001`\254\356\307Z\332\367\353\233.\375P}D) at
object.c:212
#5 0x004ae9ec in handle_one_ref
used to tag every commit with CVS.
All my tags are packed so cat-file doesn't work:
fatal: git cat-file refs/tags/some-tag: bad file
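For what it's worth, packed refs resolve fine for cat-file when it is given an object type or -p; a minimal sketch in a throwaway repo (the path /tmp/packed-demo and the tag name some-tag are stand-ins, not from the thread):

```shell
# Scratch repo; tag name "some-tag" is illustrative.
rm -rf /tmp/packed-demo && git init -q /tmp/packed-demo && cd /tmp/packed-demo
git -c user.email=a@b -c user.name=a commit --allow-empty -q -m init
git tag some-tag
git pack-refs --all                      # ref now lives only in .git/packed-refs

git cat-file -t refs/tags/some-tag       # prints the object type
git cat-file commit refs/tags/some-tag   # the two-argument form needs that type
```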
On Thu, Nov 8, 2012 at 2:33 PM, Jeff King p...@peff.net wrote:
On Thu, Nov 08, 2012 at 11:20:29AM -0600, Uri Moszkowicz wrote:
I tried the patch but it doesn't
--aggressive before.
On Thu, Nov 8, 2012 at 4:11 PM, Jeff King p...@peff.net wrote:
On Thu, Nov 08, 2012 at 03:49:32PM -0600, Uri Moszkowicz wrote:
I'm using RHEL4. Looks like perf is only available with RHEL6.
Yeah, RHEL4 is pretty ancient; I think it predates the invention of
perf.
heads: 308
[perf output trimmed: top kernel entry is clear_page_c]
Does this help? Machine has 396GB of RAM if it matters.
On Thu, Nov 8, 2012 at 4:33 PM, Jeff King p...@peff.net wrote:
On Thu, Nov 08, 2012 at 04:16:59PM -0600, Uri Moszkowicz wrote:
I ran git cat-file commit some-tag for every tag. They seem to be
roughly uniformly
It all goes to pack_refs() in write_remote_refs called from
update_remote_refs().
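Presumably "cat-file for every tag" was a loop along these lines; with ~38k tags that spawns one git process per tag, whereas git for-each-ref reports the same information from a single process. A sketch in a scratch repo (three tags stand in for tens of thousands):

```shell
rm -rf /tmp/tag-demo && git init -q /tmp/tag-demo && cd /tmp/tag-demo
git -c user.email=a@b -c user.name=a commit --allow-empty -q -m init
for t in v1 v2 v3; do git tag "$t"; done        # stand-in for ~38k tags

# One git process per tag, as in the experiment above.
for t in $(git tag); do git cat-file commit "refs/tags/$t" >/dev/null; done

# The same information from a single process.
git for-each-ref --format='%(refname:short) %(objectname)' refs/tags
```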
On Tue, Oct 23, 2012 at 11:29 PM, Nguyen Thai Ngoc Duy
pclo...@gmail.com wrote:
On Wed, Oct 24, 2012 at 1:30 AM, Uri Moszkowicz u...@4refs.com wrote:
I have a large repository which I ran git gc --aggressive
That did the trick - thanks!
On Mon, Oct 22, 2012 at 5:46 PM, Andreas Schwab sch...@linux-m68k.org wrote:
Uri Moszkowicz u...@4refs.com writes:
Perhaps Git should switch to a single-file block text or binary format
once a repository accumulates a large number of tags.
This is what
I have a large repository which I ran git gc --aggressive on that
I'm trying to clone on a local file system. I would expect it to
complete very quickly with hard links but it's taking about 6min to
complete with no checkout (git clone -n). I see the message "Cloning
into 'repos'... done." appear
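A local-path clone does hard-link the object store by default; if even creating that many links is the bottleneck, --shared records the source object store in an alternates file instead of linking anything (with the usual caveat that the source must then not be pruned). A sketch with a scratch source repo standing in for the real one:

```shell
rm -rf /tmp/clone-demo && mkdir /tmp/clone-demo && cd /tmp/clone-demo
git init -q src
git -C src -c user.email=a@b -c user.name=a commit --allow-empty -q -m init

# Default local clone: objects are hard-linked into the new repository.
git clone -q -n src hardlinked

# --shared writes src's object path into .git/objects/info/alternates
# instead of copying or linking -- but don't prune/gc src afterwards.
git clone -q -n -s src shared
cat shared/.git/objects/info/alternates
```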
Continuing to work on improving clone times: git gc --aggressive
consolidated the large number of tags into a single file, but now I
have a large number of files in the objects directory - 131k for a
~2.7GB repository. Any way to reduce the number of these files to
speed up clones?
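A full repack folds loose objects into a single pack file plus its index, which is exactly what cuts the inode count for a clone; a minimal sketch in a scratch repo (a few commits stand in for the 131k objects):

```shell
rm -rf /tmp/repack-demo && git init -q /tmp/repack-demo && cd /tmp/repack-demo
for i in 1 2 3; do
  echo "$i" > f
  git add f
  git -c user.email=a@b -c user.name=a commit -q -m "c$i"
done
find .git/objects -type f | wc -l   # several loose object files

# -a: repack everything reachable; -d: delete the now-redundant loose objects.
git repack -a -d -q
git count-objects                   # loose object count should drop to 0
```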
I'm doing some testing on a large Git repository and am finding local
clones to take a very long time. After some investigation I've
determined that the problem is due to a very large number of tags
(~38k). Even with hard links, it just takes a really long time to
visit that many inodes. As it
...@gmail.com wrote:
On Fri, Oct 19, 2012 at 6:10 AM, Uri Moszkowicz u...@4refs.com wrote:
I'm testing out the sparse checkout feature of Git on my large (14GB)
repository and am running into a problem. When I add dir1/ to
sparse-checkout and then run git read-tree -mu HEAD I see dir1 as
expected. But when I add dir2/ to sparse-checkout and read-tree
again I see dir2 and dir3 appear and
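For reference, the mechanism being exercised is roughly the following; the directory names mirror the hypothetical dir1/dir2 from the message, and the scratch path is illustrative:

```shell
rm -rf /tmp/sparse-demo && git init -q /tmp/sparse-demo && cd /tmp/sparse-demo
mkdir dir1 dir2 && echo a > dir1/a && echo b > dir2/b
git add . && git -c user.email=a@b -c user.name=a commit -q -m init

git config core.sparseCheckout true
echo 'dir1/' > .git/info/sparse-checkout
git read-tree -mu HEAD    # prunes the working tree down to the listed paths
ls                        # should now list only dir1
```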
05:53 PM, Uri Moszkowicz wrote:
I'm trying to convert a CVS repository to Git using cvs2git. I was able to
generate the dump file without problem but am unable to get Git to
fast-import it. The dump file is 328GB and I ran git fast-import on a
machine with 512GB of RAM.
fatal: Out of memory
or a clone of it is
possible at this point but breaking up the import into a few steps may
be - will try that next if this fails.
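Splitting the import is supported directly: fast-import can carry its state across runs through marks files, so a huge dump can be cut into pieces (on command boundaries) and fed in sequence. A sketch with two tiny hand-written streams standing in for pieces of the 328GB dump:

```shell
rm -rf /tmp/import-demo && mkdir /tmp/import-demo && cd /tmp/import-demo
git init -q repo

# Two tiny fast-import streams standing in for pieces of a huge dump.
cat > piece1.fi <<'EOF'
commit refs/heads/master
mark :1
committer A <a@b> 1350000000 +0000
data 3
c1
EOF
cat > piece2.fi <<'EOF'
commit refs/heads/master
mark :2
committer A <a@b> 1350000100 +0000
data 3
c2
from :1
EOF

# The marks file maps :1 to its sha1, so piece2 can build on piece1.
git -C repo fast-import --export-marks=../marks < piece1.fi
git -C repo fast-import --import-marks=../marks --export-marks=../marks < piece2.fi
git -C repo log --oneline
```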
On Tue, Oct 16, 2012 at 2:18 AM, Michael Haggerty mhag...@alum.mit.edu wrote:
On 10/15/2012 05:53 PM, Uri Moszkowicz wrote:
I'm trying to convert a CVS repository to Git
fatal: Out of memory? mmap failed: Cannot allocate