Re: out of memory
Rahul wrote:
> One thing you could try is to recompile CVS without the HAVE_MMAP flag. On a 32-bit system, memory-mapped files will hit the 2 GB limit. By default CVS uses mmap'd files for faster performance. That said, CVS could avoid loading the entire file into memory and work off segments when doing common operations. However, that would be a longer-term fix, not something you can do today.
> Regards, Rahul Bhargava, CTO, WANdisco, Mountain View, CA http://www.wandisco.com/cvs

Thanks so much, Rahul, for your advice. I will definitely keep it in mind. For the time being, I edited the headers of some tar files and basically made the files version 1.1, as if I had added them prior to creating the branches. I also inserted both branch IDs in the symbols section and, voila, I now have the files available on the trunk plus the two branches. I might have lost a bit of info on them, but I did not care much, since QA had not yet released them as GA'd products. Regards, Cristian
___ Info-cvs mailing list Info-cvs@gnu.org http://lists.gnu.org/mailman/listinfo/info-cvs
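For readers attempting the same repair: the branch IDs Cristian mentions live in the `symbols` section of the RCS `,v` file header, where a branch tag uses RCS "magic branch" numbering (`rev.0.N`). A minimal sketch of what such a header might look like after the edit; the revision numbers and the second tag name here are illustrative, not taken from Cristian's repository:

```
head	1.1;
access;
symbols
	REL-4_4_0-BRANCH:1.1.0.4
	REL-4_4_0-LAST-MERGED-TAG:1.1
	REL-4_3_0-BRANCH:1.1.0.2;
locks; strict;
comment	@# @;
```

With the file's only revision at 1.1 and both magic-branch symbols present, checkouts of the trunk and of either branch all resolve to revision 1.1, which matches the behavior Cristian describes.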
Re: out of memory
> So I guess my questions are: 1. Assuming that we need more memory, why is not all of the swap used? I am not trying to say that there's something wrong with 'cvs', rather just trying to understand the problem.

On a 32-bit OS, you can rarely allocate more than 2 GB per process, for structural reasons of the OS itself (shared libraries, code, stack... are generally placed at arbitrary addresses). Maybe with a 64-bit Linux plus a 64-bit build of CVS you could fix your problem. Sadly, the swap has nothing to do with it: the OS will not let you allocate all the memory you have (physical or swap) in a single process once it comes to more than 2 GB or so (maybe a bit more if Linux is better organized than Windows).
Hope it helps, Armel
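Armel's point about the 32-bit address space can be checked directly from a shell. A rough sketch; the figure printed is the theoretical ceiling, and the usable share is smaller because the kernel, shared libraries, and stack claim part of it:

```shell
getconf LONG_BIT    # 32 or 64: the pointer width of the default ABI
# a 32-bit process can address at most 2^32 bytes, no matter how much
# physical RAM and swap the machine has:
echo $(( (1 << 32) / 1024 / 1024 ))    # 4096 MiB of total address space
```

On a typical 32-bit Linux kernel the split is 3 GiB user / 1 GiB kernel, which is why a single allocation fails well before the 4 GiB figure, and why adding swap does not help.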
Re: out of memory
Armel Asselin wrote:
> On a 32-bit OS, you can rarely allocate more than 2 GB per process for structural reasons of the OS itself [...] maybe with a 64-bit Linux plus a 64-bit build of CVS you could fix your problem. The swap has nothing to do with that, sadly [...]

Thanks so much, Armel, it did help indeed. I have done some tests fiddling with the header of the ,v file in order to make it behave just as if it had been checked in on the trunk at version 1.1 and branched together with all the other files. After a couple of tries I managed to do it in a test environment, and now I am trying to do it on the 'real' server. I've got 8 files in this situation. It's not something I would like to do on a daily basis, that is for sure. Thanks, Cristian
Re: out of memory
One thing you could try is to recompile CVS without the HAVE_MMAP flag. On a 32-bit system, memory-mapped files will hit the 2 GB limit. By default CVS uses mmap'd files for faster performance. That said, CVS could avoid loading the entire file into memory and work off segments when doing common operations. However, that would be a longer-term fix, not something you can do today. Regards, Rahul Bhargava, CTO, WANdisco, Mountain View, CA http://www.wandisco.com/cvs
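Rahul's "work off segments" idea, processing a large file a fixed-size chunk at a time instead of mapping or loading it whole, can be sketched in shell with `dd`. The file name and sizes are illustrative, and this is not CVS's actual code path, just the general technique:

```shell
# make a 32 MiB scratch file to stand in for a large repository file
dd if=/dev/zero of=/tmp/bigfile bs=1M count=32 2>/dev/null

# process it 8 MiB at a time: peak memory use stays at one segment,
# never the whole file
for seg in 0 1 2 3; do
    dd if=/tmp/bigfile bs=8M skip=$seg count=1 2>/dev/null | wc -c
done
```

Each iteration reads exactly one 8 MiB window, so total address-space demand is bounded by the segment size rather than by the file size.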
out of memory
Hello, I am trying to do a merge from one branch to another. Among other changes, on the changed branch I have added a large, 1 GB file (packaged Oracle -- don't ask). It is a binary file (-kb). During the merge, all of the RAM and some swap is consumed, and then an out-of-memory error is reported. But I do have enough RAM/swap to hold the contents of the file. I even have enough to hold a duplicate of the file, since a binary file will not be merged, only copied. I am now thinking that this operation might need four times the file size: one for the original repository file, one for the existing file you are trying to add, and two for holding the contents of the concatenated files. But for some reason only a bit of swap gets used -- I have 3 GB -- and then the out-of-memory error is thrown.

OS: Linux scmcvs 2.4.21-27.0.1.ELsmp #1 SMP Mon Dec 20 18:47:45 EST 2004 i686 i686 i386 GNU/Linux
CVS: Concurrent Versions System (CVS) 1.11 (client/server)

FILE IN QUESTION:
$ ls -l oracle10g.tar
-rw-r--r-- 1 cvsuser cvs 1176514560 Jun 26 16:29 oracle10g.tar

TOP OUTPUT BEFORE MERGE:
Mem:  3082252k av,  424824k used, 2657428k free, 0k shrd, 47296k buff
      280500k active, 63560k inactive
Swap: 2950204k av,       0k used, 2950204k free   237580k cached

MERGE COMMAND:
cvs -q up -dP -j REL-4_4_0-LAST-MERGED-TAG -j REL-4_4_0-BRANCH

ERROR MESSAGE (snipped):
cvs [update aborted]: out of memory; can not allocate 1176514561 bytes

TOP OUTPUT AT TIME OF ERROR:
Mem:  3082252k av, 3064420k used,   17832k free, 0k shrd, 14420k buff
      2418788k actv, 460520k in_d, 47236k in_c
Swap: 2950204k av,  225456k used, 2724748k free   522232k cached

TOP OUTPUT 10 SECONDS AFTER ERROR:
Mem:  3082252k av, 2047188k used, 1035064k free, 0k shrd, 8308k buff
      1532648k actv, 341764k in_d, 48860k in_c
Swap: 2950204k av,  231060k used, 2719144k free   487084k cached

TOP OUTPUT 30 SECONDS LATER:
Mem:  3082252k av,  619780k used, 2462472k free, 0k shrd, 8404k buff
      134164k actv, 323716k in_d, 45560k in_c
Swap: 2950204k av,     116k used, 2950088k free   487072k cached

So I guess my questions are:
1. Assuming that we need more memory, why is not all of the swap used? I am not trying to say that there's something wrong with 'cvs', rather just trying to understand the problem.
2. Is there anything that I can do on the repository side (the ,v file) to make this tar file available on the second branch and on the trunk? Assuming I can edit the file :-)
I would really appreciate any help on this. Thanks a lot, Cristian
Re: 2GB limitation: out of memory problem.
Hi - A few thoughts on overcoming hardware limitations, not related just to solving virtual-memory issues. If you are hitting hard limits on a box, one other way out is to have a cluster of CVS repositories in your LAN. You could then distribute the load across the CVS repositories. This could be done using a hardware switch that load-balances the DNS requests, or even the TCP connections from CVS clients. That way you are unlikely to have tons of concurrent users accessing the same physical box, and performance from each box will also improve. You can do all this using the WANdisco CVS Replicator; it works just as well in the LAN as over the WAN. See http://www.wandisco.com/cvs Best Regards, Rahul Bhargava, CTO, WANdisco
Re: 2GB limitation: out of memory problem.
Paul writes:
> During a CVS commit I got the following error message: cvs [server aborted]: out of memory; can not allocate x bytes

You need to increase the amount of (virtual) memory available to the CVS server. Exactly how to do that is highly system dependent. -Larry Jones
I don't need to do a better job. I need better P.R. on the job I DO. -- Calvin
2GB limitation: out of memory problem.
During a CVS commit I got the following error message: cvs [server aborted]: out of memory; can not allocate x bytes. The binary file itself is not that large, but it has been committed multiple times before, which makes it a large file to commit. Therefore I decided to remove the older versions (with the -o range option), but no success: same out-of-memory problem. Does anyone know how to solve this memory limitation, making it possible to remove the older versions? Thanks in advance, Paul
Re: Error out of memory when checking out large binary
Tom Simons writes:
> When checking out a directory with large binary files, cvs fails with out of memory. How can we get around this?

Don't store large binary files in a source control system. :-) You need to make more virtual memory available on the server. How you do that is, of course, system specific. On Unix-like systems, you need to make more swap space available. -Larry Jones
It's SUSIE! It's a GIRL! Santa would understand! -- Calvin
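On Linux, one common way to "make more swap space available" without repartitioning is a swap file. A hedged sketch of the usual steps; the path and size are illustrative, and the activation commands need root, so they are shown commented out:

```shell
# create a 64 MiB file backed by real blocks, and lock down its permissions
dd if=/dev/zero of=/tmp/extraswap bs=1M count=64 2>/dev/null
chmod 600 /tmp/extraswap

# write the swap signature (works on a plain file; no root needed yet)
command -v mkswap >/dev/null && mkswap /tmp/extraswap

# enabling the new swap area requires root:
# swapon /tmp/extraswap
# swapon -s    # verify it is now listed
```

For a real server you would place the file on a local filesystem (not /tmp, which may be memory-backed) and add it to /etc/fstab so it survives reboots.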
Error out of memory when checking out large binary
When checking out a directory with large binary files, cvs fails with out of memory. How can we get around this?

$ cvs checkout PKI/WFCMS/robodemo/src
cvs server: Updating PKI/WFCMS/robodemo/src
U PKI/WFCMS/robodemo/src/Browser_Based.rd
U PKI/WFCMS/robodemo/src/Exchange5.5_Browser.rd
cvs [server aborted]: out of memory; can not reallocate 31457280 bytes

Here's what the directory looks like in the repository:
[cvs]$ ll PKI/WFCMS/robodemo/src
-r--r--r-- 1 kalaiahp cvs 16866839 Dec 17 17:46 Browser_Based.rd,v
-r--r--r-- 1 kalaiahp cvs 19576910 Dec 17 17:46 Exchange5.5_Browser.rd,v
-r--r--r-- 1 kalaiahp cvs 32187968 Dec 17 17:47 Exchange5.5_P12.rd,v
-r--r--r-- 1 kalaiahp cvs  6624815 Dec 17 17:47 Management.rd,v
-r--r--r-- 1 kalaiahp cvs  1847691 Dec 17 17:47 Outlook_Browser.rd,v
-r--r--r-- 1 kalaiahp cvs  2748271 Dec 17 17:47 Outlook_P12.rd,v
-r--r--r-- 1 kalaiahp cvs 13262220 Dec 17 17:47 WFCMS_P12.rd,v
-r--r--r-- 1 kalaiahp cvs 21998898 Dec 17 17:48 WLAN.rd,v
cvs update out of memory error
I am checking out a directory via pserver remotely, and it ran into an error on my Linux host, which has a smaller amount of memory: cvs [server aborted]: out of memory; can not reallocate 50331648 bytes. That directory contains several tar files, such as: 75141120 Mar 9 16:53 ew53dataM.tar. I tried to use CVSIGNORE to ignore all *.tar files, but it seems that only applies when I am checking in a file. My question is how I can bypass that directory as a whole, or set up some option to allocate more memory for CVS. Thanks
Out of memory on AIX - works on Linux
Hello, I have a Linux machine (RH9) with CVS 1.11.2 on it, and I am able to check in/out large files (~70 MB). But when I try to check out the same on an AIX machine, which has cvs 1.11.9 on it, the client just ends with the following message: cvs [update aborted]: out of memory; can not allocate 335 bytes. Is there some system setting, like a minimum VM size or something similar, which I should set on the AIX machine to get cvs to serve the large file? Any suggestions? Thanks in advance. Sri.
Out of memory: CVS abort error
I have a user that was trying to delete a 64 MB file from one of our repositories and he got an error stating that CVS aborted because it was Out of Memory. Is there any limit on the size of files that can be deleted or manipulated within CVS? Does it just kind of bomb out at a certain threshold or does it need a certain amount of memory to hold the whole file while deleting it? Any help would be appreciated. Thanks, Scott
Re: Out of memory: CVS abort error
Scott O. writes:
> I have a user that was trying to delete a 64 MB file from one of our repositories and he got an error stating that CVS aborted because it was Out of Memory. Is there any limit on the size of files that can be deleted or manipulated within CVS?

There are no internal limits in the code; you're limited only by the amount of available (virtual) memory. You need to add more swap space and/or increase any per-process or system-wide limits on the amount of virtual memory a process is allowed to use. -Larry Jones
It's no fun to play games with a poor sport. -- Calvin
Re: cvs [commit aborted]: out of memory; can not reallocate 95683558 bytes
Sachin wrote:
> Hi, I am trying to check in a big file of 95 MB into a repository. I am working on an HP-UX 10.20 system and the CVS version is 1.11.1p. I am getting the following error message: cvs [commit aborted]: out of memory; can not reallocate 95683558 bytes. TIA.

First: are you really sure you need to put that file under revision control? What is it? Second: the error message tells it all: cvs runs out of memory when it tries to swallow such a huge file. I don't know exactly where you have to add memory (are you working client/server or local?), but I guess at least you have to increase your swap space. You may also run into trouble with insufficient temporary disk space. Have a look at http://www.cvshome.org/docs/manual/cvs_2.html#SEC37 Harald
-- iXpoint Informationssysteme GmbH # Rheinstraße 79a # Harald Kucharek 76275 Ettlingen # [EMAIL PROTECTED] Tel/Fax +49 7243 3775-0/77 # www.ixpoint.de
Re: cvs [commit aborted]: out of memory; can not reallocate 95683558 bytes
hi, Thanks for the info. I have checked that I have enough swap and tmp space free. In spite of that, it gives me an error. I have tried checking in files up to a maximum size of 60 MB; after that it cribs and starts giving the memory error. There seems to be some limitation on the cvs side itself. Any more help would be appreciated. thanks, sachin

Harald Kucharek [EMAIL PROTECTED] wrote in message news:[EMAIL PROTECTED]...
> First: Are you really sure you need to put that file under revision control? What is it? Second: The error message tells it all: cvs runs out of memory when it tries to swallow such a huge file. [...] Have a look at http://www.cvshome.org/docs/manual/cvs_2.html#SEC37 Harald
Re: cvs [commit aborted]: out of memory; can not reallocate 95683558 bytes
Sachin writes:
> Thanks for the info. I have checked that I have enough swap and tmp space free. In spite of that it gives me an error. I have tried checking in files up to a maximum size of 60 MB. After that it cribs and starts giving the memory error. There seems to be some limitation on the cvs side itself.

There's no particular limit in CVS -- most likely you're running into some per-process limit. Most shells have a limit or ulimit command to display and modify the limits, but you usually have to be root to increase them. You'll have to consult your system documentation to find out how to increase the default values. -Larry Jones
I'm writing you a message in code. How do you spell nincompoop? -- Calvin
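Larry's suggestion can be tried from the very shell that launches the CVS server. A sketch with the usual `ulimit` incantations; the exact option letters and units vary slightly between shells and systems:

```shell
ulimit -a    # show every per-process limit for this shell
ulimit -d    # data-segment size limit (KiB, or "unlimited")
ulimit -v    # virtual-memory limit, where the shell supports it

# raise the soft data limit for this shell and its children;
# only root can raise a hard limit
ulimit -S -d unlimited 2>/dev/null || echo "hard limit is lower"
```

Since the server process inherits the limits of whatever started it (inetd, sshd, a login shell), the relevant place to run these is in that startup context, not in the user's own shell.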
cvs [commit aborted]: out of memory; can not reallocate 95683558 bytes
Hi, I am trying to check in a big file of 95 MB into a repository. I am working on an HP-UX 10.20 system and the CVS version is 1.11.1p. I am getting the following error message: cvs [commit aborted]: out of memory; can not reallocate 95683558 bytes. TIA.
out of memory????
/tmp/cvsj9Hhya 9 lines, 317 characters
cvs [commit aborted]: out of memory
TRIDEVL:cvsuser:/tridevl/source/trisource

2 GB of RAM, if that matters, and plenty of tmp space, I believe:
-rw-r- 1 cvsuser tridevl 66693120 Sep 19 13:06 ver436.tar
/dev/hd3 655360 593588 10% 591 1% /tmp

I am getting an out-of-memory error when I try to commit this file. Any help would be appreciated. Thanks, R.
RE: out of memory????
R, We had a similar problem with big files. Make sure you don't have any per-process limits on memory usage on the server, inherited from inetd, rsh, or the shell. I saw this most often when somebody's shell was csh or tcsh; bash seemed to work fine. I think this was because /etc/csh.cshrc set default limits on memory whereas bash didn't. Olaf

-----Original Message----- From: r w, Sent: Wednesday, September 19, 2001 8:30 PM, Subject: out of memory [original message snipped]
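Since Olaf points at the login shell as the culprit: csh and tcsh use a different limits builtin than sh/bash, which is easy to miss when auditing files like /etc/csh.cshrc. A quick comparison; the csh lines are shown as comments because this sketch runs under sh:

```shell
# sh/bash report and set per-process limits with ulimit:
ulimit -d

# csh/tcsh use "limit"/"unlimit" instead; from a csh prompt you would run:
#   limit datasize
#   unlimit datasize

# on Linux, also check /etc/security/limits.conf for defaults that the
# CVS server process may inherit at login time
```

If the CVS server is spawned via a csh login, limits set in /etc/csh.cshrc apply to it even though an interactive bash session on the same box shows none.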
Re: out of memory????
And make sure you can justify checking a huge tar file into your version control system, since checkin/checkout/log will be slow and diffs will probably be meaningless. dtayl

Olaf Meding wrote:
> We had a similar problem with big files. Make sure you don't have any per-process limits for memory usage on the server inherited from inetd, rsh, or the shell. [...]
Re: cvs [server aborted]: out of memory; can not reallocate xxx bytes
You wouldn't be running out of swap space, would you? What happens if you reduce the number of processes running on that machine, increase the size of the swap partition, and clean out any swapfs filesystems you might have?

RCS has compilation options that trade high performance (memory-mapped I/O, or loading complete files into allocated memory) against a small footprint (making many passes over large temporary files). I took a (very) quick look at the RCS-related code in CVS, and it appears that the RCS library does not implement this trade-off; it loads the entire RCS file into in-memory data structures. Unless this trade-off is implemented, it seems you'll have to supply sufficient swap space to process the largest of your files.

--- Forwarded mail from [EMAIL PROTECTED] [original message snipped]
--- End of forwarded message from [EMAIL PROTECTED]
cvs [server aborted]: out of memory; can not reallocate xxx bytes
One of my users complained she got this message whenever she checked out or updated in her module. I found that she had committed two of her binaries; one was 11 MB, the other was 7 MB. When I removed those files, the checkout/update worked. I am running cvs 1.11 pserver with "-T /tmpcvs", where /tmpcvs is a 3 GB filesystem. There is 2 GB of RAM on the server, which is HP-UX 10.20. /tmp is 1.1 MB (but I hope it isn't being used). The "xxx bytes" that cannot be reallocated seems to be fixed depending on what file I am trying to check out. For the file of size 11,456,355, xxx is 21968. For the file of size 7,094,501, xxx is 14262. Funny thing is, there are larger files that I *CAN* check out without a problem. What appears to be a distinguishing feature is that they have only a single revision in the tree (1.1). In fact, I can check out revision 1.1 of the 2 files that gave me trouble above. So I'm guessing that CVS is running out of something while trying to reconstruct revision 1.4 (in the above example). Is it really memory, or could it be temp disk space (which I was hoping would be /tmpcvs, as specified on the inetd.conf command line, and not /tmp)? This might sound like a stupid question, but I thought I remembered a problem with the size of the history file that also caused this message to appear. Wait. I seem to remember something about the maximum amount of memory an application is allowed to request. Thanks for any help! :) hal mahaffey
Re: cvs [server aborted]: out of memory; can not reallocate xxx bytes
We had this problem on a BSD box here. It turns out there were some restrictions in place on how much swap space a user program could use, and cvs was exceeding them. Our IS guy made some changes on the machine and it solved the problem. Dave

on 1/9/01 2:54 PM, [EMAIL PROTECTED] wrote:
> One of my users complained she got this message whenever she checked out or updated in her module. [original message snipped]
Re: out of memory message
Paul Sander wrote:
> You need 1.3 GIGabytes of memory to check out this file? I can think of several workarounds: [snip]

> cvs [checkout aborted]: out of memory; can not reallocate 1310720 bytes

That sure looks like 1.3 Mbytes to me. You can try messing with compilation options, but if you can increase the swap space or memory in the machine, that's probably easiest and should help maintain your current performance. Derek
-- Derek Price, CVS Solutions Architect ( http://CVSHome.org ) mailto:[EMAIL PROTECTED] OpenAvenue ( http://OpenAvenue.com )
I often speculate on why you don't return to America. Did you abscond with the church funds? Did you run off with the senator's wife? I like to think that you killed a man. It's the romantic in me. - Claude Rains as Captain Louis Renault, _Casablanca_
Re: out of memory message
Dan Allred writes:
> cvs [checkout aborted]: out of memory; can not reallocate 1310720 bytes
> Is there any way around this problem?

Get more (virtual) memory. How to do this is operating-system specific; you may just have to increase your limits, or you might have to reconfigure the system. -Larry Jones
I'm getting disillusioned with these New Years. -- Calvin
out of memory message
Hi, I can successfully check in very large log files into cvs but I cannot check them out. When I try to do a cvs checkout I get: cvs [checkout aborted]: out of memory; can not reallocate 1310720 bytes Is there any way around this problem? Thanks, -- Dan Allred - 512-493-8464 Netpliance - http://www.netpliance.com 7600A Capital of Texas Highway Austin, Texas 78731
Re: out of memory message
You need 1.3 GIGabytes of memory to check out this file? I can think of several workarounds:
- RCS has a compilation option that limits its activities to disk, rather than mapping large files into the CPU's address space. If that option is available in CVS, try rebuilding it with that option set.
- You could try increasing the size of your swap partition, adding disks as needed.
- You could try reducing the size of that file, by changing its format if necessary and distributing its contents across multiple files.

--- Forwarded mail from [EMAIL PROTECTED]
> I can successfully check in very large log files into cvs but I cannot check them out. When I try to do a cvs checkout I get: cvs [checkout aborted]: out of memory; can not reallocate 1310720 bytes Is there any way around this problem?
--- End of forwarded message from [EMAIL PROTECTED]