https://sourceware.org/bugzilla/show_bug.cgi?id=22831
--- Comment #13 from Luke Kenneth Casson Leighton <lkcl at lkcl dot net> ---

On Wed, Mar 14, 2018 at 12:26 PM, hjl.tools at gmail dot com
<sourceware-bugzi...@sourceware.org> wrote:
> https://sourceware.org/bugzilla/show_bug.cgi?id=22831
>
> --- Comment #12 from H.J. Lu <hjl.tools at gmail dot com> ---
> (In reply to Luke Kenneth Casson Leighton from comment #11)
>> (In reply to H.J. Lu from comment #10)
>> there are two issues:
>>
>> 1. 32-bit systems
>> 2. 64-bit systems
>>
>> both 32-bit and 64-bit are affected by this issue.
>>
>> the patch that you wrote, however, looks like it only addresses
>> 32-bit.
>
> True. My patch is a starting point. I'd like to know if it helps
> 32-bit systems or not. If it doesn't address the issue for 32-bit
> systems, my approach won't work for 64-bit systems.

unfortunately i cannot risk damaging my system by carrying out any
tests (because any test will push the loadavg over 120, and 30 seconds
later a hard crash is guaranteed), so we will have to wait for someone
else to test the patch.

>> that leaves 64-bit systems still affected.
>
> You can always and should get more RAM for a 64-bit system.

i have 16 GB of DDR4-2400 RAM on my laptop... and because, when that
system goes into swap (it has an NVMe drive), its loadavg goes over 120
and it is absolutely guaranteed to crash about 30 seconds later, adding
more RAM is *not* the solution. however much more RAM is added, there
*will* be a piece of software within 1-5 years that requires more RAM
for the linker phase than any system provides.

how does gcc do compilation? how does it stay within the bounds of
available memory?

l.