When I do new byte[N], I get OutOfMemoryError, even though the VM
claims to have more than enough free space (according to MemoryMXBean,
Runtime.freeMemory, visualvm, etc.).
My working assumption is that while I have enough free memory, I don't
have enough contiguous free memory. Is there a so
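A minimal sketch of the situation described above (the class name and the 64 MB request are illustrative, not from the original post): the MemoryMXBean numbers can look healthy while a single large allocation still fails, because the reported free space need not be one contiguous run. Note that getMax() may legitimately return -1, which the sketch has to handle.

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class FragmentationProbe {
    public static void main(String[] args) {
        MemoryMXBean mx = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = mx.getHeapMemoryUsage();
        // getMax() may be -1 (undefined); fall back to committed in that case.
        long limit = heap.getMax() >= 0 ? heap.getMax() : heap.getCommitted();
        long free = limit - heap.getUsed();
        System.out.println("Reported free heap: " + free + " bytes");

        int n = 64 * 1024 * 1024; // hypothetical N; adjust to taste
        try {
            byte[] data = new byte[n];
            System.out.println("Allocated " + data.length + " bytes in one piece");
        } catch (OutOfMemoryError e) {
            // "Free" space was reported, but not as one contiguous region.
            System.out.println("OOME although " + free + " bytes were reported free");
        }
    }
}
```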
ng pthread_cond_wait -> native_write_msr is taking 50% of
runtime, and I'm not even sure where to start with that, except to limit
the life of any JVM to 6 hours and restart it. I kind of want to blame a
kernel / PMU change, but it only affects the JVM.
Caveat: I don't do JVM internals, I
rsive, too.
On 2/4/19 8:01 PM, Todd Lipcon wrote:
Tried looking at LogCompilation output with jitwatch? It's been helpful
for me in the past to understand why something wouldn't get jitted.
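For reference, a sketch of how LogCompilation output for JITWatch is typically captured; the diagnostic flags are real HotSpot options, but the class name and log path here are placeholders:

```shell
# -XX:+LogCompilation needs diagnostic options unlocked first.
# MyBenchmark and hotspot.log are placeholders for your own class and path.
java -XX:+UnlockDiagnosticVMOptions -XX:+LogCompilation \
     -XX:LogFile=hotspot.log -cp . MyBenchmark
# Then open hotspot.log in JITWatch to see compile/inlining decisions.
```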
Todd
On Mon, Feb 4, 2019, 7:54 PM Shevek <mailto:goo...@anarres.org> wrote:
Update
on is used.
I'm still in "WAT?" territory.
S.
On 2/4/19 6:26 PM, Shevek wrote:
Hi,
I have a very simple routine which, on some JVMs/systems (which ones I
have not yet entirely narrowed down), suffers a 50x slowdown. The code
is included below.
In perf-java-flames, I see:
-> readVarintTable (90%), of which:
readVarintTable -> readVarint (4%)
readVarintTable -> resolve_static
This project is very much in progress.
We need to sort about 1e13 records, several terabytes when compressed,
sort-merge, and end up with about 1e10 in sqlite. Right now, we are
running sqlite with 1e9 objects, and it isn't an issue. sqlite is much
better than one would naively believe it to b
enerate a fully sorted list, you want it to take only a very small time
to find the first element to pass on!
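The point above can be sketched as a k-way merge over pre-sorted runs: a binary heap hands over the next smallest element in O(log k) without sorting the whole union up front. The int lists here are stand-ins for the real records, purely for illustration.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.PriorityQueue;

public class KWayMerge {
    // Merge pre-sorted runs; each poll() yields the next smallest element
    // in O(log k), so the first element is available almost immediately.
    static List<Integer> merge(List<List<Integer>> runs) {
        // Heap entries: { head value, run index }, ordered by head value.
        PriorityQueue<int[]> heap =
                new PriorityQueue<>((a, b) -> Integer.compare(a[0], b[0]));
        List<Iterator<Integer>> its = new ArrayList<>();
        for (int i = 0; i < runs.size(); i++) {
            Iterator<Integer> it = runs.get(i).iterator();
            its.add(it);
            if (it.hasNext()) heap.add(new int[] { it.next(), i });
        }
        List<Integer> out = new ArrayList<>();
        while (!heap.isEmpty()) {
            int[] top = heap.poll();
            out.add(top[0]);
            Iterator<Integer> it = its.get(top[1]); // refill from the same run
            if (it.hasNext()) heap.add(new int[] { it.next(), top[1] });
        }
        return out;
    }
}
```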
On Mon, Nov 12, 2018 at 4:57 PM Shevek <mailto:goo...@anarres.org> wrote:
We are doing sorting by proxy. Right now I have a byte[] serialized as:
[sort-key0, data0, sort-
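One way such a sort-by-proxy layout might be compared, assuming (purely for illustration) a fixed-length key prefix at the start of each byte[] -- the original layout may well be variable-length. Arrays.compareUnsigned (Java 9+) gives unsigned lexicographic order over just the key bytes:

```java
import java.util.Arrays;
import java.util.Comparator;

public class KeyPrefixComparator implements Comparator<byte[]> {
    // Hypothetical layout: the first keyLen bytes of each record are the
    // sort key; the remainder is opaque payload that sorting never touches.
    private final int keyLen;

    public KeyPrefixComparator(int keyLen) { this.keyLen = keyLen; }

    @Override
    public int compare(byte[] a, byte[] b) {
        // Unsigned lexicographic order over the key prefix only (Java 9+).
        return Arrays.compareUnsigned(a, 0, keyLen, b, 0, keyLen);
    }
}
```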
Dear wizards, please advise.
I need to offer a user configuration feature for pattern matching, to
exclude objects from my billion object sort-merge (which is now working
fairly well, thank you all).
What we're mostly trying to do is exclude any record which contains any
one of a number of s
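A minimal sketch of one way such an exclusion filter could look, assuming the user supplies literal substrings; the class name is made up, and for very large pattern sets an Aho-Corasick automaton would scale better than a single regex alternation:

```java
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class ExclusionFilter {
    private final Pattern needles;

    // Compile the user-supplied literal substrings into one alternation,
    // once, up front; Pattern.quote keeps regex metacharacters literal.
    public ExclusionFilter(List<String> substrings) {
        String alternation = substrings.stream()
                .map(Pattern::quote)
                .collect(Collectors.joining("|"));
        this.needles = Pattern.compile(alternation);
    }

    // True if the record contains any one of the configured substrings.
    public boolean excluded(String record) {
        return needles.matcher(record).find();
    }
}
```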
Given the following code:
byte[] data = ...;
{
    data = null;
    data = new byte[N];
}
Is the compiler allowed to discard the assignment of null, because it's
"dead" in language terms? My argument is that it isn't dead because it
allows the garbage collector to respond to pressure within the
mber 2018 15:08:23 UTC, Shevek wrote:
Hi,
I'm trying to sort/merge a very large number of objects in Java, and
failing more spectacularly than normal. The way I'm doing it is this:
* Read a bunch of objects into an array.
* Sort the array, then merge neighbouri
resizing byte arrays.
My main disbelief is that I'm inventing all of this from scratch. It HAS
to have been done before, right? All I want to do is sort a load of
objects...!
S.
The presumed solution to this
On 11/10/18 8:51 AM, Gil Tene wrote:
On Friday, November 9, 2018 at 7:08:23
Hi,
I'm trying to sort/merge a very large number of objects in Java, and
failing more spectacularly than normal. The way I'm doing it is this:
* Read a bunch of objects into an array.
* Sort the array, then merge neighbouring objects as appropriate.
* Re-fill the array, re-sort, re-merge until
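One pass of the loop above can be sketched roughly as follows; here a "record" is reduced to a bare long key and "merging neighbouring objects" to deduplication, both stand-ins for the real types:

```java
import java.util.Arrays;

public class SortMergePass {
    // Sort the batch, then collapse adjacent records with equal keys.
    // The caller would re-fill the surviving prefix and repeat.
    static long[] sortAndMerge(long[] batch) {
        Arrays.sort(batch);
        int out = 0; // write cursor: compacts survivors in place
        for (int i = 0; i < batch.length; i++) {
            if (out == 0 || batch[i] != batch[out - 1]) {
                batch[out++] = batch[i]; // keep first of each run of equals
            }
        }
        return Arrays.copyOf(batch, out);
    }
}
```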
Without digging into the actual reasons, it's likely that growing a
larger ArrayList is slower than growing a smaller one, and since you
never clear it, the later calls will be slower. I would very much like a
benchmark of method call time to be allocation- and garbage-free,
otherwise
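A sketch of the reuse-vs-regrow point, with made-up sizes: pre-sizing the list and clear()-ing between batches avoids paying ever-larger growth copies on later calls, though clear() does not shrink the backing array, so memory stays at the high-water mark.

```java
import java.util.ArrayList;
import java.util.List;

public class ReuseBuffer {
    static final int EXPECTED = 1_000_000; // hypothetical batch size

    public static void main(String[] args) {
        // Capacity reserved once; add() never triggers a growth copy below it.
        List<Integer> buffer = new ArrayList<>(EXPECTED);
        for (int batch = 0; batch < 3; batch++) {
            for (int i = 0; i < EXPECTED; i++) buffer.add(i);
            // ... process batch ...
            buffer.clear(); // size -> 0; capacity retained for the next batch
        }
        System.out.println("done");
    }
}
```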