I would not recommend doing so, because it may affect Ignite performance,
but you can tweak the JVM to use less memory and return it to the OS more
frequently, like this:
var cfg = new IgniteConfiguration
{
    ClientMode = true,
    JvmOptions = new[] { "-XX:MaxHeapFreeRatio=30", "-XX:MinHeapFreeRatio=10" }
};
OK, yes, I see. It seems that with the code changes I made to produce the
example, memory consumption is much more in line with expectations, so I
guess it was a coding error on my part.
However, it seems strange that my client node, which has no cache, still
wants to hang onto over 1 GB of heap space…
I've run your code under .NET and Java memory profilers.
In short, everything is working fine - nothing to worry about.
*DotMemory*:
[image: image.png]
.NET managed memory usage is under 1 MB; unmanaged memory is much higher,
which is what the Java part allocates.
*jvisualvm*:
[image: image.png]
(I'…
In my sample code I had a bit of a bug; this should be the line to add:
var _ = ldr.AddData(id++, data);
However, it doesn't appear to make any difference; this is the state of
memory (with ignite.exe being my client executable). This is paused after
the insertion of 1 million rows. Why is my client memory…
Here is my source file and a 1 million row CSV file.
I am not sure what's different between my code and yours, but my version
quickly consumes memory on the client side for some reason.
Caveat: I am normally a Python programmer, so I might have missed something
obvious...
https://wtwdeeplearning.bl
> Since we're in 2019, we don't recommend running any Ignite nodes with
> -Xmx2G (that is, 2 gigabytes of heap allowance).
Does 2019 somehow allow us to consume 2 GB for nothing?
I don't think a client node needs that much.
Let's see a reproducer.
My testing shows that streaming works out of the box…
Hello!
Since we're in 2019, we don't recommend running any Ignite nodes with
-Xmx2G (that is, 2 gigabytes of heap allowance).
It is certainly possible to run Ignite with less heap, but the reasoning
for doing so is not very clear.
Please also note that our JDBC thin driver supports streaming, and it…
Hi!
If each row is stored as an entry in the cache, you can expect an
overhead of around 200 bytes per entry, so 200 MB just for the actual
entries (1M), not counting your data (more if you have any indexes).
You can control the streamer: how much data it buffers and when it
should be flushed. I have no idea…
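The back-of-the-envelope estimate above can be sketched as follows (a
rough calculation only: the ~200-bytes-per-entry overhead figure comes
from this thread, while the per-row payload size below is a made-up
illustration, not a measured value):

```python
# Rough heap estimate for putting N rows into an Ignite cache.
ENTRY_OVERHEAD_BYTES = 200   # approximate per-entry cache overhead (from the thread)
PAYLOAD_BYTES = 100          # hypothetical average serialized row size
ROWS = 1_000_000

overhead_mb = ROWS * ENTRY_OVERHEAD_BYTES / 1024 ** 2
total_mb = ROWS * (ENTRY_OVERHEAD_BYTES + PAYLOAD_BYTES) / 1024 ** 2

print(f"overhead only: ~{overhead_mb:.0f} MB")  # ~191 MB, i.e. roughly the 200 MB above
print(f"with payload:  ~{total_mb:.0f} MB")
```

Scaled to the 50 million rows mentioned earlier in the thread, the same
arithmetic lands around 10 GB before any index overhead, which is why the
server-side heap and off-heap sizing matters more than the client's.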
Here is what I tried:
https://gist.github.com/ptupitsyn/7dacefd1cebb936d5f516d8afeba7efe
It ran for a minute or so, with 200 MB used on the client and 5 GB on the
server; it seems to work as expected to me.
On Thu, Nov 14, 2019 at 2:14 PM Pavel Tupitsyn wrote:
> Sounds nasty, can you share a reproducer please?
>
> On…
Sounds nasty, can you share a reproducer please?
On Thu, Nov 14, 2019 at 10:12 AM camer314 wrote:
> I have a large CSV file (50 million rows) that I wish to upload to a
> cache. I am using .NET and a DataStreamer from my application, which is
> designated as a client-only node.
>
> What I don't…
I have a large CSV file (50 million rows) that I wish to upload to a cache.
I am using .NET and a DataStreamer from my application, which is designated
as a client-only node.
What I don't understand is that I quickly run out of memory in my C#
streaming (client) application, while my data node (an instance…