Hi,

It looks like this is part of the current implementation. I will try to
investigate it.

However, as a workaround, you can use the following code to reduce memory
allocation when you need to load a lot of data:

        int chunkSize = 1000;

        // Collect the futures returned by addData() so you can wait on them later.
        Collection<IgniteFuture<?>> futures = new ArrayList<>();

        try (IgniteDataStreamer<Integer, Integer> streamer =
                 ignite.dataStreamer("Cache1")) {
            streamer.allowOverwrite(true);

            Map<Integer, Integer> entryMap = new HashMap<>();

            // Your logic that fills the map. For example:
            for (int i = 0; i < chunkSize; i++)
                entryMap.put(i, i);

            futures.add(streamer.addData(entryMap));
        }
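
If the whole data set is large, you can apply the same idea in a loop,
buffering at most chunkSize entries at a time and handing each chunk to the
streamer. Below is a rough sketch only, assuming an Ignite instance named
"ignite" and the cache "Cache1" from the snippet above; totalRows and the
key/value logic are placeholders for your own data source:

        int chunkSize = 1000;
        int totalRows = 1_000_000; // placeholder for the size of your data set

        Collection<IgniteFuture<?>> futures = new ArrayList<>();

        try (IgniteDataStreamer<Integer, Integer> streamer =
                 ignite.dataStreamer("Cache1")) {
            streamer.allowOverwrite(true);

            Map<Integer, Integer> entryMap = new HashMap<>(chunkSize);

            for (int i = 0; i < totalRows; i++) {
                entryMap.put(i, i); // replace with your own key/value logic

                // Hand off one chunk at a time so only about chunkSize
                // entries are buffered in this map on the client side.
                if (entryMap.size() == chunkSize) {
                    futures.add(streamer.addData(entryMap));
                    entryMap = new HashMap<>(chunkSize);
                }
            }

            // Send the last, possibly smaller, chunk.
            if (!entryMap.isEmpty())
                futures.add(streamer.addData(entryMap));
        }

        // Optionally wait until every batch has been written to the cache.
        for (IgniteFuture<?> fut : futures)
            fut.get();

Closing the streamer at the end of the try block flushes any data that is
still buffered, so waiting on the futures is only needed if you want to know
when each batch has actually been written.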

BR,
Andrei


