Hello

   - What is currently the best practice for deploying Ignite with Spark?

   - Should the Ignite node sit on the same machine as the Spark executor?


According to this documentation
<https://spark.apache.org/docs/latest/hardware-provisioning.html> Spark
should be given at most 75% of machine memory, but what is left for Ignite then?

> In general, Spark can run well with anywhere from *8 GB to hundreds of
> gigabytes* of memory per machine. In all cases, we recommend allocating
> only at most 75% of the memory for Spark; leave the rest for the operating
> system and buffer cache.



   - Don't they compete for memory?

   - Should I give the memory to Ignite or Spark?

   - Would Spark even benefit from Ignite if the Ignite nodes were
   hosted on other machines?


We currently have hundreds of GB of data for analytics and we want to use
Ignite to speed things up.
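
To make the memory question more concrete, this is roughly the split I have
in mind on a shared machine. All the numbers and the region name below are
just placeholders, and I'm assuming the Ignite 2.x data-region API; the Spark
executors on the same box would get their share separately, e.g. via
--executor-memory:

import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;
import org.apache.ignite.configuration.DataRegionConfiguration;
import org.apache.ignite.configuration.DataStorageConfiguration;
import org.apache.ignite.configuration.IgniteConfiguration;

public class StartIgniteNode {
    public static void main(String[] args) {
        // Placeholder split on a 128 GB machine:
        //   ~48 GB off-heap for the Ignite data region,
        //   ~32 GB per Spark executor (--executor-memory 32g),
        //   the rest left for JVM heaps, the OS and the buffer cache.
        DataRegionConfiguration region = new DataRegionConfiguration()
                .setName("analytics")                      // hypothetical region name
                .setInitialSize(8L * 1024 * 1024 * 1024)   // 8 GB initial
                .setMaxSize(48L * 1024 * 1024 * 1024);     // 48 GB max off-heap

        DataStorageConfiguration storage = new DataStorageConfiguration()
                .setDefaultDataRegionConfiguration(region);

        IgniteConfiguration cfg = new IgniteConfiguration()
                .setDataStorageConfiguration(storage);

        // Start a server node that would sit next to the Spark executor.
        Ignite ignite = Ignition.start(cfg);
    }
}

Would a split along these lines be reasonable, or should the off-heap region
be sized differently?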

Thank you
