Hello Patrick,

See my comments below.
Most of your questions don't have a generic answer and heavily depend on your use case. Would you mind giving some more details about it, so that I can give more specific suggestions?

-Val

On Thu, Sep 21, 2017 at 8:24 AM, Patrick Brunmayr <[email protected]> wrote:

> Hello
>
> - What is currently the best practice for deploying Ignite with Spark?
> - Should the Ignite node sit on the same machine as the Spark executor?

Ignite can run either on the same boxes where Spark runs, or as a separate cluster. Both approaches have their pros and cons.

> According to this documentation
> <https://spark.apache.org/docs/latest/hardware-provisioning.html> Spark
> should be given 75% of machine memory, but what is left for Ignite then?
>
>> In general, Spark can run well with anywhere from *8 GB to hundreds of
>> gigabytes* of memory per machine. In all cases, we recommend allocating
>> only at most 75% of the memory for Spark; leave the rest for the operating
>> system and buffer cache.

The documentation states that you should give Spark *at most* 75% to make sure the OS has a safe cushion for its own purposes. If Ignite runs along with Spark, the amount of memory allocated to Spark should of course be less than that maximum.

> - Don't they battle for memory?

You should configure both Spark and Ignite so that they never try to consume more memory than is physically available, also leaving some for the OS. This way there will be no conflict.

> - Should I give the memory to Ignite or Spark?

Again, this heavily depends on the use case and on how heavily you use both Spark and Ignite.

> - Would Spark even benefit from Ignite if the Ignite nodes were hosted
>   on other machines?

There are definitely use cases where this can be useful, although in others it is better to run Ignite separately.

> We currently have hundreds of GB for analytics and we want to use Ignite
> to speed things up.
>
> Thank you
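The memory-budgeting rule discussed above can be sketched as arithmetic: leave roughly 25% of RAM for the OS and buffer cache, then divide the remainder between Spark executors and Ignite. This is a minimal illustration only; the 128 GB machine size and the 50/50 split are placeholder assumptions, and the right ratio depends entirely on the workload.

```java
// Sketch of the memory split for a node running both Spark and Ignite.
// All numbers here are illustrative assumptions, not recommendations.
public class MemoryBudget {

    /** Returns {sparkGb, igniteGb} for a machine with the given RAM in GB. */
    static long[] split(long totalRamGb) {
        long osCushionGb = Math.round(totalRamGb * 0.25); // OS + buffer cache
        long usableGb = totalRamGb - osCushionGb;         // the "at most 75%" budget
        long sparkGb = usableGb / 2;                      // assumed 50/50 split
        long igniteGb = usableGb - sparkGb;
        return new long[] { sparkGb, igniteGb };
    }

    public static void main(String[] args) {
        long[] budget = split(128); // hypothetical 128 GB machine
        System.out.println("Spark: " + budget[0] + " GB, Ignite: " + budget[1] + " GB");
    }
}
```

The point of writing it down is simply that the two budgets plus the OS cushion must never exceed physical RAM; whatever ratio you choose, make both limits explicit in the Spark and Ignite configurations rather than letting either process grow unbounded.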
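On the Ignite side, the limit can be made explicit by capping the off-heap data region in the Spring XML configuration. A minimal sketch, assuming Ignite 2.3+ (earlier 2.x releases used `MemoryConfiguration`/`MemoryPolicyConfiguration` instead); the 16 GB cap is an illustrative value, not a recommendation:

```xml
<bean class="org.apache.ignite.configuration.IgniteConfiguration">
    <property name="dataStorageConfiguration">
        <bean class="org.apache.ignite.configuration.DataStorageConfiguration">
            <property name="defaultDataRegionConfiguration">
                <bean class="org.apache.ignite.configuration.DataRegionConfiguration">
                    <!-- Hard cap on the default data region: 16 GB (illustrative). -->
                    <property name="maxSize" value="#{16L * 1024 * 1024 * 1024}"/>
                </bean>
            </property>
        </bean>
    </property>
</bean>
```

Pair this with the corresponding Spark setting (`spark.executor.memory`) so that the two caps plus the OS cushion fit within physical RAM on each node.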
