Hi Nate, thanks very much. I have exactly the same use cases you mentioned. My
Spark job does heavy writing involving group by and huge data shuffling.
Could you please provide any pointers on how I can take my existing Spark job,
which currently runs on YARN, and run it on Ignite? Please guide me. Thanks
again.
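
For reference, here is a rough sketch of what I imagine the integration looks
like, based on the IgniteRDD examples in the Ignite docs. The cache name, the
config path and the exact IgniteContext/fromCache signatures are my own
assumptions (they seem to differ between Ignite versions), so please correct
me if this is the wrong entry point:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.ignite.spark.IgniteContext

    object IgniteSharedRddSketch {
      def main(args: Array[String]): Unit = {
        // Plain SparkContext -- the job can still be submitted to YARN as before.
        val sc = new SparkContext(new SparkConf().setAppName("ignite-shared-rdd"))

        // IgniteContext wraps the SparkContext and brings up Ignite clients on
        // the executors; "ignite-config.xml" is a placeholder Spring config
        // pointing at the Ignite cluster.
        val ic = new IgniteContext(sc, "ignite-config.xml")

        // Shared RDD backed by an Ignite cache (the cache name is illustrative).
        // The data outlives this Spark job, so a later job can read it again.
        val sharedRdd = ic.fromCache[Int, Int]("sharedData")
        sharedRdd.savePairs(sc.parallelize(1 to 1000, 8).map(i => (i, i * i)))

        // Reads go through the Ignite cache rather than recomputing the RDD.
        println(sharedRdd.filter(_._2 > 100).count())

        sc.stop()
      }
    }

My plan would be to keep submitting the job to YARN with spark-submit and just
point the IgniteContext at a standalone Ignite cluster, unless there is a
better way.
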
On Jan 6, 2016 02:28, <n...@reactor8.com> wrote:

> We started playing with Ignite to back our Hadoop, Hive and Spark services,
> and we're looking to move to it as our default for deployments going forward.
> It's still early, but so far it's been pretty nice, and we're excited about
> the flexibility it will provide for our particular use cases.
>
> Would say in general it's worth looking into if your data workloads:
>
> a) are a mix of read/write, or heavy write at times
> b) need write/read access to data from services/apps outside of your Spark
> workloads (old Hadoop jobs, custom apps, etc.)
> c) include strings of Spark jobs that could benefit from caching your data
> across them (think similar usage to Tachyon)
> d) have Spark SQL queries that could benefit from indexing and mutability
> (see point (a) about mixed read/write)
>
> If your data is read-only and very batch oriented, and your workloads are
> strictly Spark based, the benefits will be smaller and Ignite would probably
> act as more of a Tachyon replacement, since many of the features outside of
> RDD caching won't be leveraged.
>
>
> -----Original Message-----
> From: unk1102 [mailto:umesh.ka...@gmail.com]
> Sent: Tuesday, January 5, 2016 10:15 AM
> To: user@spark.apache.org
> Subject: Spark on Apache Ignite?
>
> Hi, has anybody tried and had success with Spark on Apache Ignite? It seems
> promising: https://ignite.apache.org/
>
