Spark actually used to depend on Akka. Unfortunately this brought in
all of Akka's dependencies (in addition to Spark's already quite
complex dependency graph) and, as Todd mentioned, led to conflicts
with projects using both Spark and Akka.

It would probably be possible to keep Akka and shade it to avoid
conflicts (some additional classloading tricks might also be required),
for example via an sbt-assembly shade rule like the sketch below.
However, considering that Spark used only a small, narrowly scoped
portion of Akka's features, shading wasn't worth the extra maintenance
burden. Furthermore, akka-remote uses Netty internally, so trimming the
dependencies down to core functionality is a good thing IMO.

On Mon, May 23, 2016 at 6:35 AM, Todd <bit1...@163.com> wrote:
> As far as I know, there would be an Akka version conflict when using
> Akka as a Spark Streaming source.
>
>
>
>
>
>
> At 2016-05-23 21:19:08, "Chaoqiang" <hcq20160...@aliyun.com> wrote:
>>I want to know why Spark 1.6 uses Netty instead of Akka. Are there some
>>difficult problems that Akka cannot solve, but Netty can solve easily?
>>If not, can you give me some references about this change?
>>Thank you.
