Hi,

Reconsidering the execution model behind Streaming would be a good
candidate here, as Spark will not be able to provide the low latency and
sophisticated windowing semantics that more and more use cases will
require. Relaxing the strict micro-batch model would likely help a lot.
(This would mainly affect shuffling, but the shuffle package already
suffers from overlapping functionality and a lack of good modularity
anyway. Look at how coalesce is implemented, for example - inefficiency
kicks in there as well.)

On Wed, Nov 11, 2015 at 12:48 PM Tim Preece <tepre...@mail.com> wrote:

> Considering Spark 2.x will run for 2 years, would moving up to Scala 2.12
> (pencilled in for Jan 2016) make any sense? - although that would then
> require Java 8 as a prerequisite.
>
>
>
> --
> View this message in context:
> http://apache-spark-developers-list.1001551.n3.nabble.com/A-proposal-for-Spark-2-0-tp15122p15153.html
> Sent from the Apache Spark Developers List mailing list archive at
> Nabble.com.
>
