Hey Michael,
There is a discussion on TIMESTAMP semantics going on the thread "SQL
TIMESTAMP semantics vs. SPARK-18350" which might impact Spark 2.2. Should
we make a decision there before voting on the next RC for Spark 2.2?
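The disagreement in that thread is between instant semantics (what Spark's TIMESTAMP effectively does today: store a point in time and render it in the session time zone) and SQL-standard TIMESTAMP WITHOUT TIME ZONE semantics (keep the wall-clock fields fixed). A minimal Python sketch of the difference, outside Spark; the date and zone offsets are illustrative, not taken from the thread:

```python
from datetime import datetime, timedelta, timezone

# One absolute instant, stored internally as UTC
# (analogous to Spark's current TIMESTAMP behavior).
instant = datetime(2017, 5, 30, 19, 0, tzinfo=timezone.utc)

# The same instant rendered under two different "session" time zones:
pacific = instant.astimezone(timezone(timedelta(hours=-7)))  # e.g. UTC-7
tokyo = instant.astimezone(timezone(timedelta(hours=9)))     # e.g. UTC+9

print(pacific.strftime("%Y-%m-%d %H:%M"))  # 2017-05-30 12:00
print(tokyo.strftime("%Y-%m-%d %H:%M"))    # 2017-05-31 04:00

# SQL-standard TIMESTAMP (WITHOUT TIME ZONE) would instead display the
# literal wall-clock fields "2017-05-30 19:00" in every session.
```

Under instant semantics the displayed value changes with the session time zone; under wall-clock semantics it does not, which is the compatibility question on the table.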
Thanks,
Kostas
On Tue, May 30, 2017 at 12:09 PM, Michael Armbrust
wrote:
From both this thread and the JDK one, I've noticed that people (myself included) have different notions of what compatibility is guaranteed between major and minor versions.
A simple question I have is: what compatibility can we break in a minor release vs. a major one? It might be worth getting on the same page.
Also, +1 on dropping jdk7 in Spark 2.0.
Kostas
On Mon, Mar 28, 2016 at 2:01 PM, Marcelo Vanzin wrote:
> Finally got some internal feedback on this, and we're ok with
> requiring people to deploy jdk8 for 2.0, so +1 too.
>
> On Mon, Mar 28, 2016 at 1:15 PM, Luciano Resende
wrote:
In addition, with Spark 2.0 we are throwing away binary compatibility anyway, so user applications will have to be recompiled.
The only argument I can see is for libraries that have already been built
on Scala 2.10 that are no longer being maintained. How big of an issue do
we think that is?
If an argument here is the ongoing build/maintenance burden, I think we should seriously consider dropping Scala 2.10 in Spark 2.0. Supporting Scala 2.10 is a bigger build/infrastructure burden than supporting JDK 7, since you actually have to build different artifacts and test them, whereas you can
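The "different artifacts" burden referred to here is Scala cross-building; a minimal build.sbt sketch of what supporting both Scala lines entails (version numbers illustrative):

```scala
// Cross-build against both Scala lines; `sbt +package` then
// produces (and must test) one binary artifact per Scala version.
scalaVersion := "2.11.8"
crossScalaVersions := Seq("2.10.6", "2.11.8")
```

By contrast, a single JVM artifact can run on both JDK 7 and JDK 8, so supporting an extra JDK does not multiply the build matrix the same way.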
Hello all,
I'd like to close out the discussion on SPARK-13843 by polling the community on which components we should seriously consider re-adding to Apache Spark. For reference, here are the modules that were removed as part of SPARK-13843 and pushed to:
I'd also like to make it a requirement that Spark 2.0 have stable DataFrame and Dataset APIs - we should not leave these APIs experimental in the 2.0 release. We already know of at least one breaking change we need to make to DataFrames; now's the time to make any other changes we need to
APIs stabilized will be very beneficial. This might make Spark 1.7 a lighter release, but that is not necessarily a bad thing.
Any thoughts on this timeline?
Kostas Sakellis
On Thu, Nov 12, 2015 at 8:39 PM, Cheng, Hao <hao.ch...@intel.com> wrote:
> Agree, more features/apis/optimiza
I know we want to keep breaking changes to a minimum, but I'm hoping that with Spark 2.0 we can also look at better classpath isolation for user programs. I propose we build on spark.{driver|executor}.userClassPathFirst, setting it to true by default, and not allowing any Spark transitive dependencies
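For reference, these flags already exist as opt-in settings; the proposal amounts to flipping defaults like the following in spark-defaults.conf (a sketch of the existing knobs the proposal builds on, not of the full isolation scheme):

```
# Prefer classes from the user's jars over Spark's own copies when
# resolving classes on the driver and on the executors.
spark.driver.userClassPathFirst    true
spark.executor.userClassPathFirst  true
```

Today both default to false, so a user jar that bundles a conflicting version of one of Spark's transitive dependencies can collide with Spark's copy on the classpath.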
+1 on a lightweight 2.0
What is the thinking around the 1.x line after Spark 2.0 is released? If
not terminated, how will we determine what goes into each major version
line? Will 1.x only be for stability fixes?
Thanks,
Kostas
On Tue, Nov 10, 2015 at 3:41 PM, Patrick Wendell
wrote: