i have been using spark 2.0 snapshots with some libraries built for spark
1.x so far (simply because it worked). in the last few days i noticed this
new error:

[error] Uncaught exception when running
com.tresata.spark.sql.fieldsapi.FieldsApiSpec: java.lang.AbstractMethodError
sbt.ForkMain$ForkError: java.lang.AbstractMethodError: null
    at org.apache.spark.Logging$class.log(Logging.scala:46)
    at
com.tresata.spark.sorted.PairRDDFunctions.log(PairRDDFunctions.scala:13)

so it seems spark made a binary-incompatible change to its Logging trait.
i do not think spark 2.0 is trying to maintain binary compatibility with
1.x, so i assume this is a non-issue, but just in case the assumptions are
different (or incompatibilities are being actively minimized) i wanted to
point it out.
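for context, the binary coupling comes from library classes mixing in spark's Logging trait (the stack trace shows PairRDDFunctions calling Logging$class.log). one way a library can avoid breaking across spark versions is to ship its own minimal logging trait instead. a rough sketch below, assuming nothing about spark itself: the trait name LibLogging and the demo class are made up, and java.util.logging is used only to keep the example dependency-free (spark's own Logging is backed by slf4j):

```scala
import java.util.logging.{Level, Logger}

// hypothetical library-local logging trait: no dependency on
// org.apache.spark.Logging, so spark's internal changes cannot
// cause an AbstractMethodError in classes that mix this in
trait LibLogging {
  @transient private lazy val logger: Logger =
    Logger.getLogger(getClass.getName)

  // by-name msg so the string is only built when the level is enabled
  protected def logInfo(msg: => String): Unit =
    if (logger.isLoggable(Level.INFO)) logger.info(msg)

  protected def logWarning(msg: => String): Unit =
    if (logger.isLoggable(Level.WARNING)) logger.warning(msg)
}

// stand-in for a library class that previously extended spark's Logging
class PairFunctionsDemo extends LibLogging {
  def run(): String = {
    logInfo("using library-local logging, not spark internals")
    "ok"
  }
}

object Demo extends App {
  println(new PairFunctionsDemo().run())
}
```

the trade-off is that the library's log output no longer routes through spark's logging configuration automatically; wiring the trait to slf4j instead would restore that at the cost of one (stable) dependency.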
