Hi
I use Gradle, and I don't think it really has a "provided" scope, but I was able to Google
around and create the following file. The same error still persists.
group 'com.company'
version '1.0-SNAPSHOT'

apply plugin: 'java'
apply plugin: 'idea'

repositories {
    mavenCentral()
    mavenLocal()
}

configurations {
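For what it's worth, a common way to emulate Maven's "provided" scope in Gradle is a custom configuration that is added to the compile classpath but excluded from the packaged runtime dependencies. The sketch below is illustrative only (the artifact versions are placeholders, not from this thread); on Gradle 2.12+ the built-in `compileOnly` configuration achieves the same thing directly.

```groovy
// Hypothetical sketch: emulate Maven's "provided" scope.
// Spark jars end up on the compile classpath but are not bundled
// into the application jar, so the cluster's own Spark is used at runtime.
configurations {
    provided
}

sourceSets {
    main { compileClasspath += configurations.provided }
}

dependencies {
    // Versions are illustrative; match them to the Spark on your cluster.
    provided 'org.apache.spark:spark-core_2.11:2.0.1'
    provided 'org.apache.spark:spark-sql_2.11:2.0.1'
}
```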
Seems the runtime Spark is different from the compiled one. You should
mark the Spark components "provided". See
https://issues.apache.org/jira/browse/SPARK-9219
On Sun, Oct 9, 2016 at 8:13 PM, kant kodali wrote:
>
> I tried SpanBy, but it looks like there is a strange error happening no
> matter which way I try.
Hi Reynold,
Actually, I did that well before posting my question here.
Thanks,
kant
On Sun, Oct 9, 2016 8:48 PM, Reynold Xin r...@databricks.com
wrote:
You should probably check with DataStax, who build the Cassandra connector for
Spark.
On Sun, Oct 9, 2016 at 8:13 PM, kant kodali wrote:
>
> I tried SpanBy, but it looks like there is a strange error happening no
> matter which way I try, like the one described here for the Java solution.
>
> http://qaoverflow.com/question/how-to-use-spanby-in-java/
I tried SpanBy, but it looks like there is a strange error happening no matter
which way I try, like the one described here for the Java solution.
http://qaoverflow.com/question/how-to-use-spanby-in-java/
java.lang.ClassCastException: cannot assign instance of
scala.collection.immutable.List$Serializ