Hi Guillermo,

Which version of Spark are you using? Starting with version 2.0, Spark is
built with Scala 2.11 by default. If you are using a prior version (which
looks like it's the case, since your error message mentions Scala 2.10), you
might need to build it yourself from source with Scala 2.11 support, or
upgrade your Spark cluster to 2.x.
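A quick way to confirm, as a sketch: run this inside spark-shell, where it reports the Scala version the running Spark build ships with (outside spark-shell it just reports the local Scala version).

```scala
// Prints the Scala version of the running JVM process. Launched inside
// spark-shell, this tells you which Scala version your Spark build was
// compiled against; it should match the _2.11 suffix of your artifacts.
println(scala.util.Properties.versionString)
```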

Cheers,

Christophe


On 26 March 2018 at 09:11, Guillermo Ortiz <konstt2...@gmail.com> wrote:

> Hello,
>
> I'm working with UDT's and spark connector with these dependencies:
>
> <scala.version>2.11.12</scala.version>
> <spark.version>2.0.2</spark.version>
> <cassandra-conector.version>2.0.7</cassandra-conector.version>
> <cassandra-driver.version>3.4.0</cassandra-driver.version>
>
>
> <dependency>
>   <groupId>org.apache.spark</groupId>
>   <artifactId>spark-core_2.11</artifactId>
>   <version>${spark.version}</version>
> </dependency>
>
> <dependency>
>   <groupId>org.apache.spark</groupId>
>   <artifactId>spark-streaming_2.11</artifactId>
>   <version>${spark.version}</version>
> </dependency>
>
> <dependency>
>   <groupId>com.datastax.spark</groupId>
>   <artifactId>spark-cassandra-connector_2.11</artifactId>
>   <version>${cassandra-conector.version}</version>
> </dependency>
>
> <dependency>
>   <groupId>com.datastax.cassandra</groupId>
>   <artifactId>cassandra-driver-core</artifactId>
>   <version>${cassandra-driver.version}</version>
> </dependency>
>
> So, with these dependencies I'm using Scala 2.11, but I get this error: *the
> GettableToMappedTypeConverter which can't deserialize TypeTags due
> to Scala 2.10 TypeTag limitation. They come back as nulls and therefore
> you see this NPE.*
>
> Why do I get this error if I'm using Scala 2.11? I want to read a
> map<int, frozen<MyUDT>> from Spark with the connector. The problem is
> that it fails if any field of the UDT is not set.
>
> If all fields are set, it works.
>
>
>
> Exception in thread "main" org.apache.spark.SparkException: Job aborted
> due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent
> failure: Lost task 0.0 in stage 0.0 (TID 0, localhost):
> java.lang.NullPointerException:
> Requested a TypeTag of the GettableToMappedTypeConverter which can't
> deserialize TypeTags due to Scala 2.10 TypeTag limitation. They come back
> as nulls and therefore you see this NPE.
>     at com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter.targetTypeTag(GettableDataToMappedTypeConverter.scala:34)
>     at com.datastax.spark.connector.types.TypeConverter$AbstractMapConverter.valueTypeTag(TypeConverter.scala:707)
>     at com.datastax.spark.connector.types.TypeConverter$MapConverter$$typecreator45$1.apply(TypeConverter.scala:791)
>     at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:232)
>     at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:232)
>     at com.datastax.spark.connector.types.TypeConverter$class.targetTypeName(TypeConverter.scala:36)
>     at com.datastax.spark.connector.types.TypeConverter$CollectionConverter.targetTypeName(TypeConverter.scala:682)
>     at com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter.tryConvert(GettableDataToMappedTypeConverter.scala:156)
>
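For the unset-field behaviour described above, one workaround worth trying, as a sketch: declare UDT fields that may be unset as Option[...] in the target case class, so the connector can map missing values to None instead of null. The case class, field names, and keyspace/table below are placeholders, not taken from the actual schema.

```scala
// Placeholder case class for the UDT: fields that may be unset in Cassandra
// are declared as Option, so an absent value maps to None rather than null.
case class MyUDT(id: Option[Int], name: Option[String])

// With the connector you would then read the column along the lines of:
//   sc.cassandraTable[(Int, Map[Int, MyUDT])]("my_keyspace", "my_table")
// (keyspace and table names are illustrative.)

// Plain-Scala illustration of a partially-set value:
val partial = MyUDT(id = Some(1), name = None)
println(partial.name.getOrElse("<not set>"))
```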



-- 

*Christophe Schmitz - **VP Consulting*

AU: +61 4 03751980 / FR: +33 7 82022899
