Hello,

I'm working with UDTs and the Spark Cassandra connector with these dependencies:

<scala.version>2.11.12</scala.version>
<spark.version>2.0.2</spark.version>
<cassandra-connector.version>2.0.7</cassandra-connector.version>
<cassandra-driver.version>3.4.0</cassandra-driver.version>


<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>${spark.version}</version>
</dependency>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.11</artifactId>
  <version>${spark.version}</version>
</dependency>

<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.11</artifactId>
  <version>${cassandra-connector.version}</version>
</dependency>

<dependency>
  <groupId>com.datastax.cassandra</groupId>
  <artifactId>cassandra-driver-core</artifactId>
  <version>${cassandra-driver.version}</version>
</dependency>

So with these dependencies I'm using Scala 2.11, but I get this error:

*Requested a TypeTag of the GettableToMappedTypeConverter which can't
deserialize TypeTags due to Scala 2.10 TypeTag limitation. They come back
as nulls and therefore you see this NPE.*

Why do I get this error if I'm using Scala 2.11? I want to read a
map<int, frozen<MyUDT>> column from Spark with the connector. The problem
is that if any field of the UDT is not set, the read fails.

If all the fields are set, it works.
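
For reference, here is a minimal sketch of the kind of read that fails. The keyspace, table, and UDT fields are placeholders for illustration, not my real schema:

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

// Hypothetical schema, for illustration only:
//   CREATE TYPE ks.my_udt (a int, b text);
//   CREATE TABLE ks.my_table (id int PRIMARY KEY, m map<int, frozen<my_udt>>);

// Option fields because some UDT fields may be left unset (null).
case class MyUDT(a: Option[Int], b: Option[String])
case class MyRow(id: Int, m: Map[Int, MyUDT])

object UdtMapRead {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("udt-map-read")
      .set("spark.cassandra.connection.host", "127.0.0.1") // assumed host
    val sc = new SparkContext(conf)

    // Works when every UDT field inside the map values is set,
    // but throws the NPE shown below as soon as any field is unset.
    sc.cassandraTable[MyRow]("ks", "my_table")
      .collect()
      .foreach(println)

    sc.stop()
  }
}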



Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.lang.NullPointerException: Requested a TypeTag of the GettableToMappedTypeConverter which can't deserialize TypeTags due to Scala 2.10 TypeTag limitation. They come back as nulls and therefore you see this NPE.
    at com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter.targetTypeTag(GettableDataToMappedTypeConverter.scala:34)
    at com.datastax.spark.connector.types.TypeConverter$AbstractMapConverter.valueTypeTag(TypeConverter.scala:707)
    at com.datastax.spark.connector.types.TypeConverter$MapConverter$$typecreator45$1.apply(TypeConverter.scala:791)
    at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:232)
    at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:232)
    at com.datastax.spark.connector.types.TypeConverter$class.targetTypeName(TypeConverter.scala:36)
    at com.datastax.spark.connector.types.TypeConverter$CollectionConverter.targetTypeName(TypeConverter.scala:682)
    at com.datastax.spark.connector.rdd.reader.GettableDataToMappedTypeConverter.tryConvert(GettableDataToMappedTypeConverter.scala:156)
