I'm using the Spark-Cassandra driver, and I would like to know whether there is
an easy way to measure, inside Spark code, how long a "joinWithCassandraTable"
takes to execute.
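One option is a small timing wrapper around the call. The sketch below is ours, not a connector API; note that joinWithCassandraTable is lazy, so an action (collect, count, ...) has to run inside the timed block for the measurement to mean anything. The connector call itself needs a live cluster, so it is shown as a comment.

```scala
// Hypothetical helper: times an arbitrary block and prints the elapsed time.
def timed[T](label: String)(block: => T): T = {
  val t0 = System.nanoTime()
  val result = block // the action must execute here, not lazily later
  val elapsedMs = (System.nanoTime() - t0) / 1e6
  println(f"$label took $elapsedMs%.1f ms")
  result
}

// Usage with the connector (assumed names; requires a running Cassandra):
// import com.datastax.spark.connector._
// val joined = timed("joinWithCassandraTable") {
//   rdd.joinWithCassandraTable("keyspace", "table").count()
// }
```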
The problem is that I have a method from Spark to insert into Cassandra:
childrenToInsertToParent.saveToCassandra("keyspace", "table",
SomeColumns("a","b","c","d"))
I have to pass a sequence of strings to say which columns I want to save. I
would like to do it dynamically, to do that I have to save t
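One way to build the column list dynamically is to collect the names into a Seq and splat it into SomeColumns, since SomeColumns accepts varargs of column references. A minimal sketch, with the connector call commented out because it needs a live cluster (the filter condition is just an example):

```scala
// Build the list of column names at runtime (pure Scala).
val allColumns = Seq("a", "b", "c", "d")
val wanted: Seq[String] = allColumns.filter(_ != "d") // e.g. drop one column

// Then pass it to saveToCassandra (assumed RDD name, needs Cassandra):
// import com.datastax.spark.connector._
// childrenToInsertToParent.saveToCassandra("keyspace", "table",
//   SomeColumns(wanted.map(ColumnName(_)): _*))
```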
Hello,
I'm working with UDTs and the Spark connector, with these dependencies
(version properties from the pom: 2.11.12, 2.0.2, 2.0.7, 3.4.0):

org.apache.spark : spark-core_2.11 : ${spark.version}
org.apache.spark : spark-streaming_2.11 : ${spark.version}
com.datastax.spark : spark-cassandra-connector_2.11 : ${cassandra-conector.version}
I'm trying to get a few columns from a Cassandra table from Spark and put
them in a case class. If I select all the columns that are in my case
class, it works. But I only want to bring back a few of them, and I don't
want a specific case class for each case.
I tried to overload the constructor in the case clas
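An alternative to overloading the constructor is to read only the selected columns as a tuple and then map the tuple into the case class, which avoids needing one case class per column subset. A sketch with assumed names; the connector read is commented out since it needs a cluster, and the pure-Scala part below illustrates the same mapping step:

```scala
// Hypothetical case class for the columns we actually want.
case class Child(a: String, b: Int)

// With the connector: read just the selected columns as tuples, then map.
// import com.datastax.spark.connector._
// val children = sc.cassandraTable[(String, Int)]("keyspace", "table")
//   .select("a", "b")
//   .map { case (a, b) => Child(a, b) }

// Pure-Scala illustration of the same tuple-to-case-class mapping:
val rows = Seq(("x", 1), ("y", 2))
val children = rows.map { case (a, b) => Child(a, b) }
```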