I mentioned this in the RC2 note for Spark 2.2.0, and I'm seeing it now
on the official release. The Spark Cassandra Connector (SCC) integration
tests now fail whenever anything involves the CassandraSource being
transformed into a DataSourceScanExec SparkPlan.

https://github.com/apache/spark/blob/v2.2.0/sql/core/src/main/scala/org/apache/spark/sql/execution/DataSourceScanExec.scala#L70

Utils.redact(SparkSession.getActiveSession.get.sparkContext.conf, text)
This leads to a None.get (full exception below).
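
For context, the enclosing method at that line is just (modulo formatting,
from the v2.2.0 source linked above):

private def redact(text: String): String = {
  Utils.redact(SparkSession.getActiveSession.get.sparkContext.conf, text)
}

so merely rendering the plan's simpleString trips it on any thread where
getActiveSession is None, as the stack trace below shows.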

This only seems to reproduce when I run from within sbt; running through
the IntelliJ ScalaTest runner works fine on the same code. This makes me
think that something about how sbt loads the

org.apache.spark.scheduler.DAGSchedulerEventProcessLoop

class keeps its event loop thread from seeing the active session via
getActiveSession. A sketch of what I mean follows.
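
To illustrate what I suspect (a minimal sketch, not SCC code; the object
and app names are made up): the active session is backed by an
InheritableThreadLocal, which is only copied into threads created after
it is set, so a thread that predates the session, like the scheduler's
event loop thread, sees None:

import java.util.concurrent.Executors

import org.apache.spark.sql.SparkSession

object ActiveSessionTiming {
  def main(args: Array[String]): Unit = {
    // Force a pool to spin up its single worker thread *before* any
    // session exists, as a stand-in for the DAGScheduler event loop
    // thread, which is also spawned during SparkContext construction.
    val pool = Executors.newSingleThreadExecutor()
    pool.submit(new Runnable { def run(): Unit = () }).get()

    val spark = SparkSession.builder()
      .master("local[2]")
      .appName("active-session-timing")
      .getOrCreate()
    SparkSession.setActiveSession(spark)

    // Set on this thread, so this prints Some(...)
    println(SparkSession.getActiveSession)

    // The worker predates the session, so the inheritable thread-local
    // was never copied to it: this prints None, and calling .get on it
    // raises the same java.util.NoSuchElementException: None.get
    pool.submit(new Runnable {
      def run(): Unit = println(SparkSession.getActiveSession)
    }).get()

    pool.shutdown()
    spark.stop()
  }
}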

I'm wondering if anyone else has run into this and found a workaround. I
saw a similar report posted at the end of this ticket:

https://issues.apache.org/jira/browse/SPARK-16599?focusedCommentId=16038185&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-16038185

I have tried setting the active session explicitly, but that doesn't seem
to reach the thread which ends up calling getActiveSession.
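
Concretely, this is roughly what I tried (simplified; the keyspace and
table names are just placeholders from my test setup):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[2]")
  .appName("scc-integration")
  .getOrCreate()

// Pin the session on the test thread before touching the source.
SparkSession.setActiveSession(spark)

// Still fails: as far as I can tell the None.get is raised on the
// scheduler's event loop thread, not on this one, so the thread-local
// set above never reaches it.
spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("table" -> "kv", "keyspace" -> "test"))
  .load()
  .show()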


Thanks for your time,
Russ

The failure is

java.util.NoSuchElementException: None.get
[info] at scala.None$.get(Option.scala:347)
[info] at scala.None$.get(Option.scala:345)
[info] at org.apache.spark.sql.execution.DataSourceScanExec$class.org$apache$spark$sql$execution$DataSourceScanExec$$redact(DataSourceScanExec.scala:70)
[info] at org.apache.spark.sql.execution.DataSourceScanExec$$anonfun$4.apply(DataSourceScanExec.scala:54)
[info] at org.apache.spark.sql.execution.DataSourceScanExec$$anonfun$4.apply(DataSourceScanExec.scala:52)
[info] at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
[info] at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
[info] at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
[info] at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
[info] at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
[info] at scala.collection.AbstractTraversable.map(Traversable.scala:104)
[info] at org.apache.spark.sql.execution.DataSourceScanExec$class.simpleString(DataSourceScanExec.scala:52)
[info] at org.apache.spark.sql.execution.RowDataSourceScanExec.simpleString(DataSourceScanExec.scala:75)
[info] at org.apache.spark.sql.catalyst.plans.QueryPlan.verboseString(QueryPlan.scala:349)
...

I can post the full exception, with the serialization attempt and the
ScalaTest wrapping, if anyone wants to see it, but it is quite long.
