Hello, 

I have a weird issue: this query works fine:
sqlContext.sql("SELECT customer_id FROM transactions WHERE purchaseamount >= 200").count()

However, when I cache the table before running the same query:
sqlContext.cacheTable("transactions")
sqlContext.sql("SELECT customer_id FROM transactions WHERE purchaseamount >= 200").count()

I get an exception on one of the tasks:
: org.apache.spark.SparkException: Job aborted due to stage failure: Task
120 in stage 104.0 failed 4 times, most recent failure: Lost task 120.3 in
stage 104.0 (TID 20537, spark-w-0.c.internal): java.lang.ClassCastException: 

(There are no details after the ':'.)

Any idea what could be wrong?

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-Exception-only-when-using-cacheTable-tp16031.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
