in all 2.x versions.
>
> Regards,
> Vinayak Joshi
-
Liang-Chi Hsieh | @viirya
Spark Technology Center
http://www.spark.tc/
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/Spark-SQL-Dataframe-resulting-from-an-except-is-unusable-tp20802p20812.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
With Spark 2.x, I construct a Dataframe from a sample libsvm file:
scala> val higgsDF = spark.read.format("libsvm").load("higgs.libsvm")
higgsDF: org.apache.spark.sql.DataFrame = [label: double, features: vector]
Then, I build a new DataFrame that involves an except():
scala> val train_df =
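The snippet above is truncated at the except() call. As a minimal sketch of the pattern being described — the split and the name test_df are assumptions, not taken from the thread:

```scala
// Hypothetical continuation of the truncated snippet above.
// test_df and the 0.2/0.8 split are illustrative assumptions.
val Array(test_df, rest) = higgsDF.randomSplit(Array(0.2, 0.8))

// except() keeps the rows of higgsDF that do not appear in test_df.
val train_df = higgsDF.except(test_df)

// Per the thread subject, actions on the resulting DataFrame
// (e.g. count() or show()) are where the problem reportedly appears.
train_df.count()
```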