Re: getting error: value toDF is not a member of Seq[columns]

2018-09-06 Thread Mich Talebzadeh
Ok, somehow this worked!
// Save prices to MongoDB collection
val document = sparkContext.parallelize((1 to 1).map(i =>
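A minimal sketch of what that working line likely expands to, assuming the MongoDB Spark connector (com.mongodb.spark.MongoSpark) is on the classpath and spark.mongodb.output.uri points at the target collection; the URI and field values are placeholders, not taken from the thread:

import org.apache.spark.sql.SparkSession
import com.mongodb.spark.MongoSpark

// Case class at top level so an Encoder can be derived for it
case class columns(KEY: String, TICKER: String, TIMEISSUED: String, PRICE: Float)

object SavePrices {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("save prices")
      .config("spark.mongodb.output.uri", "mongodb://localhost:27017/test.prices") // hypothetical URI
      .getOrCreate()
    import spark.implicits._

    // Placeholder values standing in for the fields parsed from the stream
    val (key, ticker, timeissued, price) = ("key", "ticker", "timeissued", 1.23f)

    // Build a one-row DataFrame the same way as in the message: parallelize a single element
    val document = spark.sparkContext
      .parallelize((1 to 1).map(_ => columns(key, ticker, timeissued, price)))
      .toDF()

    // Append the row to the MongoDB collection configured above
    MongoSpark.save(document.write.mode("append"))
  }
}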

Re: getting error: value toDF is not a member of Seq[columns]

2018-09-06 Thread Mich Talebzadeh
Thanks. If you define the columns class as below:
scala> case class columns(KEY: String, TICKER: String, TIMEISSUED: String, PRICE: Double)
defined class columns
scala> var df = Seq(columns("key", "ticker", "timeissued", 1.23f)).toDF
df: org.apache.spark.sql.DataFrame = [KEY: string, TICKER: string
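For what it's worth, the Double variant compiles with a Float literal because of numeric widening; a small spark-shell sketch with illustrative values:

// PRICE is declared Double, and the 1.23f literal is accepted because
// Scala widens Float to Double at the call site.
case class columns(KEY: String, TICKER: String, TIMEISSUED: String, PRICE: Double)

import spark.implicits._
val df = Seq(columns("key", "ticker", "timeissued", 1.23f)).toDF()
df.printSchema()   // PRICE comes out as double in the schema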

Re: getting error: value toDF is not a member of Seq[columns]

2018-09-06 Thread Jungtaek Lim
This code works with Spark 2.3.0 via spark-shell.
scala> case class columns(KEY: String, TICKER: String, TIMEISSUED: String, PRICE: Float)
defined class columns
scala> import spark.implicits._
import spark.implicits._
scala> var df = Seq(columns("key", "ticker", "timeissued", 1.23f)).toDF
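The same working path as a self-contained spark-shell sketch (the shell already provides the spark session; the literal values are only illustrative):

// Define the case class, bring the session implicits into scope, then convert.
// It is spark.implicits._ that adds toDF to Seq[columns] by supplying an Encoder.
case class columns(KEY: String, TICKER: String, TIMEISSUED: String, PRICE: Float)

import spark.implicits._

val df = Seq(columns("key", "ticker", "timeissued", 1.23f)).toDF()
df.printSchema()
df.show()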

Re: getting error: value toDF is not a member of Seq[columns]

2018-09-06 Thread Mich Talebzadeh
I am trying to understand why Spark cannot convert simple comma-separated columns to a DF. As a test, I took one line of the printed output and stored it as a one-line CSV file, as below:
var allInOne = key + "," + ticker + "," + timeissued + "," + price
println(allInOne)
cat crap.csv
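A sketch of reading that one-line file back as a DataFrame; the file name follows the message, while the explicit schema and column names are assumptions (spark-shell assumed):

// Read the single-line CSV with an explicit schema instead of relying on inference.
import org.apache.spark.sql.types._

val schema = StructType(Seq(
  StructField("KEY", StringType),
  StructField("TICKER", StringType),
  StructField("TIMEISSUED", StringType),
  StructField("PRICE", FloatType)))

val csvDF = spark.read.schema(schema).csv("crap.csv")
csvDF.show(false)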

Re: getting error: value toDF is not a member of Seq[columns]

2018-09-05 Thread Manu Zhang
Have you tried adding an Encoder for columns, as suggested by Jungtaek Lim? On Thu, Sep 6, 2018 at 6:24 AM Mich Talebzadeh wrote:
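A sketch of what supplying the Encoder explicitly could look like; normally import spark.implicits._ derives it for a case class, but Encoders.product makes the requirement visible (spark-shell assumed, values illustrative):

// An explicit Encoder for the case class; createDataset picks it up implicitly.
import org.apache.spark.sql.{Encoder, Encoders}

case class columns(KEY: String, TICKER: String, TIMEISSUED: String, PRICE: Float)

implicit val columnsEncoder: Encoder[columns] = Encoders.product[columns]

val ds = spark.createDataset(Seq(columns("key", "ticker", "timeissued", 1.23f)))
val df = ds.toDF()
df.show()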

Re: getting error: value toDF is not a member of Seq[columns]

2018-09-05 Thread Mich Talebzadeh
Dr Mich Talebzadeh LinkedIn https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw http://talebzadehmich.wordpress.com Disclaimer: Use it at your own risk. Any and all

Re: getting error: value toDF is not a member of Seq[columns]

2018-09-05 Thread Mich Talebzadeh
Yep, already tried it and it did not work. Thanks. Dr Mich Talebzadeh LinkedIn https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw http://talebzadehmich.wordpress.com

Re: getting error: value toDF is not a member of Seq[columns]

2018-09-05 Thread Deepak Sharma
Try this:
import spark.implicits._
df.toDF()
On Wed, Sep 5, 2018 at 2:31 PM Mich Talebzadeh wrote: > With the following > case class columns(KEY: String, TICKER: String, TIMEISSUED: String, PRICE: Float) > var key = line._2.split(',').view(0).toString > var ticker =
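One point worth noting about this suggestion: the implicits have to be imported from a concrete SparkSession value (a stable identifier), not from a class or package. A small sketch with illustrative names:

// The import must reference a val holding the SparkSession; importing from a
// var or from the SparkSession type will not bring the toDF conversion into scope.
import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession.builder().appName("implicits demo").getOrCreate()
import spark.implicits._   // imported from the value, not from a class or package

case class columns(KEY: String, TICKER: String, TIMEISSUED: String, PRICE: Float)
val df = Seq(columns("key", "ticker", "timeissued", 1.23f)).toDF()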

Re: getting error: value toDF is not a member of Seq[columns]

2018-09-05 Thread Mich Talebzadeh
With the following
case class columns(KEY: String, TICKER: String, TIMEISSUED: String, PRICE: Float)
var key = line._2.split(',').view(0).toString
var ticker = line._2.split(',').view(1).toString
var timeissued = line._2.split(',').view(2).toString
var price =
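A sketch of the same parsing done once per record, assuming line is a (key, value) pair from the stream and the value is "KEY,TICKER,TIMEISSUED,PRICE"; the helper name is hypothetical:

// Split the value once and convert the price, instead of re-splitting for each field.
case class columns(KEY: String, TICKER: String, TIMEISSUED: String, PRICE: Float)

def parseLine(line: (String, String)): columns = {   // hypothetical helper
  val fields = line._2.split(',')
  columns(fields(0), fields(1), fields(2), fields(3).toFloat)
}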

Re: getting error: value toDF is not a member of Seq[columns]

2018-09-05 Thread Mich Talebzadeh
Thanks! The Spark version is 2.3.0. Dr Mich Talebzadeh LinkedIn https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw http://talebzadehmich.wordpress.com Disclaimer: Use it

Re: getting error: value toDF is not a member of Seq[columns]

2018-09-05 Thread Jungtaek Lim
You may also find the link below useful (though it looks fairly old). A case class is exactly the kind of type for which an Encoder is available, so there may be another reason preventing the implicit conversion.

Re: getting error: value toDF is not a member of Seq[columns]

2018-09-05 Thread Jungtaek Lim
Sorry, I guess I pasted the wrong method. The code is:
implicit def localSeqToDatasetHolder[T : Encoder](s: Seq[T]): DatasetHolder[T] = { DatasetHolder(_sqlContext.createDataset(s)) }
On Wed, Sep 5, 2018 at 5:30 PM, Jungtaek Lim wrote: > I guess you need to have an encoder for the type of result for
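To make the mechanism concrete, the quoted implicit can also be called by hand; this is roughly what Seq(...).toDF expands to once spark.implicits._ is in scope (spark-shell, Spark 2.3.x assumed, values illustrative):

// Explicitly invoking the conversion that normally happens implicitly.
case class columns(KEY: String, TICKER: String, TIMEISSUED: String, PRICE: Float)

import spark.implicits._   // supplies the Encoder[columns] the conversion needs
val holder = spark.implicits.localSeqToDatasetHolder(Seq(columns("key", "ticker", "timeissued", 1.23f)))
val df = holder.toDF()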

Re: getting error: value toDF is not a member of Seq[columns]

2018-09-05 Thread Jungtaek Lim
I guess you need to have an Encoder for the result type of columns().
https://github.com/apache/spark/blob/2119e518d31331e65415e0f817a6f28ff18d2b42/sql/core/src/main/scala/org/apache/spark/sql/SQLImplicits.scala#L227-L229
implicit def rddToDatasetHolder[T : Encoder](rdd: RDD[T]):
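The RDD side of the same mechanism, as a small spark-shell sketch (Spark 2.3.x assumed): with the session implicits in scope, toDF appears on RDD[columns] because an Encoder for the case class can be derived.

// rddToDatasetHolder kicks in here: the RDD of case-class rows gains toDF.
case class columns(KEY: String, TICKER: String, TIMEISSUED: String, PRICE: Float)

import spark.implicits._
val rdd = spark.sparkContext.parallelize(Seq(columns("key", "ticker", "timeissued", 1.23f)))
val df  = rdd.toDF()
df.show()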

Re: getting error: value toDF is not a member of Seq[columns]

2018-09-05 Thread Mich Talebzadeh
Thanks, I already do that, as below:
val sqlContext = new org.apache.spark.sql.SQLContext(sparkContext)
import sqlContext.implicits._
but I am still getting the error! Dr Mich Talebzadeh LinkedIn https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
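In Spark 2.x the implicits are usually taken from the SparkSession rather than a hand-built SQLContext, and the case class has to be visible at the top level (not declared inside a method or closure) for the Encoder to be derived. One possible shape, with illustrative names:

import org.apache.spark.sql.SparkSession

// Top-level case class so the implicit Encoder (and its TypeTag) can be found.
case class columns(KEY: String, TICKER: String, TIMEISSUED: String, PRICE: Float)

object ToDfExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("toDF example").getOrCreate()
    import spark.implicits._   // stable identifier, so the conversions resolve

    val df = Seq(columns("key", "ticker", "timeissued", 1.23f)).toDF()
    df.show()
  }
}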

Re: getting error: value toDF is not a member of Seq[columns]

2018-09-05 Thread Jungtaek Lim
You may need to import the implicits from your Spark session, like below (the code below is borrowed from https://spark.apache.org/docs/latest/sql-programming-guide.html):
import org.apache.spark.sql.SparkSession
val spark = SparkSession
  .builder()
  .appName("Spark SQL basic example")
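The snippet is cut off; for reference, the pattern from the programming guide continues roughly like this (the config option shown is the guide's own placeholder):

import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("Spark SQL basic example")
  .config("spark.some.config.option", "some-value")
  .getOrCreate()

// For implicit conversions like converting RDDs and Seqs to DataFrames
import spark.implicits._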

getting error: value toDF is not a member of Seq[columns]

2018-09-05 Thread Mich Talebzadeh
Hi, I have a Spark Streaming job that sends data, and I need to put that data into MongoDB for test purposes. The easiest way is to create a DF from the individual list of columns as below. I loop over the individual rows in the RDD and perform the following: case class columns(KEY: String, TICKER: String,
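A hedged end-to-end sketch of the flow being described: parse each comma-separated record into the case class inside foreachRDD, convert it to a DataFrame, and append it to MongoDB. The DStream element type, the connector call, and the configuration are assumptions; only the toDF-then-save pattern is the point.

// Assumes the MongoDB Spark connector is on the classpath and
// spark.mongodb.output.uri is set on the SparkSession configuration.
import com.mongodb.spark.MongoSpark
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.dstream.DStream

// Top-level case class so the Encoder needed by toDF can be derived.
case class columns(KEY: String, TICKER: String, TIMEISSUED: String, PRICE: Float)

object StreamToMongo {
  // `lines` is assumed to be a (key, value) DStream, e.g. from Kafka,
  // where the value is "KEY,TICKER,TIMEISSUED,PRICE".
  def process(spark: SparkSession, lines: DStream[(String, String)]): Unit = {
    import spark.implicits._
    lines.foreachRDD { rdd =>
      if (!rdd.isEmpty()) {
        val df = rdd.map { case (_, value) =>
          val f = value.split(',')
          columns(f(0), f(1), f(2), f(3).toFloat)
        }.toDF()
        // Append the micro-batch to the configured MongoDB collection
        MongoSpark.save(df.write.mode("append"))
      }
    }
  }
}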