Hi all,

I'm trying to use Spark SQL to store data in a Parquet file. I create the
file and insert data into it with the following code:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    // Person is defined elsewhere, something like: case class Person(name: String, age: Int)
    val conf = new SparkConf().setAppName("MCT").setMaster("local[2]")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    import sqlContext._

    // create the Parquet file and register it as a table
    val personParquet = createParquetFile[Person]("people_1.pqt")
    personParquet.registerAsTable("people")

    // insert some rows and read them back
    val data = sc.parallelize(Seq(Person("Toto", 10), Person("foo", 101)))
    data.insertInto("people")
    personParquet.collect().foreach(println)

    data.insertInto("people")
    val personParquet2 = parquetFile("people_1.pqt")
    personParquet2.collect().foreach(println)


It works as I expect when I run it in spark-shell. But with a standalone
application, I get a build error:

    MCT.scala:18: not found: value createParquetFile

If I skip this creation step and save the RDD as a Parquet file directly, it
works. But then, when I insert new data, nothing happens.
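
To make that concrete, this is roughly what I do in that case (a sketch; it
reuses sc and sqlContext from the snippet above, and the file name
"people_2.pqt" and the extra Person("bar", 42) row are just for illustration):

    // write the RDD directly as a Parquet file instead of using createParquetFile
    val data = sc.parallelize(Seq(Person("Toto", 10), Person("foo", 101)))
    data.saveAsParquetFile("people_2.pqt")   // this part works

    val people = parquetFile("people_2.pqt")
    people.registerAsTable("people2")
    people.collect().foreach(println)        // the saved rows come back fine

    // but inserting more rows afterwards doesn't seem to add anything
    val more = sc.parallelize(Seq(Person("bar", 42)))
    more.insertInto("people2")
    parquetFile("people_2.pqt").collect().foreach(println)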

What am I doing wrong?

Best regards,


Jaonary
