S3 table to spark sql

2014-11-11 Thread Franco Barrientos
How can I create a date field in Spark SQL? I have an S3 table and I load it into an RDD.

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.createSchemaRDD
case class trx_u3m(id: String, local: String, fechau3m: String, rubro: Int, sku: String, unidades: Double,
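[A minimal sketch of one way to go from an S3 text file to a queryable Spark SQL table, in the Spark 1.x style used in this thread. The s3n:// path, the comma-separated layout, and the trimmed field list are assumptions for illustration; the original case class has more fields.]

import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc)
import sqlContext.createSchemaRDD

// Trimmed stand-in for the thread's case class (the real one has more fields).
case class TrxRow(id: String, local: String, fechau3m: String)

// Assumed S3 location and comma-separated layout.
val raw = sc.textFile("s3n://my-bucket/trx_u3m/")
val trx = raw.map(_.split(",")).map(f => TrxRow(f(0), f(1), f(2)))

trx.registerTempTable("trx_u3m")
sqlContext.sql("SELECT id, fechau3m FROM trx_u3m LIMIT 10").collect().foreach(println)

[At this point fechau3m is still a plain string; turning it into a real date column is what the reply below addresses.]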

Re: S3 table to spark sql

2014-11-11 Thread Rishi Yadav
Simple Scala should work:

val date = new java.text.SimpleDateFormat("MMdd").parse(fechau3m)

Replace "MMdd" with whatever format fechau3m is actually in (SimpleDateFormat patterns are case-sensitive). If you want to do it at the case class level:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc) // HiveContext is always a good idea
import
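[The reply is truncated, so here is a sketch of the case-class-level approach it points at: parse the string into a java.sql.Timestamp before registering the table, so Spark SQL infers a timestamp column. The "yyyyMMdd" pattern, the trimmed field lists, and the table name are assumptions.]

import java.sql.Timestamp
import java.text.SimpleDateFormat

import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc) // HiveContext, as the reply suggests
import hiveContext.createSchemaRDD

// Trimmed stand-ins for the thread's case class, before and after parsing.
case class TrxRaw(id: String, fechau3m: String)
case class TrxDated(id: String, fechau3m: Timestamp)

val fmt = new SimpleDateFormat("yyyyMMdd") // assumed pattern; use fechau3m's real format

// Stand-in for the RDD loaded from S3 in the first message.
val raw = sc.parallelize(Seq(TrxRaw("1", "20141111")))
val dated = raw.map(t => TrxDated(t.id, new Timestamp(fmt.parse(t.fechau3m).getTime)))

dated.registerTempTable("trx_u3m_dated")
hiveContext.sql("SELECT id, fechau3m FROM trx_u3m_dated").collect().foreach(println)

[Once the column is a timestamp, date comparisons and Hive date functions can be used on it directly in SQL.]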