How can I create a date field in Spark SQL? I have an S3 table and I load it
into an RDD.

 

val sqlContext = new org.apache.spark.sql.SQLContext(sc)

import sqlContext.createSchemaRDD

 

case class trx_u3m(id: String, local: String, fechau3m: String, rubro: Int,
sku: String, unidades: Double, monto: Double)

 

val tabla = sc.textFile("s3n://exalitica.com/trx_u3m/trx_u3m.txt")
  .map(_.split(","))
  .map(p => trx_u3m(p(0).trim, p(1).trim, p(2).trim,
    p(3).trim.toInt, p(4).trim, p(5).trim.toDouble, p(6).trim.toDouble))

tabla.registerTempTable("trx_u3m")

 

Now my problem is: how can I transform the string variable (fechau3m) into a
date variable?
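One approach is to parse fechau3m into a java.sql.Date while building the RDD, before registering the table, so Spark SQL infers a date/timestamp column from the case class. Here is a minimal sketch; it assumes fechau3m looks like "2014-10-31" (adjust the SimpleDateFormat pattern to your actual data), and note that older Spark versions may only support java.sql.Timestamp rather than java.sql.Date in schema inference:

```scala
import java.sql.Date
import java.text.SimpleDateFormat

// Same fields as before, but fechau3m is now a Date instead of a String.
case class trx_u3m(id: String, local: String, fechau3m: Date, rubro: Int,
                   sku: String, unidades: Double, monto: Double)

// Assumed format "yyyy-MM-dd" -- change the pattern if your file differs.
def parseDate(s: String): Date = {
  val fmt = new SimpleDateFormat("yyyy-MM-dd")
  new Date(fmt.parse(s.trim).getTime)
}
```

Then in the map step, replace p(2).trim with parseDate(p(2)) and register the table as before; queries on trx_u3m can then compare and order fechau3m as a real date column.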

 

Franco Barrientos
Data Scientist

Málaga #115, Of. 1003, Las Condes.
Santiago, Chile.
(+562)-29699649
(+569)-76347893

franco.barrien...@exalitica.com
www.exalitica.com
