Given a Hive context, you can execute hiveContext.sql("describe TABLE_NAME"); the result gives you the names of the columns and their types.
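A minimal sketch of both approaches, assuming a Spark 1.x HiveContext and a hypothetical table name "my_table" (substitute your own); "describe" returns one row per column with (col_name, data_type, comment), and the schema is also available programmatically:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.hive.HiveContext

// sc is an existing SparkContext; "my_table" is a placeholder table name
val hiveContext = new HiveContext(sc)

// Option 1: run DESCRIBE and read the resulting rows
val described = hiveContext.sql("describe my_table").collect()
described.foreach { row =>
  println(s"column=${row.getString(0)} type=${row.getString(1)}")
}

// Option 2: get the schema programmatically via the table's StructType
val schema = hiveContext.table("my_table").schema
schema.fields.foreach(f => println(s"${f.name}: ${f.dataType}"))
```

Option 2 avoids string parsing: each StructField carries the column name and a typed DataType object rather than a text description.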
2015-02-04 21:47 GMT+01:00 nitinkak001 <nitinkak...@gmail.com>:
> I want to get a Hive table schema into Spark. Specifically, I want to
> get column name and type information. Is it possible to do it e.g. using
> JavaSchemaRDD or something else?
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-get-Hive-table-schema-using-Spark-SQL-or-otherwise-tp21501.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Re-How-to-get-Hive-table-schema-using-Spark-SQL-or-otherwise-tp21502.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.