Yes, you can write to Parquet tables. On Spark 1.0.2, all I had to do was
include parquet-hive-bundle-1.5.0.jar on my classpath.
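For reference, here is a minimal sketch of what that looks like in the shell.
The table and column names are made up for illustration; the SerDe and
input/output format classes are the ones parquet-hive-bundle provides (Hive
0.12, which Spark 1.0.x builds against, predates the STORED AS PARQUET
shorthand, so the classes have to be named explicitly):

// Start the shell with the bundle on the classpath, e.g.
//   bin/spark-shell --jars parquet-hive-bundle-1.5.0.jar
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)  // sc is the shell's SparkContext

// Declare a Parquet-backed table via the bundle's SerDe.
hiveContext.hql("""
  CREATE TABLE parquet_example (key INT, value STRING)
  ROW FORMAT SERDE 'parquet.hive.serde.ParquetHiveSerDe'
  STORED AS
    INPUTFORMAT 'parquet.hive.DeprecatedParquetInputFormat'
    OUTPUTFORMAT 'parquet.hive.DeprecatedParquetOutputFormat'
""")

// Write into it from an existing Hive table.
hiveContext.hql("INSERT OVERWRITE TABLE parquet_example SELECT key, value FROM src")

(On Spark 1.1 and later, hql() is deprecated in favor of hiveContext.sql(),
but the DDL itself is unchanged.)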

From: lyc <yanchen....@huawei.com>
Sent: Friday, August 15, 2014 7:30 PM
To: u...@spark.incubator.apache.org

Since SQLContext supports less SQL than Hive (if I understand correctly), I
plan to run most of my queries through hql. Is it possible, though, to create
tables as Parquet in hql? What kind of commands should I use? Thanks in
advance for any information.




