RE: Supporting Hive features in Spark SQL Thrift JDBC server

2015-03-03 Thread Cheng, Hao
Can you provide the detailed failure call stack? From: shahab [mailto:shahab.mok...@gmail.com] Sent: Tuesday, March 3, 2015 3:52 PM To: user@spark.apache.org Subject: Supporting Hive features in Spark SQL Thrift JDBC server Hi, According to Spark SQL documentation, Spark SQL supports

RE: Supporting Hive features in Spark SQL Thrift JDBC server

2015-03-03 Thread Cheng, Hao
Supporting Hive features in Spark SQL Thrift JDBC server val sc: SparkContext = new SparkContext(conf) val sqlCassContext = new CassandraAwareSQLContext(sc) // I used some Calliope Cassandra Spark connector val rdd: SchemaRDD = sqlCassContext.sql("select * from db.profile") rdd.cache
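
For reference, a minimal runnable sketch of the setup quoted in this message, assuming the Spark 1.2-era SchemaRDD API and a Calliope-provided CassandraAwareSQLContext (the import path, keyspace, and table names below are illustrative, not confirmed by the thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Assumption: CassandraAwareSQLContext comes from the Calliope Cassandra
// connector mentioned in the thread; the package name below is a placeholder.
import com.tuplejump.calliope.sql.CassandraAwareSQLContext

val conf = new SparkConf().setAppName("cassandra-sql-example")
val sc: SparkContext = new SparkContext(conf)

// Calliope's context exposes the familiar sql(...) entry point of SQLContext.
val sqlCassContext = new CassandraAwareSQLContext(sc)

// Query the Cassandra keyspace/table from the original message and cache
// the resulting SchemaRDD for reuse.
val rdd = sqlCassContext.sql("select * from db.profile")
rdd.cache()
```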

Re: Supporting Hive features in Spark SQL Thrift JDBC server

2015-03-03 Thread Yin Huai
...@gmail.com] *Sent:* Tuesday, March 3, 2015 9:46 PM *To:* Cheng, Hao *Cc:* user@spark.apache.org *Subject:* Re: Supporting Hive features in Spark SQL Thrift JDBC server You are right, CassandraAwareSQLContext is a subclass of SQLContext. But I did another experiment: I queried Cassandra

Re: Supporting Hive features in Spark SQL Thrift JDBC server

2015-03-03 Thread Rohit Rai
:* Tuesday, March 3, 2015 9:46 PM *To:* Cheng, Hao *Cc:* user@spark.apache.org *Subject:* Re: Supporting Hive features in Spark SQL Thrift JDBC server You are right, CassandraAwareSQLContext is a subclass of SQLContext. But I did another experiment: I queried Cassandra using

Re: Supporting Hive features in Spark SQL Thrift JDBC server

2015-03-03 Thread shahab
*Cc:* user@spark.apache.org *Subject:* Re: Supporting Hive features in Spark SQL Thrift JDBC server You are right, CassandraAwareSQLContext is a subclass of SQLContext. But I did another experiment: I queried Cassandra using CassandraAwareSQLContext, then I registered the rdd as a temp

Re: Supporting Hive features in Spark SQL Thrift JDBC server

2015-03-03 Thread shahab
a direct subclass of HiveContext or SQLContext? *From:* shahab [mailto:shahab.mok...@gmail.com] *Sent:* Tuesday, March 3, 2015 5:10 PM *To:* Cheng, Hao *Cc:* user@spark.apache.org *Subject:* Re: Supporting Hive features in Spark SQL Thrift JDBC server val sc: SparkContext = new
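
Where it helps to answer the subclassing question above, a small self-contained check (plain Scala on the context instance; the helper name is hypothetical):

```scala
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.hive.HiveContext

// Reports whether a given context instance extends HiveContext or only the
// plain SQLContext. HiveContext itself is a subclass of SQLContext.
def describeContext(ctx: SQLContext): String =
  if (ctx.isInstanceOf[HiveContext]) s"${ctx.getClass.getSimpleName} extends HiveContext"
  else s"${ctx.getClass.getSimpleName} extends only SQLContext"
```

Calling, say, describeContext(sqlCassContext) against the context from the earlier sketch would show whether Hive-only features can be expected to resolve through it.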

Re: Supporting Hive features in Spark SQL Thrift JDBC server

2015-03-03 Thread shahab
? *From:* shahab [mailto:shahab.mok...@gmail.com] *Sent:* Tuesday, March 3, 2015 5:10 PM *To:* Cheng, Hao *Cc:* user@spark.apache.org *Subject:* Re: Supporting Hive features in Spark SQL Thrift JDBC server val sc: SparkContext = new SparkContext(conf) val sqlCassContext = new

Re: Supporting Hive features in Spark SQL Thrift JDBC server

2015-03-03 Thread Rohit Rai
/HiveContext? *From:* shahab [mailto:shahab.mok...@gmail.com] *Sent:* Tuesday, March 3, 2015 9:46 PM *To:* Cheng, Hao *Cc:* user@spark.apache.org *Subject:* Re: Supporting Hive features in Spark SQL Thrift JDBC server You are right, CassandraAwareSQLContext is a subclass of SQLContext

Re: Supporting Hive features in Spark SQL Thrift JDBC server

2015-03-03 Thread shahab
:* user@spark.apache.org *Subject:* Re: Supporting Hive features in Spark SQL Thrift JDBC server You are right, CassandraAwareSQLContext is a subclass of SQLContext. But I did another experiment: I queried Cassandra using CassandraAwareSQLContext, then I registered the rdd as a temp table
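
The experiment described here, registering the query result as a temporary table and querying it again, follows the standard Spark 1.2 SchemaRDD pattern; a self-contained sketch (the Profile case class and table names are illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Illustrative row type standing in for the Cassandra profile rows.
case class Profile(id: Int, name: String)

val sc = new SparkContext(new SparkConf().setAppName("temp-table-example"))
val sqlContext = new SQLContext(sc)
import sqlContext.createSchemaRDD  // implicit RDD -> SchemaRDD conversion (Spark 1.2)

val profiles = sc.parallelize(Seq(Profile(1, "a"), Profile(2, "b")))

// Register the data as a temp table, then query it through the same context.
profiles.registerTempTable("profile")
sqlContext.sql("select count(*) from profile").collect().foreach(println)
```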

Re: Supporting Hive features in Spark SQL Thrift JDBC server

2015-03-03 Thread Yin Huai
case that you need multiple SQLContext/HiveContext? *From:* shahab [mailto:shahab.mok...@gmail.com] *Sent:* Tuesday, March 3, 2015 9:46 PM *To:* Cheng, Hao *Cc:* user@spark.apache.org *Subject:* Re: Supporting Hive features in Spark SQL Thrift JDBC server You are right

Re: Supporting Hive features in Spark SQL Thrift JDBC server

2015-03-03 Thread shahab
SQLContext/HiveContext? *From:* shahab [mailto:shahab.mok...@gmail.com] *Sent:* Tuesday, March 3, 2015 9:46 PM *To:* Cheng, Hao *Cc:* user@spark.apache.org *Subject:* Re: Supporting Hive features in Spark SQL Thrift JDBC server You are right, CassandraAwareSQLContext is a subclass of SQLContext

Re: Supporting Hive features in Spark SQL Thrift JDBC server

2015-03-03 Thread shahab
...@intel.com wrote: Can you provide the detailed failure call stack? *From:* shahab [mailto:shahab.mok...@gmail.com] *Sent:* Tuesday, March 3, 2015 3:52 PM *To:* user@spark.apache.org *Subject:* Supporting Hive features in Spark SQL Thrift JDBC server Hi, According to Spark SQL

Supporting Hive features in Spark SQL Thrift JDBC server

2015-03-02 Thread shahab
Hi, According to Spark SQL documentation, Spark SQL supports the vast majority of Hive features, such as User Defined Functions (UDFs), and one of these UDFs is the current_date() function, which should be supported. However, I get an error when I use this UDF in my SQL query. There are
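
current_date() is a Hive function, so one hedged direction, not confirmed as the resolution of this thread, is to issue the query through a HiveContext rather than a plain SQLContext, keeping in mind that whether current_date() resolves at all depends on the Hive version the Spark build ships with (the people table below is hypothetical):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setAppName("hive-udf-example"))

// Hive UDFs and HiveQL are resolved through HiveContext, which is a
// superset of SQLContext; a plain SQLContext will not recognize them.
val hiveContext = new HiveContext(sc)

// "people" is a hypothetical Hive table; current_date() only resolves if
// the bundled Hive version provides it.
hiveContext.sql("select current_date() from people limit 1")
  .collect()
  .foreach(println)
```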