Can you provide the detailed failure call stack?
From: shahab [mailto:shahab.mok...@gmail.com]
Sent: Tuesday, March 3, 2015 3:52 PM
To: user@spark.apache.org
Subject: Supporting Hive features in Spark SQL Thrift JDBC server
Hi,
According to Spark SQL documentation, Spark SQL supports Hive features in the Spark SQL Thrift JDBC server.
val sc: SparkContext = new SparkContext(conf)
val sqlCassContext = new CassandraAwareSQLContext(sc) // I used the Calliope Cassandra Spark connector
val rdd: SchemaRDD = sqlCassContext.sql("select * from db.profile")
rdd.cache
Is CassandraAwareSQLContext a direct subclass of HiveContext or
SQLContext?

*From:* shahab [mailto:shahab.mok...@gmail.com]
*Sent:* Tuesday, March 3, 2015 5:10 PM
*To:* Cheng, Hao
*Cc:* user@spark.apache.org
*Subject:* Re: Supporting Hive features in Spark SQL Thrift JDBC server

val sc: SparkContext = new SparkContext(conf)
val sqlCassContext = new CassandraAwareSQLContext(sc)
*From:* shahab [mailto:shahab.mok...@gmail.com]
*Sent:* Tuesday, March 3, 2015 9:46 PM
*To:* Cheng, Hao
*Cc:* user@spark.apache.org
*Subject:* Re: Supporting Hive features in Spark SQL Thrift JDBC server

You are right, CassandraAwareSQLContext is a subclass of SQLContext. But I did another experiment: I queried Cassandra using CassandraAwareSQLContext, then I registered the rdd as a temp table.
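That subclass relationship is the likely culprit: in Spark 1.x, Hive-specific features such as Hive UDFs are resolved only by HiveContext (which itself extends SQLContext), not by a plain SQLContext or its subclasses. A minimal sketch of the HiveContext route, under that assumption — the table and source names are illustrative, and whether a given UDF resolves still depends on the Hive version the Spark build bundles:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf().setAppName("hive-udf-example").setMaster("local[*]")
val sc = new SparkContext(conf)

// HiveContext extends SQLContext, so it runs everything SQLContext can,
// plus HiveQL features such as Hive UDFs.
val hiveContext = new HiveContext(sc)

// Register data as a temp table on the HiveContext itself; a temp table
// registered on a different context is not visible here.
val rdd = hiveContext.sql("SELECT * FROM some_source") // "some_source" is illustrative
rdd.registerTempTable("profile")

// Hive UDFs should now resolve through this context,
// provided the bundled Hive version ships the function.
hiveContext.sql("SELECT current_date() FROM profile")
```

The design point is that both contexts share the same SQLContext API surface, so switching a SQLContext-based pipeline to HiveContext usually only changes the construction line.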
Is it the case that you need multiple SQLContext/HiveContext?
...@intel.com wrote:
Can you provide the detailed failure call stack?
Hi,
According to Spark SQL documentation, Spark SQL supports the vast
majority of Hive features, such as User Defined Functions (UDF), and one
of these UDFs is the current_date() function, which should be supported.
However, I get an error when I am using this UDF in my SQL query.
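For what it's worth, the Thrift JDBC server shipped with Spark is backed by a HiveContext, so a Hive UDF can be checked against it directly with beeline. A sketch assuming a Hive-enabled Spark 1.x distribution, run from the Spark home directory (port 10000 is the default):

```shell
# Start the Thrift JDBC server
./sbin/start-thriftserver.sh

# Connect with beeline, then try the UDF at the prompt:
#   SELECT current_date();
./bin/beeline -u jdbc:hive2://localhost:10000
```

If the UDF fails here too, the problem is the Hive version Spark was built against rather than the context class used in application code.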