Re: HQL function Rollup and Cube
Yes, it works for me. Make sure the Spark machine can access the Hive machine.

On Thu, Mar 26, 2015 at 6:55 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote:
> Did you manage to connect to the Hive metastore from Spark SQL? I copied
> the Hive conf file into the Spark conf folder, but when I run show tables
> or select * from dw_bid (dw_bid is stored in Hive), it says table not
> found.
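The setup being discussed is copying Hive's configuration into Spark's conf directory so Spark SQL can find the Hive metastore. A minimal shell sketch of that step (the /etc/hive/conf and /opt/spark paths are assumptions; adjust for your install; it only prints the copy command rather than running it):

```shell
# Assumed locations -- adjust for your environment.
SPARK_HOME=${SPARK_HOME:-/opt/spark}
HIVE_CONF=${HIVE_CONF:-/etc/hive/conf}

# Spark SQL's HiveContext reads hive-site.xml from $SPARK_HOME/conf
# to locate the Hive metastore.
echo cp "$HIVE_CONF/hive-site.xml" "$SPARK_HOME/conf/"
```

After copying, the Spark machine still needs network access to the metastore host named in hive-site.xml, which is the point made above.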
Re: HQL function Rollup and Cube
Clarification on how the HQL was invoked:

hiveContext.sql("select a, b, count(*) from t group by a, b with rollup")

Thanks,
Chang

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/HQL-function-Rollup-and-Cube-tp22241p22244.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
HQL function Rollup and Cube
Has anyone been able to use the Hive 0.13 ROLLUP and CUBE functions in Spark 1.3's HiveContext? According to https://issues.apache.org/jira/browse/SPARK-2663, this was resolved in Spark 1.3. I created an in-memory temp table (t) and tried to execute a ROLLUP (and CUBE) query:

select a, b, count(*) from t group by a, b with rollup

I got an error saying that "with rollup" is an invalid function. Am I missing something?

Thanks,
Chang

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/HQL-function-Rollup-and-Cube-tp22241.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
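For readers unfamiliar with the semantics: GROUP BY a, b WITH ROLLUP emits the (a, b) counts plus a subtotal row per value of a and one grand-total row, with NULL filling the rolled-up columns. A minimal pure-Python sketch of that result set (illustrative only, not Spark code; None stands in for SQL NULL):

```python
from collections import Counter

def rollup_counts(rows):
    """count(*) for GROUP BY a, b WITH ROLLUP over (a, b) tuples."""
    result = Counter()
    for a, b in rows:
        result[(a, b)] += 1        # finest level: GROUP BY a, b
        result[(a, None)] += 1     # subtotal: GROUP BY a
        result[(None, None)] += 1  # grand total
    return dict(result)

rows = [("x", 1), ("x", 2), ("x", 1), ("y", 1)]
print(rollup_counts(rows))
# ('x', None) -> 3 is the subtotal for a = 'x';
# (None, None) -> 4 is the grand total.
```

CUBE differs only in also emitting (None, b) subtotals, i.e. every combination of grouped and rolled-up columns.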
Re: HQL function Rollup and Cube
Solved. In the IDE, the project settings were missing the dependent lib jars (the jar files under spark-xx/lib). When these jars are not set, I got a class-not-found error about the datanucleus classes (compared to an out-of-memory error in the Spark shell). In the Spark shell, these dependent jars need to be passed on the spark-shell command line.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/HQL-function-Rollup-and-Cube-tp22241p22246.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
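The fix above amounts to handing the datanucleus jars from Spark's lib directory to spark-shell via --jars. A shell sketch of building that command (the /opt/spark path is an assumption; it only prints the command it would run, so substitute your own SPARK_HOME and execute it yourself):

```shell
# Assumed install location -- adjust for your environment.
SPARK_HOME=${SPARK_HOME:-/opt/spark}

# Collect the datanucleus-*.jar files shipped under lib/ into a
# comma-separated list, as --jars expects.
JARS=$(ls "$SPARK_HOME"/lib/datanucleus-*.jar 2>/dev/null | paste -sd, -)

echo "$SPARK_HOME/bin/spark-shell --jars $JARS"
```

In an IDE, the equivalent fix is adding the same jars to the project's dependency/classpath settings, as described above.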
Re: HQL function Rollup and Cube
Did you manage to connect to the Hive metastore from Spark SQL? I copied the Hive conf file into the Spark conf folder, but when I run show tables or select * from dw_bid (dw_bid is stored in Hive), it says table not found.

On Thu, Mar 26, 2015 at 11:43 PM, Chang Lim chang...@gmail.com wrote:
> Solved. In the IDE, the project settings were missing the dependent lib
> jars (the jar files under spark-xx/lib). When these jars are not set, I
> got a class-not-found error about the datanucleus classes (compared to an
> out-of-memory error in the Spark shell). In the Spark shell, these
> dependent jars need to be passed on the spark-shell command line.

--
Deepak