Try this in the Spark shell:

    import org.apache.spark.api.java.JavaSparkContext
    import org.apache.spark.sql.hive.HiveContext

    val jsc = new JavaSparkContext(sc)
    val hc = new HiveContext(jsc.sc)
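For the Java side, the equivalent construction would look roughly like the following standalone sketch (assuming Spark 1.3, where HiveContext takes a SparkContext and sql() returns a DataFrame; the class name, app name, and master setting are placeholders):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.hive.HiveContext;

    public class HiveContextFromJava {
        public static void main(String[] args) {
            // Placeholder app name and master; adjust for your deployment.
            SparkConf conf = new SparkConf()
                .setAppName("HiveContextFromJava")
                .setMaster("local[*]");
            JavaSparkContext jsc = new JavaSparkContext(conf);

            // HiveContext expects a SparkContext; jsc.sc() returns the
            // underlying SparkContext that the JavaSparkContext wraps.
            HiveContext hc = new HiveContext(jsc.sc());

            hc.sql("SHOW TABLES").show();
            jsc.stop();
        }
    }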
(I never mentioned that JavaSparkContext extends SparkContext…)
Cheng
On 3/30/15 8:28 PM, Vincent He wrote:
Thanks. That is what I have tried. JavaSparkContext does not extend
SparkContext, so it cannot be used here.
Does anyone else know whether we can use HiveContext with a JavaSparkContext?
From the API documents, it seems this is not supported. Thanks.
On Sun, Mar 29, 2015 at 9:24 AM, Cheng Lian wrote:
You may simply pass in JavaSparkContext.sc
On 3/29/15 9:25 PM, Vincent He wrote:
All,
I am trying Spark SQL with Java, and I find that HiveContext does not accept
a JavaSparkContext. Is this true? Or is there any special build of Spark I
need to do (I built with Hive and the Thrift server)? Can we use HiveContext
in Java? Thanks.
It does not work and does not compile, as the HiveContext constructor does
not accept a JavaSparkContext, and JavaSparkContext is not a subclass of
SparkContext.
Does anyone else have any idea? I suspect this is not supported now.
On Sun, Mar 29, 2015 at 8:54 AM, Cheng Lian lian.cs@gmail.com wrote:
All,
I am trying Spark SQL with Java, and I find that HiveContext does not accept
a JavaSparkContext. Is this true? Or is there any special build of Spark I
need to do (I built with Hive and the Thrift server)? Can we use HiveContext
in Java? Thanks in advance.
I mean that JavaSparkContext has a field named sc, whose type is
SparkContext. You may pass this sc to HiveContext.
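In Java terms, the difference is roughly this (a sketch; jsc here stands for an existing JavaSparkContext, and Spark 1.x is assumed):

    // Does not compile: the HiveContext constructor takes a
    // SparkContext, and a JavaSparkContext is not one.
    // HiveContext hc = new HiveContext(jsc);

    // Compiles: jsc.sc() unwraps the underlying SparkContext.
    HiveContext hc = new HiveContext(jsc.sc());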
On 3/29/15 9:59 PM, Vincent He wrote:
Thanks. It does not work and does not compile, as the HiveContext
constructor does not accept a JavaSparkContext, and JavaSparkContext is not
a subclass of SparkContext.