Those are the same options I used, except that I added --tgz to package it and I built 
off of the master branch. Unfortunately, my only guess is that these errors 
stem from your build environment. In your Spark assembly, do you have any 
classes that belong to the org.apache.hadoop.hive package?
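One quick way to check from within spark-shell (just a rough sketch; HiveConf is one class from 
that package that a -Phive build should contain):

    // Probe the classpath for a class from the org.apache.hadoop.hive package.
    // A ClassNotFoundException here means the Hive classes never made it into the assembly.
    try {
      Class.forName("org.apache.hadoop.hive.conf.HiveConf")
      println("org.apache.hadoop.hive.conf.HiveConf found on the classpath")
    } catch {
      case _: ClassNotFoundException =>
        println("org.apache.hadoop.hive classes are missing from the classpath")
    }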


From: Tridib Samanta <tridib.sama...@live.com>
Date: Thursday, November 6, 2014 at 9:49 AM
To: Terry Siu <terry....@smartfocus.com>, "u...@spark.incubator.apache.org" <u...@spark.incubator.apache.org>
Subject: RE: Unable to use HiveContext in spark-shell

I am using spark 1.1.0.
I built it using:
./make-distribution.sh -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive 
-DskipTests

My ultimate goal is to execute a query on a Parquet file with a nested structure 
and to cast a date string to a Date, which is needed to calculate the age of a Person 
entity. But I am unable to even get past this line:
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
I made sure that the org.apache.hadoop package is in the Spark assembly jar.
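For context, the kind of query I am aiming for looks roughly like the sketch below (just an 
illustration; the path /data/people.parquet, the table name people, the nested field 
person.birthDate, and the literal date are placeholders, not my real schema):

    val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
    // Register the nested Parquet data as a temporary table (placeholder path).
    sqlContext.parquetFile("/data/people.parquet").registerTempTable("people")
    // datediff accepts 'yyyy-MM-dd' date strings, so the nested string field can be used directly;
    // dividing the day count by 365.25 gives an approximate age in years.
    val ages = sqlContext.sql(
      "SELECT datediff('2014-11-06', person.birthDate) / 365.25 AS age FROM people")
    ages.collect().foreach(println)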

Re-attaching the stack trace for quick reference.

scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

error: bad symbolic reference. A signature in HiveContext.class refers to term 
hive
in package org.apache.hadoop which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling 
HiveContext.class.
error:
     while compiling: <console>
        during phase: erasure
     library version: version 2.10.4
    compiler version: version 2.10.4
  reconstructed args:

  last tree to typer: Apply(value $outer)
              symbol: value $outer (flags: <method> <synthetic> <stable> 
<expandedname> <triedcooking>)
   symbol definition: val $outer(): $iwC.$iwC.type
                 tpe: $iwC.$iwC.type
       symbol owners: value $outer -> class $iwC -> class $iwC -> class $iwC -> 
class $read -> package $line5
      context owners: class $iwC -> class $iwC -> class $iwC -> class $iwC -> 
class $read -> package $line5

== Enclosing template or block ==

ClassDef( // class $iwC extends Serializable
  0
  "$iwC"
  []
  Template( // val <local $iwC>: <notype>, tree.tpe=$iwC
    "java.lang.Object", "scala.Serializable" // parents
    ValDef(
      private
      "_"
      <tpt>
      <empty>
    )
    // 5 statements
    DefDef( // def <init>(arg$outer: $iwC.$iwC.$iwC.type): $iwC
      <method> <triedcooking>
      "<init>"
      []
      // 1 parameter list
      ValDef( // $outer: $iwC.$iwC.$iwC.type

        "$outer"
        <tpt> // tree.tpe=$iwC.$iwC.$iwC.type
        <empty>
      )
      <tpt> // tree.tpe=$iwC
      Block( // tree.tpe=Unit
        Apply( // def <init>(): Object in class Object, tree.tpe=Object
          $iwC.super."<init>" // def <init>(): Object in class Object, 
tree.tpe=()Object
          Nil
        )
        ()
      )
    )
    ValDef( // private[this] val sqlContext: 
org.apache.spark.sql.hive.HiveContext
      private <local> <triedcooking>
      "sqlContext "
      <tpt> // tree.tpe=org.apache.spark.sql.hive.HiveContext
      Apply( // def <init>(sc: org.apache.spark.SparkContext): 
org.apache.spark.sql.hive.HiveContext in class HiveContext, 
tree.tpe=org.apache.spark.sql.hive.HiveContext
        new org.apache.spark.sql.hive.HiveContext."<init>" // def <init>(sc: 
org.apache.spark.SparkContext): org.apache.spark.sql.hive.HiveContext in class 
HiveContext, tree.tpe=(sc: 
org.apache.spark.SparkContext)org.apache.spark.sql.hive.HiveContext
        Apply( // val sc(): org.apache.spark.SparkContext, 
tree.tpe=org.apache.spark.SparkContext
          
$iwC.this.$line5$$read$$iwC$$iwC$$iwC$$iwC$$$outer().$line5$$read$$iwC$$iwC$$iwC$$$outer().$line5$$read$$iwC$$iwC$$$outer().$VAL1().$iw().$iw()."sc"
 // val sc(): org.apache.spark.SparkContext, 
tree.tpe=()org.apache.spark.SparkContext
          Nil
        )
      )
    )
    DefDef( // val sqlContext(): org.apache.spark.sql.hive.HiveContext
      <method> <stable> <accessor>
      "sqlContext"
      []
      List(Nil)
      <tpt> // tree.tpe=org.apache.spark.sql.hive.HiveContext
      $iwC.this."sqlContext " // private[this] val sqlContext: 
org.apache.spark.sql.hive.HiveContext, 
tree.tpe=org.apache.spark.sql.hive.HiveContext
    )
    ValDef( // protected val $outer: $iwC.$iwC.$iwC.type
      protected <synthetic> <paramaccessor> <triedcooking>
      "$outer "
      <tpt> // tree.tpe=$iwC.$iwC.$iwC.type
      <empty>
    )
    DefDef( // val $outer(): $iwC.$iwC.$iwC.type
      <method> <synthetic> <stable> <expandedname> <triedcooking>
      "$line5$$read$$iwC$$iwC$$iwC$$iwC$$$outer"
      []
      List(Nil)
      <tpt> // tree.tpe=Any
      $iwC.this."$outer " // protected val $outer: $iwC.$iwC.$iwC.type, 
tree.tpe=$iwC.$iwC.$iwC.type
    )
  )
)

== Expanded type of tree ==

ThisType(class $iwC)

uncaught exception during compilation: scala.reflect.internal.Types$TypeError
scala.reflect.internal.Types$TypeError: bad symbolic reference. A signature in 
HiveContext.class refers to term conf
in value org.apache.hadoop.hive which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling 
HiveContext.class.
That entry seems to have slain the compiler.  Shall I replay
your session? I can re-run each line except the last one.
[y/n]


Thanks
Tridib

> From: terry....@smartfocus.com
> To: tridib.sama...@live.com; u...@spark.incubator.apache.org
> Subject: Re: Unable to use HiveContext in spark-shell
> Date: Thu, 6 Nov 2014 17:38:51 +0000
>
> What version of Spark are you using? Did you compile Spark yourself, and
> if so, what compile options did you use?
>
> On 11/6/14, 9:22 AM, "tridib" <tridib.sama...@live.com> wrote:
>
> >Help please!
> >
> >
> >
> >--
> >View this message in context:
> >http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-use-HiveContext-in-spark-shell-tp18261p18280.html
> >Sent from the Apache Spark User List mailing list archive at Nabble.com.
> >
>

