Hmm, I just tested my own Spark 1.3.0 build. I have the same problem, but I 
cannot reproduce it on Spark 1.2.1.
If we check the code change below:
Spark 1.3 branch: https://github.com/apache/spark/blob/branch-1.3/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala
vs
Spark 1.2 branch: https://github.com/apache/spark/blob/branch-1.2/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala
You can see that on line 24:
import com.google.common.cache.{CacheBuilder, CacheLoader, LoadingCache}
is introduced on the 1.3 branch.
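For context, that import pulls in Guava's LoadingCache, which the 1.3 HiveMetastoreCatalog uses for caching table lookups. A minimal sketch of the Guava API the class now needs at runtime (illustrative names only, not the actual Spark code):

import com.google.common.cache.{CacheBuilder, CacheLoader, LoadingCache}

// Illustrative only: a small Guava LoadingCache, the same API the 1.3 class depends on.
val loader = new CacheLoader[String, String]() {
  override def load(key: String): String = key.toUpperCase  // placeholder loading logic
}
val cache: LoadingCache[String, String] =
  CacheBuilder.newBuilder().maximumSize(1000).build(loader)
cache.get("spark")  // first access populates the entry through the loader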
The error basically means that the com.google.common.cache package cannot be 
found on the runtime classpath.
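A quick sanity check you can run from the spark-shell (plain JVM reflection, nothing Spark-specific) to confirm whether the Guava cache classes are visible and, if so, which jar they were loaded from:

// Throws ClassNotFoundException if com.google.common.cache is missing at runtime.
val cb = Class.forName("com.google.common.cache.CacheBuilder")
// If it loads, this shows which jar it came from (None for bootstrap-loaded classes).
println(Option(cb.getProtectionDomain.getCodeSource).map(_.getLocation))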
Either you and I made the same mistake when we built Spark 1.3.0, or there is 
something wrong with the Spark 1.3 pom.xml file.
Here is how I built the 1.3.0:
1) Download the Spark 1.3.0 source
2) make-distribution --targz -Dhadoop.version=1.1.1 -Phive -Phive-0.12.0 -Phive-thriftserver -DskipTests
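One way to check whether the assembly produced by make-distribution actually bundles the Guava cache classes; the jar path below is just an example, adjust it to wherever your build put the assembly:

import java.util.jar.JarFile
import scala.collection.JavaConverters._

// Hypothetical path to the built assembly jar; change it to match your distribution.
val assembly = new JarFile("dist/lib/spark-assembly-1.3.0-hadoop1.1.1.jar")
assembly.entries().asScala
  .map(_.getName)
  .filter(_.startsWith("com/google/common/cache/"))
  .take(5)
  .foreach(println)   // prints nothing if the cache classes were not bundled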
Is this only because I built against Hadoop 1.x?
Yong

Date: Thu, 2 Apr 2015 13:56:33 -0400
Subject: Spark SQL 1.3.0 - spark-shell error : HiveMetastoreCatalog.class 
refers to term cache in package com.google.common which is not available
From: tsind...@gmail.com
To: user@spark.apache.org

I was trying a simple test from the spark-shell to see if 1.3.0 would address a 
problem I was having with locating the json_tuple class and got the following 
error:
scala> import org.apache.spark.sql.hive._
import org.apache.spark.sql.hive._

scala> val sqlContext = new HiveContext(sc)
sqlContext: org.apache.spark.sql.hive.HiveContext = 
org.apache.spark.sql.hive.HiveContext@79c849c7

scala> import sqlContext._
import sqlContext._

scala> case class MetricTable(path: String, pathElements: String, name: String, 
value: String)
scala.reflect.internal.Types$TypeError: bad symbolic reference. A signature in 
HiveMetastoreCatalog.class refers to term cache
in package com.google.common which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling 
HiveMetastoreCatalog.class.
That entry seems to have slain the compiler.  Shall I replay
your session? I can re-run each line except the last one.
[y/n]
Abandoning crashed session.

I entered the shell as follows:

./bin/spark-shell --master spark://radtech.io:7077 --total-executor-cores 2 --driver-class-path /usr/local/spark/lib/mysql-connector-java-5.1.34-bin.jar

hive-site.xml looks like this:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>
  <property>
    <name>hive.semantic.analyzer.factory.impl</name>
    <value>org.apache.hcatalog.cli.HCatSemanticAnalyzerFactory</value>
  </property>

  <property>
    <name>hive.metastore.sasl.enabled</name>
    <value>false</value>
  </property>

  <property>
    <name>hive.server2.authentication</name>
    <value>NONE</value>
  </property>

  <property>
    <name>hive.server2.enable.doAs</name>
    <value>true</value>
  </property>

  <property>
    <name>hive.warehouse.subdir.inherit.perms</name>
    <value>true</value>
  </property>

  <property>
    <name>hive.metastore.schema.verification</name>
    <value>false</value>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    
<value>jdbc:mysql://localhost:3306/metastore_db?createDatabaseIfNotExist=true</value>
    <description>metadata is stored in a MySQL server</description>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>MySQL JDBC driver class</description>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>*******</value>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>********</value>
  </property>

</configuration>

I have downloaded a clean version of 1.3.0 and tried it again, but I get the same error. Is this a known issue? Or a configuration issue on my part?

TIA for the assistance.

-Todd
