Hi Zhan:

I hit the exact problem you hit. I rolled back to commit:

de289bf279e14e47859b5fbcd70e97b9d0759f14

which does not have this problem. I suspect a change merged in the past
4 days introduced it.

On Mon, Nov 9, 2015 at 12:20 PM Ted Yu <yuzhih...@gmail.com> wrote:

> I backtracked to:
> ef362846eb448769bcf774fc9090a5013d459464
>
> The issue was still there.
>
> FYI
>
> On Mon, Nov 9, 2015 at 10:46 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> Which branch did you perform the build with?
>>
>> I used the following command yesterday:
>> mvn -Phive -Phive-thriftserver -Pyarn -Phadoop-2.4 -Dhadoop.version=2.7.0
>> package -DskipTests
>>
>> Spark shell was working.
>>
>> Building with latest master branch.
>>
>> On Mon, Nov 9, 2015 at 10:37 AM, Zhan Zhang <zzh...@hortonworks.com>
>> wrote:
>>
>>> Hi Folks,
>>>
>>> Does anybody meet the following issue? I use "mvn package -Phive
>>> -DskipTests" to build the package.
>>>
>>> Thanks.
>>>
>>> Zhan Zhang
>>>
>>>
>>>
>>> bin/spark-shell
>>> ...
>>> Spark context available as sc.
>>> error: error while loading QueryExecution, Missing dependency 'bad
>>> symbolic reference. A signature in QueryExecution.class refers to term
>>> annotations
>>> in package com.google.common which is not available.
>>> It may be completely missing from the current classpath, or the version
>>> on
>>> the classpath might be incompatible with the version used when compiling
>>> QueryExecution.class.', required by
>>> /Users/zzhang/repo/spark/assembly/target/scala-2.10/spark-assembly-1.6.0-SNAPSHOT-hadoop2.2.0.jar(org/apache/spark/sql/execution/QueryExecution.class)
>>> <console>:10: error: not found: value sqlContext
>>>        import sqlContext.implicits._
>>>               ^
>>> <console>:10: error: not found: value sqlContext
>>>        import sqlContext.sql
>>>               ^
>>>
>>
>>
>
