Yeah, we should probably remove that.
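
For reference, a minimal sketch of what the removal could look like (the
member below is hypothetical; the actual code in QueryExecution.scala may
differ). The annotation is documentation-only, so dropping it shouldn't
change runtime behavior:

class QueryExecution {
  // Before: the Guava annotation pulled com.google.common.annotations into
  // the class's compiled signature, which the shell then fails to resolve
  // against the shaded assembly.
  //
  //   @com.google.common.annotations.VisibleForTesting
  //   def assertAnalyzed(): Unit = ()

  // After: annotation dropped, intent preserved as a comment instead.
  // Visible for testing.
  def assertAnalyzed(): Unit = ()
}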

On Mon, Nov 9, 2015 at 5:54 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> If there is no option to let the shell skip processing @VisibleForTesting,
> should the annotation be dropped?
>
> Cheers
>
> On Mon, Nov 9, 2015 at 5:50 PM, Marcelo Vanzin <van...@cloudera.com>
> wrote:
>
>> We've hit this in the past when using "@VisibleForTesting" in classes
>> that the shell, for some reason, tries to process. QueryExecution.scala
>> seems to use that annotation, and it was added recently, so that's
>> probably the issue.
>>
>> BTW, if anyone knows how Scala can find a reference to the original
>> Guava class even after shading, I'd really like to know. I've looked
>> several times and never found where the original class name is stored.
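>>
>> One heuristic check, assuming the original name survives as a contiguous
>> string in the class file (not guaranteed: the pickled Scala signature may
>> store names in pieces, so a miss isn't conclusive):
>>
>> import java.nio.file.{Files, Paths}
>>
>> // Scan a .class file (e.g. QueryExecution.class extracted from the
>> // assembly jar) for the unshaded Guava package name.
>> object FindUnshadedRef {
>>   def main(args: Array[String]): Unit = {
>>     val bytes = Files.readAllBytes(Paths.get(args(0)))
>>     // ISO-8859-1 maps each byte to one char, so nothing is lost.
>>     val text = new String(bytes, "ISO-8859-1")
>>     val found = Seq("com/google/common", "com.google.common")
>>       .exists(name => text.contains(name))
>>     println(if (found) "unshaded Guava reference present"
>>             else "no contiguous unshaded reference found")
>>   }
>> }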
>>
>> On Mon, Nov 9, 2015 at 10:37 AM, Zhan Zhang <zzh...@hortonworks.com>
>> wrote:
>> > Hi Folks,
>> >
>> > Has anybody run into the following issue? I use "mvn package -Phive
>> > -DskipTests" to build the package.
>> >
>> > Thanks.
>> >
>> > Zhan Zhang
>> >
>> >
>> >
>> > bin/spark-shell
>> > ...
>> > Spark context available as sc.
>> > error: error while loading QueryExecution, Missing dependency 'bad
>> > symbolic reference. A signature in QueryExecution.class refers to term
>> > annotations in package com.google.common which is not available. It may
>> > be completely missing from the current classpath, or the version on the
>> > classpath might be incompatible with the version used when compiling
>> > QueryExecution.class.', required by
>> > /Users/zzhang/repo/spark/assembly/target/scala-2.10/spark-assembly-1.6.0-SNAPSHOT-hadoop2.2.0.jar(org/apache/spark/sql/execution/QueryExecution.class)
>> > <console>:10: error: not found: value sqlContext
>> >        import sqlContext.implicits._
>> >               ^
>> > <console>:10: error: not found: value sqlContext
>> >        import sqlContext.sql
>> >               ^
>>
>>
>>
>> --
>> Marcelo
>>
>
