The Scala compiler stores some of its metadata in the ScalaSig attribute. See
the following link for an example:

http://stackoverflow.com/questions/10130106/how-does-scala-know-the-difference-between-def-foo-and-def-foo/10130403#10130403

Since maven-shade-plugin doesn't recognize ScalaSig, it cannot rewrite the
references stored there. I'm not sure whether there is a Scala-aware version
of `maven-shade-plugin` that deals with this.
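
For illustration, here is a hypothetical minimal example (the package, class,
and method names are made up) of the pattern that triggers this. The
relocation step rewrites the bytecode-level reference to the annotation, but
the ScalaSig pickle keeps the original fully qualified name:

```scala
package example

import com.google.common.annotations.VisibleForTesting

// Hypothetical example: the class file's constant-pool reference to the
// annotation gets relocated by maven-shade-plugin, but the ScalaSig pickle
// still records com.google.common.annotations.VisibleForTesting, which is
// what the REPL later fails to resolve.
class Example {
  @VisibleForTesting
  def helperForTests(): Int = 42
}
```

Running spark-shell against a shaded assembly that contains such a class is
what produces the "bad symbolic reference ... term annotations in package
com.google.common" error quoted below.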

Generally, annotations that will be shaded should not be used in Scala code.
I'm wondering whether we can surface this issue in the PR build; because the
SBT build doesn't do the shading, it's currently hard for us to catch problems
like this there.
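
As a rough sketch of what such a check could look like (this is not an actual
Scalastyle rule; the object name and the banned-import string below are just
placeholders), a small standalone scan over the Scala sources could fail the
build whenever the Guava annotation import sneaks back in:

```scala
import java.nio.file.{Files, Paths}
import scala.collection.JavaConverters._

// Sketch only: walk the source tree and fail if any Scala file imports a
// Guava annotation whose original name will not survive shading.
object BannedGuavaImportCheck {
  private val banned = "import com.google.common.annotations"

  def main(args: Array[String]): Unit = {
    val root = Paths.get(args.headOption.getOrElse("."))
    val offenders = Files.walk(root).iterator().asScala
      .filter(_.toString.endsWith(".scala"))
      .filter(p => new String(Files.readAllBytes(p), "UTF-8").contains(banned))
      .toList
    offenders.foreach(p => println(s"Banned Guava annotation import: $p"))
    if (offenders.nonEmpty) sys.exit(1)
  }
}
```

A Scalastyle regex rule, as Josh suggests below, would achieve the same thing
inside the existing build.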

Best Regards,
Shixiong Zhu

2015-11-09 18:47 GMT-08:00 Ted Yu <yuzhih...@gmail.com>:

> Created https://github.com/apache/spark/pull/9585
>
> Cheers
>
> On Mon, Nov 9, 2015 at 6:39 PM, Josh Rosen <joshro...@databricks.com>
> wrote:
>
>> When we remove this, we should add a style-checker rule to ban the import
>> so that it doesn't get added back by accident.
>>
>> On Mon, Nov 9, 2015 at 6:13 PM, Michael Armbrust <mich...@databricks.com>
>> wrote:
>>
>>> Yeah, we should probably remove that.
>>>
>>> On Mon, Nov 9, 2015 at 5:54 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>>
>>>> If there is no option to let the shell skip processing
>>>> @VisibleForTesting, should the annotation be dropped?
>>>>
>>>> Cheers
>>>>
>>>> On Mon, Nov 9, 2015 at 5:50 PM, Marcelo Vanzin <van...@cloudera.com>
>>>> wrote:
>>>>
>>>>> We've had this in the past when using "@VisibleForTesting" in classes
>>>>> that for some reason the shell tries to process. QueryExecution.scala
>>>>> seems to use that annotation and that was added recently, so that's
>>>>> probably the issue.
>>>>>
>>>>> BTW, if anyone knows how Scala can find a reference to the original
>>>>> Guava class even after shading, I'd really like to know. I've looked
>>>>> several times and never found where the original class name is stored.
>>>>>
>>>>> On Mon, Nov 9, 2015 at 10:37 AM, Zhan Zhang <zzh...@hortonworks.com>
>>>>> wrote:
>>>>> > Hi Folks,
>>>>> >
>>>>> > Has anybody run into the following issue? I use "mvn package -Phive
>>>>> > -DskipTests" to build the package.
>>>>> >
>>>>> > Thanks.
>>>>> >
>>>>> > Zhan Zhang
>>>>> >
>>>>> >
>>>>> >
>>>>> > bin/spark-shell
>>>>> > ...
>>>>> > Spark context available as sc.
>>>>> > error: error while loading QueryExecution, Missing dependency 'bad
>>>>> symbolic
>>>>> > reference. A signature in QueryExecution.class refers to term
>>>>> annotations
>>>>> > in package com.google.common which is not available.
>>>>> > It may be completely missing from the current classpath, or the
>>>>> version on
>>>>> > the classpath might be incompatible with the version used when
>>>>> compiling
>>>>> > QueryExecution.class.', required by
>>>>> >
>>>>> /Users/zzhang/repo/spark/assembly/target/scala-2.10/spark-assembly-1.6.0-SNAPSHOT-hadoop2.2.0.jar(org/apache/spark/sql/execution/QueryExecution.class)
>>>>> > <console>:10: error: not found: value sqlContext
>>>>> >        import sqlContext.implicits._
>>>>> >               ^
>>>>> > <console>:10: error: not found: value sqlContext
>>>>> >        import sqlContext.sql
>>>>> >               ^
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Marcelo
>>>>>
>>>>
>>>
>>
>
