Re: Anybody hit this issue in spark shell?

2015-11-11 Thread Ted Yu
I searched the code base and confirmed that no class from
com.google.common.annotations is used.

However, there are classes from com.google.common in use, e.g.:

import com.google.common.io.{ByteStreams, Files}
import com.google.common.net.InetAddresses

FYI



Re: Anybody hit this issue in spark shell?

2015-11-10 Thread Shixiong Zhu
The Scala compiler stores some metadata in the ScalaSig attribute; see the
following link for an example:

http://stackoverflow.com/questions/10130106/how-does-scala-know-the-difference-between-def-foo-and-def-foo/10130403#10130403

Because maven-shade-plugin doesn't understand the ScalaSig attribute, it
cannot rewrite the references stored there. I'm not sure whether a
Scala-aware variant of `maven-shade-plugin` exists to handle this.

Generally, annotations that will be shaded should not be used in Scala code.
I wonder whether we can surface this issue in the PR build; because the SBT
build doesn't perform shading, it's currently hard for us to catch similar
issues there.

Best Regards,
Shixiong Zhu
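For context on why relocation misses it, Spark shades Guava with
maven-shade-plugin roughly as below (a simplified sketch, not the exact pom;
the shaded package name is illustrative). The plugin rewrites
bytecode-level references, but it treats the ScalaSig payload as opaque
bytes:

```xml
<!-- Simplified sketch of a maven-shade-plugin relocation (not Spark's exact
     pom). The plugin rewrites com.google.common references in bytecode
     (constant pool entries, method descriptors), but the ScalaSig pickle is
     an opaque payload it does not parse, so the original package name
     survives inside it. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <relocation>
        <pattern>com.google.common</pattern>
        <shadedPattern>org.spark-project.guava</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>
```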



Re: Anybody hit this issue in spark shell?

2015-11-10 Thread Ted Yu
In the PR, a new Scalastyle rule was added banning use of @VisibleForTesting.

Similar rules can be added as seen fit.

Cheers
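A ban like this can be expressed with Scalastyle's regex checker; the sketch
below assumes `org.scalastyle.file.RegexChecker` is available, and the id and
message are illustrative, not necessarily what the PR added:

```xml
<!-- Sketch of a Scalastyle rule banning the Guava annotation. -->
<check customId="visiblefortesting" level="error"
       class="org.scalastyle.file.RegexChecker" enabled="true">
  <parameters>
    <parameter name="regex">@VisibleForTesting</parameter>
  </parameters>
  <customMessage>@VisibleForTesting comes from shaded Guava; referencing it
    from Scala code breaks the shaded assembly.</customMessage>
</check>
```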



Re: Anybody hit this issue in spark shell?

2015-11-09 Thread Marcelo Vanzin
We've had this in the past when using "@VisibleForTesting" in classes
that for some reason the shell tries to process. QueryExecution.scala
seems to use that annotation and that was added recently, so that's
probably the issue.

BTW, if anyone knows how Scala can find a reference to the original
Guava class even after shading, I'd really like to know. I've looked
several times and never found where the original class name is stored.
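For what it's worth, the place the name survives (worth verifying against
your Scala version) is a consequence of a change in Scala 2.8: the compiler
emits the pickled signature as a runtime-retained
`scala.reflect.ScalaSignature` annotation whose `bytes` payload is a packed
string constant. A schematic, non-compilable decompiled view:

```java
// Schematic decompiled view (not compilable; payload abbreviated). Since
// Scala 2.8 the compiler carries the pickle as an annotation on the class:
@ScalaSignature(bytes = "\u0006\u0001...")  // packed pickle bytes; the name
    // com.google.common.annotations.VisibleForTesting is encoded in here,
    // invisible to bytecode-level relocation
public class QueryExecution {
    ...
}
```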




-- 
Marcelo

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Anybody hit this issue in spark shell?

2015-11-09 Thread Marcelo Vanzin
On Mon, Nov 9, 2015 at 5:54 PM, Ted Yu wrote:
> If there is no option to let the shell skip processing @VisibleForTesting,
> should the annotation be dropped?

That's what we did last time this showed up.




-- 
Marcelo




Re: Anybody hit this issue in spark shell?

2015-11-09 Thread Michael Armbrust
Yeah, we should probably remove that.



Re: Anybody hit this issue in spark shell?

2015-11-09 Thread Xiu Guo
Hi Zhan:

I hit the exact same problem. I rolled back to commit:

de289bf279e14e47859b5fbcd70e97b9d0759f14

which does not have this problem. I suspect a change merged in the past
four days introduced it.



Re: Anybody hit this issue in spark shell?

2015-11-09 Thread Josh Rosen
When we remove this, we should add a style-checker rule to ban the import
so that it doesn't get added back by accident.



Re: Anybody hit this issue in spark shell?

2015-11-09 Thread Ted Yu
Created https://github.com/apache/spark/pull/9585

Cheers



Re: Anybody hit this issue in spark shell?

2015-11-09 Thread Ted Yu
If there is no option to let the shell skip processing @VisibleForTesting,
should the annotation be dropped?

Cheers



Re: Anybody hit this issue in spark shell?

2015-11-09 Thread Zhan Zhang
Thanks, Ted. I am using the latest master branch. I will give your build
command a try.

Thanks.

Zhan Zhang





Re: Anybody hit this issue in spark shell?

2015-11-09 Thread Ted Yu
Which branch did you perform the build with ?

I used the following command yesterday:
mvn -Phive -Phive-thriftserver -Pyarn -Phadoop-2.4 -Dhadoop.version=2.7.0
package -DskipTests

Spark shell was working.

I was building with the latest master branch.



Anybody hit this issue in spark shell?

2015-11-09 Thread Zhan Zhang
Hi Folks,

Does anybody meet the following issue? I use "mvn package -Phive -DskipTests"
to build the package.

Thanks.

Zhan Zhang



bin/spark-shell
...
Spark context available as sc.
error: error while loading QueryExecution, Missing dependency 'bad symbolic 
reference. A signature in QueryExecution.class refers to term annotations
in package com.google.common which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling 
QueryExecution.class.', required by 
/Users/zzhang/repo/spark/assembly/target/scala-2.10/spark-assembly-1.6.0-SNAPSHOT-hadoop2.2.0.jar(org/apache/spark/sql/execution/QueryExecution.class)
<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql
              ^


Re: Anybody hit this issue in spark shell?

2015-11-09 Thread Ted Yu
I backtracked to:
ef362846eb448769bcf774fc9090a5013d459464

The issue was still there.

FYI

>