[ https://issues.apache.org/jira/browse/SPARK-2330?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14051280#comment-14051280 ]

Andrea Ferretti commented on SPARK-2330:
----------------------------------------

I have tried this by pulling from git and I still have the same issues. I am 
not sure why you cannot reproduce it. What do you get when you open a shell and 
paste the lines above?

In any case, it does seem to be a duplicate of the issue you linked.
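For anyone hitting this in the meantime, the usual workaround for REPL wrapping problems is to compile all the definitions as a single unit, either via :paste or by putting them in one object. A sketch of the second option (note: the helper is named describe rather than print to avoid shadowing Predef.print, and nextInt is assumed to come from scala.util.Random):

```scala
// Defining everything in one object gives the definitions a single,
// stable enclosing scope, so the shell's per-line wrapping cannot
// produce two incompatible copies of the same type.
object Defs {
  case class Foo(x: Int)
  def describe(f: Foo): Int = f.x

  sealed trait Currency
  case object EUR extends Currency
  case object USD extends Currency

  // nextInt(2) returns 0 or 1 (assumes scala.util.Random)
  def nextCurrency: Currency = scala.util.Random.nextInt(2) match {
    case 0 => EUR
    case _ => USD
  }
}

import Defs._

println(describe(Foo(3)))                     // prints 3
println(nextCurrency.isInstanceOf[Currency])  // prints true
```

Pasting the whole object in one go (or loading it with :load) sidesteps the "found Foo, required Foo" mismatch, since Foo is only ever defined once.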

> Spark shell has weird scala semantics
> -------------------------------------
>
>                 Key: SPARK-2330
>                 URL: https://issues.apache.org/jira/browse/SPARK-2330
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 0.9.1, 1.0.0
>         Environment: Ubuntu 14.04 with spark-x.x.x-bin-hadoop2
>            Reporter: Andrea Ferretti
>              Labels: scala, shell
>
> Normal scala expressions are interpreted in a strange way in the spark shell. 
> For instance
> {noformat}
> case class Foo(x: Int)
> def print(f: Foo) = f.x
> val f = Foo(3)
> print(f)
> <console>:24: error: type mismatch;
>  found   : Foo
>  required: Foo
> {noformat}
> As another example
> {noformat}
> trait Currency
> case object EUR extends Currency
> case object USD extends Currency
> import scala.util.Random.nextInt
> def nextCurrency: Currency = nextInt(2) match {
>   case 0 => EUR
>   case _ => USD
> }
> <console>:22: error: type mismatch;
>  found   : EUR.type
>  required: Currency
>          case 0 => EUR
> <console>:24: error: type mismatch;
>  found   : USD.type
>  required: Currency
>          case _ => USD
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.2#6252)
