[ https://issues.apache.org/jira/browse/SPARK-2330?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michael Armbrust resolved SPARK-2330.
-------------------------------------
    Resolution: Duplicate

Going to close this as a duplicate. We should have a fix for the original issue soon.

> Spark shell has weird Scala semantics
> -------------------------------------
>
>                 Key: SPARK-2330
>                 URL: https://issues.apache.org/jira/browse/SPARK-2330
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 0.9.1, 1.0.0
>         Environment: Ubuntu 14.04 with spark-x.x.x-bin-hadoop2
>            Reporter: Andrea Ferretti
>              Labels: scala, shell
>
> Normal Scala expressions are interpreted in a strange way in the Spark shell. For instance:
> {noformat}
> case class Foo(x: Int)
> def print(f: Foo) = f.x
> val f = Foo(3)
> print(f)
> <console>:24: error: type mismatch;
>  found   : Foo
>  required: Foo
> {noformat}
> For another example:
> {noformat}
> trait Currency
> case object EUR extends Currency
> case object USD extends Currency
> def nextCurrency: Currency = nextInt(2) match {
>   case 0 => EUR
>   case _ => USD
> }
> <console>:22: error: type mismatch;
>  found   : EUR.type
>  required: Currency
>        case 0 => EUR
> <console>:24: error: type mismatch;
>  found   : USD.type
>  required: Currency
>        case _ => USD
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.2#6252)
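Until the fix for the underlying issue lands, the workaround usually suggested for this class of REPL error is to compile the interdependent definitions as a single unit, e.g. via the REPL's {{:paste}} mode, so the case class and the code referring to it see the same generated class. A sketch of such a session (assuming a standard spark-shell, using the first example from the report):

{noformat}
scala> :paste
// Entering paste mode (ctrl-D to finish)

case class Foo(x: Int)
def print(f: Foo) = f.x
val f = Foo(3)
print(f)

// press ctrl-D: all four lines are compiled together,
// so print(f) type-checks and no "found: Foo / required: Foo"
// mismatch is reported
{noformat}

Defining the types inside a wrapping object sent to the shell in one block has the same effect.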