[ https://issues.apache.org/jira/browse/SPARK-1199?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13980756#comment-13980756 ]

Piotr Kołaczkowski edited comment on SPARK-1199 at 4/25/14 7:26 AM:
--------------------------------------------------------------------

+1 to fixing this. We're affected as well. Classes defined in the shell are inner
classes, and therefore cannot be easily instantiated by reflection. They need an
additional reference to the outer object, which is non-trivial to obtain (is it
obtainable at all without modifying Spark?).

{noformat}
scala> class Test
defined class Test

scala> new Test
res5: Test = $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$Test@4f755864

// good, so there is a default constructor and we can call it through reflection?
// not so fast...
scala> classOf[Test].getConstructor()
java.lang.NoSuchMethodException: $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$Test.<init>()
...

scala> classOf[Test].getConstructors()(0)
res7: java.lang.reflect.Constructor[_] = public $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$Test($iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC)
{noformat}
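
For illustration, a minimal sketch of how that single-argument constructor could in principle be invoked, assuming an existing instance is available to copy the outer reference from. The $outer field name is a scalac implementation detail rather than a public API, so treat this as untested guesswork, not a recommended solution:

{code}
// Hypothetical sketch: recover the synthetic outer $iw instance from an
// existing object and pass it to the generated constructor.
val existing = new Test
val outerField = existing.getClass.getDeclaredField("$outer") // scalac-generated field
outerField.setAccessible(true)
val outer = outerField.get(existing)

val ctor = classOf[Test].getConstructors()(0)
val fresh = ctor.newInstance(outer).asInstanceOf[Test]
{code}

Even if this works, it still requires an instance created in the shell to take the outer reference from, which is exactly the part that is non-trivial to obtain in general.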

The workaround does not work for us.



> Type mismatch in Spark shell when using case class defined in shell
> -------------------------------------------------------------------
>
>                 Key: SPARK-1199
>                 URL: https://issues.apache.org/jira/browse/SPARK-1199
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 0.9.0
>            Reporter: Andrew Kerr
>            Priority: Critical
>             Fix For: 1.1.0
>
>
> Define a class in the shell:
> {code}
> case class TestClass(a:String)
> {code}
> and an RDD:
> {code}
> val data = sc.parallelize(Seq("a")).map(TestClass(_))
> {code}
> Define a function on it and map over the RDD:
> {code}
> def itemFunc(a:TestClass):TestClass = a
> data.map(itemFunc)
> {code}
> Error:
> {code}
> <console>:19: error: type mismatch;
>  found   : TestClass => TestClass
>  required: TestClass => ?
>               data.map(itemFunc)
> {code}
> Similarly with a mapPartitions:
> {code}
> def partitionFunc(a:Iterator[TestClass]):Iterator[TestClass] = a
> data.mapPartitions(partitionFunc)
> {code}
> {code}
> <console>:19: error: type mismatch;
>  found   : Iterator[TestClass] => Iterator[TestClass]
>  required: Iterator[TestClass] => Iterator[?]
> Error occurred in an application involving default arguments.
>               data.mapPartitions(partitionFunc)
> {code}
> The behavior is the same whether in local mode or on a cluster.
> This isn't specific to RDDs. A Scala collection in the Spark shell has the 
> same problem.
> {code}
> scala> Seq(TestClass("foo")).map(itemFunc)
> <console>:15: error: type mismatch;
>  found   : TestClass => TestClass
>  required: TestClass => ?
>               Seq(TestClass("foo")).map(itemFunc)
>                                         ^
> {code}
> When run in the Scala console (not the Spark shell) there are no type 
> mismatch errors.


