Joe Near created SPARK-9621:
-------------------------------

             Summary: Closure inside RDD doesn't properly close over environment
                 Key: SPARK-9621
                 URL: https://issues.apache.org/jira/browse/SPARK-9621
             Project: Spark
          Issue Type: Bug
    Affects Versions: 1.4.1
         Environment: Ubuntu 15.04, spark-1.4.1-bin-hadoop2.6 package
            Reporter: Joe Near
I expect the following:

    case class MyTest(i: Int)
    val tv = MyTest(1)
    val res = sc.parallelize(Array((t: MyTest) => t == tv)).first()(tv)

to evaluate to "true". When I type this into spark-shell, it is "false". It seems the closure's captured environment is changed somehow when it is serialized and deserialized.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
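For contrast, here is a minimal sketch of the behavior the reporter expects, using plain JDK serialization rather than Spark's closure serializer (so it does not reproduce the bug, which likely involves how spark-shell compiles REPL-defined classes). The `ClosureRoundTrip`, `SerPredicate`, and `MyTest` names are illustrative, with `MyTest` standing in for the Scala case class: a serializable closure captures a value, is round-tripped through serialization, and the deserialized copy still compares equal structurally.

```java
import java.io.*;
import java.util.function.Predicate;

public class ClosureRoundTrip {
    // Serializable predicate type, analogous to the Scala function in the report.
    interface SerPredicate<T> extends Predicate<T>, Serializable {}

    // Value class with structural equality, analogous to case class MyTest(i: Int).
    static class MyTest implements Serializable {
        final int i;
        MyTest(int i) { this.i = i; }
        @Override public boolean equals(Object o) {
            return o instanceof MyTest && ((MyTest) o).i == i;
        }
        @Override public int hashCode() { return i; }
    }

    // Serialize an object to bytes and read it back, as a shuffle/task dispatch would.
    static <T> T roundTrip(T obj) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
                oos.writeObject(obj);
            }
            try (ObjectInputStream ois =
                     new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
                @SuppressWarnings("unchecked")
                T copy = (T) ois.readObject();
                return copy;
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        MyTest tv = new MyTest(1);
        SerPredicate<MyTest> pred = t -> t.equals(tv); // closure capturing tv
        SerPredicate<MyTest> copy = roundTrip(pred);   // serialize + deserialize
        System.out.println(copy.test(tv));             // prints "true" here
    }
}
```

In the same JVM this prints "true": the deserialized closure carries a serialized copy of `tv`, and structural equality makes the comparison succeed. The JIRA report observes "false" under spark-shell, which is what makes the behavior surprising.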