Kernel Force created SPARK-40026:
------------------------------------

             Summary: spark-shell cannot instantiate an object whose class is defined dynamically in the REPL
                 Key: SPARK-40026
                 URL: https://issues.apache.org/jira/browse/SPARK-40026
             Project: Spark
          Issue Type: Question
          Components: Spark Shell
    Affects Versions: 3.0.3, 2.4.8
         Environment: Spark 2.3.x ~ 3.0.x
Scala 2.11.x ~ 2.13.x
Java 8
            Reporter: Kernel Force


spark-shell throws {{InstantiationException}} (caused by {{NoSuchMethodException}}) when I define a class in the REPL and then call {{newInstance}} on it via reflection:

{code:scala}
Spark context available as 'sc' (master = yarn, app id = application_1656488084960_0162).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.0.3
      /_/

Using Scala version 2.12.10 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_141)
Type in expressions to have them evaluated.
Type :help for more information.

scala> class Demo {
     |   def demo(s: String): Unit = println(s)
     | }
defined class Demo

scala> classOf[Demo].newInstance().demo("OK")
java.lang.InstantiationException: Demo
  at java.lang.Class.newInstance(Class.java:427)
  ... 47 elided
Caused by: java.lang.NoSuchMethodException: Demo.<init>()
  at java.lang.Class.getConstructor0(Class.java:3082)
  at java.lang.Class.newInstance(Class.java:412)
  ... 47 more
{code}

But the same code works fine in the native Scala REPL:

{code:scala}
Welcome to Scala 2.12.10 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131).
Type in expressions for evaluation. Or try :help.

scala> class Demo {
     |   def demo(s: String): Unit = println(s)
     | }
defined class Demo

scala> classOf[Demo].newInstance().demo("OK")
OK
{code}

What is the difference between the spark-shell REPL and the native Scala REPL? My guess is that the {{Demo}} class is being treated as an inner class in the spark-shell REPL. But then, how can the problem be solved?



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
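The inner-class guess above can be reproduced outside Spark entirely. This is a minimal standalone sketch (plain Scala, no REPL) of how the JVM treats a class nested inside another instance: the names {{Outer}}, {{Demo}} and the helper methods are illustrative, not part of Spark or the Scala REPL. For an inner class the compiler emits no zero-arg constructor, only one taking the enclosing instance, so {{Class.newInstance}} fails with the same {{InstantiationException}}, while reflectively passing the outer instance succeeds:

```scala
// Hypothetical demonstration: Outer/Demo mimic the wrapper object the
// spark-shell REPL generates around user-defined classes.
object InnerClassDemo {
  class Outer {
    class Demo {
      def demo(s: String): String = s
    }
  }

  // Class.newInstance needs a zero-arg constructor; an inner class has none,
  // so this throws InstantiationException, as in the spark-shell trace.
  def zeroArgFails(): Boolean =
    try { classOf[Outer#Demo].newInstance(); false }
    catch { case _: InstantiationException => true }

  // Supplying the enclosing instance explicitly via reflection works.
  def instantiateWithOuter(): String = {
    val outer = new Outer
    val ctor  = classOf[Outer#Demo].getDeclaredConstructor(classOf[Outer])
    ctor.newInstance(outer).demo("OK")
  }

  def main(args: Array[String]): Unit = {
    // Show the real constructor signature: Demo(Outer), not Demo().
    classOf[Outer#Demo].getDeclaredConstructors.foreach(println)
    println("zero-arg newInstance fails: " + zeroArgFails())
    println(instantiateWithOuter())
  }
}
```

If spark-shell's class-based wrapping is indeed the cause, the same pattern (looking up the generated constructor with {{getDeclaredConstructors}} and passing the wrapper instance) would be one way around it.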