I have found a source explaining how Spark compiles REPL code and dynamically loads it into the distributed executors: https://ardoris.wordpress.com/2014/03/30/how-spark-does-class-loading/
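A minimal way to see this mechanism for yourself is to look up the REPL class-server URI from inside a driver. The sketch below assumes Spark is on the classpath; note that `spark.repl.class.uri` is only set when the driver is the Spark REPL (in a plain application it is absent):

```scala
import org.apache.spark.sql.SparkSession

object ReplClassUriCheck {
  def main(args: Array[String]): Unit = {
    // Plain application driver; when run inside spark-shell you would
    // instead reuse the existing session via SparkSession.builder().getOrCreate()
    val spark = SparkSession.builder()
      .appName("repl-class-uri-check")
      .master("local[*]")
      .getOrCreate()

    // In a spark-shell driver this returns something like
    // Some("spark://xxx:41827/classes"); here it will be None.
    val uri = spark.conf.getOption("spark.repl.class.uri")
    println(uri.getOrElse("spark.repl.class.uri is not set (not a REPL driver)"))

    spark.stop()
  }
}
```

Executors consult exactly this configuration value to decide where ExecutorClassLoader should fetch REPL-compiled classes from.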
If you run the Spark REPL, you can find a Spark configuration entry like this:

    "spark.repl.class.uri": "spark://xxx:41827/classes"

A REPL class fetch server runs in the Spark REPL driver and serves the classes compiled by the REPL's Spark interpreter at this URI. The distributed executors fetch the classes from the class fetch server at the "spark.repl.class.uri" address and load them via ExecutorClassLoader.

I have also researched the Spark and Zeppelin source code in order to use only the Spark interpreter, not the whole REPL, and I have picked up some code from Zeppelin and Spark to run a Spark interpreter in my application. In my application, an embedded HTTP server handles Spark code submitted by users; the code is interpreted dynamically and executed on the distributed executors, just as the Spark REPL does. It works for now!!

There is still more research to do for my application, for instance how to handle multiple users with individual Spark sessions, etc.

Cheers,

- Kidong Lee
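The embedding approach described above can be sketched roughly as follows. This is a simplified illustration, not the actual Zeppelin or Spark code: it drives the Scala compiler's IMain interpreter directly (Scala 2.12 API; the constructor changed in 2.13) and compiles user code into an output directory, which is the kind of directory spark-shell exposes to executors via the class fetch server (`spark.repl.class.outputDir` in Spark's REPL):

```scala
import java.nio.file.Files
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

object EmbeddedInterpreterSketch {
  def main(args: Array[String]): Unit = {
    // Directory that receives the compiled classes; in spark-shell a
    // directory like this is served to executors by the class fetch server.
    val outputDir = Files.createTempDirectory("repl-classes").toFile

    val settings = new Settings()
    settings.usejavacp.value = true // use the application's classpath
    settings.outputDirs.setSingleOutput(outputDir.getAbsolutePath)

    // Scala 2.12-style embedded interpreter, as spark-shell/Zeppelin wrap it.
    val interpreter = new IMain(settings)

    // "User code" arriving, e.g., over the embedded HTTP server:
    interpreter.interpret("""case class Person(name: String)""")
    interpreter.interpret("""println(Person("kidong"))""")

    interpreter.close()
  }
}
```

In a real setup, each interpreted snippet's classes land in `outputDir`, and pointing Spark's REPL class-serving machinery at that directory is what lets ExecutorClassLoader on the executors resolve the freshly compiled classes.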