Re: Using P4J Plugins with Spark

2020-04-21 Thread Todd Nist
You may want to make sure you include the P4J jar and your plugin jars as part of `spark-submit --jars`, so that both the driver and the executors have access to them. If HDFS is not an option, you could create a common mount point on each of the executor nodes so they have access to the classes.
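For what it's worth, a rough sketch of what such an invocation might look like, assuming P4J picks up the plugins directory from the `p4j.pluginsDir` system property as in your plain java command. The class name, jar names, and mount point below are hypothetical; the property is forwarded to the driver and executor JVMs via the standard extraJavaOptions settings:

  # hypothetical class name, jar names, and mount point; adjust to your layout
  spark-submit \
    --class com.example.YourMain \
    --jars /mnt/plugins/p4j.jar,/mnt/plugins/my-plugin.jar \
    --conf "spark.driver.extraJavaOptions=-Dp4j.pluginsDir=/mnt/plugins" \
    --conf "spark.executor.extraJavaOptions=-Dp4j.pluginsDir=/mnt/plugins" \
    /path/to/your-app.jar

Here /mnt/plugins stands in for the common mount point mentioned above. Jars passed via --jars are also shipped to the executors and added to their classpath, so depending on how P4J discovers plugins you may be able to rely on that instead of a shared mount.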

Using P4J Plugins with Spark

2020-04-21 Thread Shashanka Balakuntala
Hi users, I'm a bit of a newbie to the Spark infrastructure and I have a small doubt. I have a Maven project with plugins generated separately in a folder, and the normal java command to run it is as follows: `java -Dp4j.pluginsDir=./plugins -jar /path/to/jar`. Now when I run this program in local with