Hi,

I've added a custom Source and Sink to my application jar, and I've found a way to get a static, fixed metrics.properties onto the standalone cluster nodes. When I launch my application, I pass that fixed path: spark.metrics.conf="/fixed-path/to/metrics.properties". Despite my custom source/sink being in my code/fat-jar, I get a ClassNotFoundException for CustomSink.
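For reference, the relevant lines of my metrics.properties look roughly like this (the class and package names below are just stand-ins for my real ones):

    *.sink.custom.class=org.apache.spark.metrics.sink.CustomSink
    *.source.custom.class=org.apache.spark.metrics.source.CustomSource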
My fat-jar (with the custom Source/Sink code in it) is on HDFS with read access for all. Since the executors can't find the custom Source/Sink in my application fat-jar, here's everything I've already tried setting:

1. spark.executor.extraClassPath = hdfs://path/to/fat-jar
2. spark.executor.extraClassPath = fat-jar-name.jar
3. spark.executor.extraClassPath = ./fat-jar-name.jar
4. spark.executor.extraClassPath = ./
5. spark.executor.extraClassPath = /dir/on/cluster/* (the * isn't at the file level; there are more directories below it, and I have no way of knowing the random application-id or driver-id in order to give an absolute path before launching the app)

This is how the executors seem to get initialized in this case (please correct me if I am wrong):

1. The driver says: here's the jar location (hdfs://../fat-jar.jar) and here are some properties such as spark.executor.memory.
2. N executors spin up on the cluster (depending on the configuration).
3. Each executor starts downloading hdfs://../fat-jar.jar but initializes the metrics system in the meantime (? - not sure of this step).
4. The metrics system looks for the custom Sink/Source classes, since they're named in metrics.properties, before the fat-jar (which actually contains those classes) has finished downloading. (This is my hypothesis.)
5. ClassNotFoundException - CustomSink not found!

Is my understanding correct? And is there anything else I can try? If anyone has experience with custom sources/sinks, any help would be appreciated.

My SO question: https://stackoverflow.com/questions/39763364/metrics-system-not-recognizing-custom-source-sink-in-application-jar

Thanks,
KP
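P.S. In case it helps, here's roughly what my sink skeleton looks like (names are stand-ins, and my real sink does more). I declare it under org.apache.spark.metrics.sink because the Sink trait is private[spark], and as far as I can tell the three-argument constructor below is the signature the metrics system reflects on in this Spark version:

    package org.apache.spark.metrics.sink

    import java.util.Properties
    import java.util.concurrent.TimeUnit

    import com.codahale.metrics.{ConsoleReporter, MetricRegistry}

    import org.apache.spark.SecurityManager

    // Lives under org.apache.spark.metrics.sink because the Sink trait
    // is private[spark]; the (Properties, MetricRegistry, SecurityManager)
    // constructor is what the metrics system instantiates via reflection.
    class CustomSink(val property: Properties,
                     val registry: MetricRegistry,
                     securityMgr: SecurityManager) extends Sink {

      // Stand-in reporter; my real sink reports elsewhere.
      private val reporter = ConsoleReporter.forRegistry(registry).build()

      override def start(): Unit = reporter.start(10, TimeUnit.SECONDS)

      override def stop(): Unit = reporter.stop()

      override def report(): Unit = reporter.report()
    }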