Github user sancyx commented on the issue:

    https://github.com/apache/zeppelin/pull/2637
  
    Hi @naveenkumargp, there was a refactor of the interpreter packaging 
which caused the ClassNotFound problems. Previously there was a big jar that 
also contained the interpreter class, but it no longer exists. We've updated 
the PR so that all jar files from the local repo are enumerated and passed to 
spark-submit via the `--jars` option, which is probably a better approach. 
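    As a rough sketch of the enumeration idea (the directory path and jar names here are illustrative, not taken from the PR), the jars can be collected into the comma-separated form that `spark-submit --jars` expects:

    ```shell
    # Sketch: gather all jars from a local repo directory and join them
    # with commas, as required by spark-submit's --jars option.
    # The /tmp/local-repo path and jar names are made up for illustration.
    mkdir -p /tmp/local-repo && touch /tmp/local-repo/a.jar /tmp/local-repo/b.jar
    JARS=$(printf '%s,' /tmp/local-repo/*.jar); JARS=${JARS%,}
    echo "$JARS"
    # spark-submit --jars "$JARS" ... would then receive the full list
    ```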
    With regards to deploy mode: I just tried adding the `spark.submit.deployMode` 
property via the UI and it worked for me. We set it to `cluster` by default 
in our custom interpreter.json, along with the master property set to 
`k8s://https://kubernetes:443`; however, we intentionally didn't include these 
in the patch, since they are optional.
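    For reference, the relevant defaults in our custom interpreter.json look roughly like this (a sketch only; the surrounding structure of interpreter.json varies between Zeppelin versions, and only the two property values below come from this discussion):

    ```json
    {
      "properties": {
        "master": "k8s://https://kubernetes:443",
        "spark.submit.deployMode": "cluster"
      }
    }
    ```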
    We have been using this patch in our deployments for about half a year now 
and it works fine for us; we would be glad to contribute it to the community. 
Please try this latest version and see if it works for you.

