For debugging, you can refer to these two threads:

http://apache-spark-user-list.1001560.n3.nabble.com/How-do-you-hit-breakpoints-using-IntelliJ-In-functions-used-by-an-RDD-td12754.html

http://mail-archives.apache.org/mod_mbox/spark-user/201410.mbox/%3ccahuq+_ygfioj2aa3e2zsh7zfsv_z-wsorhvbipahxjlm2fj...@mail.gmail.com%3E

If you put the logs in the map function and run your code in Standalone
mode, those logs will end up in your worker directory; they will not be
displayed in the driver's console.
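For example, here is a minimal sketch (assuming you run it in spark-shell
against a standalone master; the RDD and the message are only illustrative).
The println inside map() executes on the executor, so its output lands in the
worker's work directory, not in the shell/driver console:

  // Run inside spark-shell on a standalone cluster.
  // The println below runs on the worker; look for its output in the
  // executor's stdout under the worker's work directory
  // (e.g. work/<app-id>/<executor-id>/stdout).
  val nums = sc.parallelize(1 to 3)
  val doubled = nums.map { x =>
    println("processing " + x)  // appears in the executor's stdout on the worker
    x * 2
  }
  doubled.collect()  // triggers the job; the driver console only shows the result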

For adding jars while launching spark-shell, you could add those jars to
the SPARK_CLASSPATH in the conf/spark-env.sh file, or you could call
sc.addJar("/path/to/jar")
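For example (a minimal sketch; the jar path and name are hypothetical):

  # in conf/spark-env.sh, before launching spark-shell
  export SPARK_CLASSPATH=/path/to/my-custom.jar

  // or from the running shell (or your driver program)
  sc.addJar("/path/to/my-custom.jar")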

Thanks
Best Regards

On Wed, Nov 19, 2014 at 7:58 PM, Mukesh Jha <mukh....@gmail.com> wrote:

> Hello experts,
>
> Is there an easy way to debug a spark java application?
>
> I'm putting debug logs in the map function, but there aren't any logs on
> the console.
>
> Also, can I include my custom jars while launching spark-shell and do my
> PoC there?
>
> This might be a naive question but any help here is appreciated.
>
