Hi,

Yes, that's possible -- I'm doing it every day in local and standalone modes.

Just set SPARK_PRINT_LAUNCH_COMMAND=1 before any Spark command, e.g.
spark-submit or spark-shell, to see the command used to start it:

$ SPARK_PRINT_LAUNCH_COMMAND=1 ./bin/spark-shell

The SPARK_PRINT_LAUNCH_COMMAND environment variable controls whether
the Spark launch command is printed out to the standard error output,
i.e. System.err, or not.
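Once you have the printed java command, you can re-run it yourself with
the debug agent attached. A minimal sketch (the launch command below is
a made-up illustration, not real output; the string substitution assumes
bash):

```shell
# Hypothetical captured launch command (illustrative only -- yours will
# be the line Spark printed to stderr).
LAUNCH_CMD='java -cp /opt/spark/conf:/opt/spark/jars/* org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main spark-shell'

# The JDWP agent option to inject (see below for what it does).
JDWP_OPT="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"

# Insert the agent option right after the java executable (bash
# pattern substitution on the first match of "java ").
DEBUG_CMD="${LAUNCH_CMD/java /java $JDWP_OPT }"
echo "$DEBUG_CMD"
```

Running $DEBUG_CMD instead of the original command starts the same JVM,
but suspended and listening for a debugger on port 5005.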

Once you've got the command, add the following command-line option to
enable the JDWP agent and have the JVM suspended (suspend=y) until a
remote debugging client connects (on port 5005):

-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005

In IntelliJ IDEA, define a new Remote debug configuration (pointing at
localhost:5005) and press Debug. You're done.
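If you'd rather not capture and edit the raw launch command, in local
mode you can also hand the agent option to the driver JVM through
Spark's spark.driver.extraJavaOptions property. A sketch (the property
is standard, but double-check it against your Spark version's docs):

```shell
# Same JDWP agent option as above.
JDWP_OPT="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"

# Build the spark-shell invocation; run it from your Spark home directory.
CMD="./bin/spark-shell --conf spark.driver.extraJavaOptions=$JDWP_OPT"
echo "$CMD"
```

With suspend=y the shell won't start until the debugger attaches, so
connect from the IDE right after launching.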

https://www.jetbrains.com/idea/help/debugging-2.html might help.

Pozdrawiam,
Jacek

--
Jacek Laskowski | https://medium.com/@jaceklaskowski/ |
http://blog.jaceklaskowski.pl
Mastering Spark https://jaceklaskowski.gitbooks.io/mastering-apache-spark/
Follow me at https://twitter.com/jaceklaskowski
Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski


On Sun, Nov 29, 2015 at 5:18 PM, Masf <masfwo...@gmail.com> wrote:
> Hi
>
> Is it possible to debug spark locally with IntelliJ or another IDE?
>
> Thanks
>
> --
> Regards.
> Miguel Ángel

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
