spark-submit does a lot of configuration under the covers (classpaths, etc.)
so that pyspark can find the Spark JARs and other dependencies. I am not sure
how you can run things directly from the PyCharm IDE; others in the community
may be able to answer. For now, the main way to run pyspark code is through
spark-submit, or through pyspark (which uses spark-submit underneath).
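One workaround that may help here (a sketch, not an official recommendation): pyspark reads the PYSPARK_SUBMIT_ARGS environment variable when it launches the JVM gateway, so you can set it in the IDE's run configuration instead of hard-coding 'spark.jars' in the program. The jar path below is a placeholder; substitute your actual Kafka streaming assembly jar.

```python
import os

# Placeholder path -- point this at your real Kafka assembly jar.
kafka_jar = "/opt/jars/spark-streaming-kafka-assembly_2.10-1.4.0.jar"

# pyspark's gateway launcher reads PYSPARK_SUBMIT_ARGS, so setting it
# in PyCharm's run configuration (Environment variables field) mimics
# passing --jars to spark-submit, with no change to the program itself.
# The trailing "pyspark-shell" token is required by pyspark's launcher.
os.environ["PYSPARK_SUBMIT_ARGS"] = "--jars {} pyspark-shell".format(kafka_jar)

print(os.environ["PYSPARK_SUBMIT_ARGS"])
```

Because the variable is set in the run configuration rather than in streaming.py, the same source should still work unmodified when submitted to Mesos via spark-submit.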

On Fri, Jul 10, 2015 at 6:28 AM, blbradley <bradleytas...@gmail.com> wrote:

> Hello,
>
> I'm trying to debug a PySpark app with Kafka Streaming in PyCharm. However,
> PySpark cannot find the jar dependencies for Kafka Streaming without
> editing
> the program. I can temporarily use SparkConf to set 'spark.jars', but I'm
> using Mesos for production and don't want to edit my program every time I
> want to debug. I'd like to find a way to debug without editing the source.
>
> Here's what my PyCharm debug execution command looks like:
>
> /home/brandon/.pyenv/versions/coinspark/bin/python2.7
> /opt/pycharm-community/helpers/pydev/pydevd.py --multiproc --client
> 127.0.0.1 --port 59042 --file
> /home/brandon/src/coins/coinspark/streaming.py
>
> I might be able to use spark-submit as the command PyCharm runs, but I'm
> not sure if that will work with the debugger.
>
> Thoughts?
>
> Cheers!
> Brandon Bradley
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Debug-Spark-Streaming-in-PyCharm-tp23766.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>