I think I figured it out.  I am playing around with the Cassandra connector
and I had a method that inserted some data into a locally-running Cassandra
instance, but I forgot to close the Cluster object.  That left a
non-daemon thread running, which kept the process from exiting.  Nothing to
see here, move along.  :)
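For anyone hitting the same hang: the JVM only exits when all non-daemon threads have finished, so a client object that spawns non-daemon worker threads will keep spark-submit alive after main() returns unless it is closed. Here is a minimal sketch of the effect, using a hypothetical FakeCluster stand-in (not the real Cassandra driver class) that behaves the same way, and the try-with-resources pattern that guarantees cleanup:

```java
public class DriverExitDemo {
    // Stand-in for a client like the Cassandra driver's Cluster:
    // it owns a non-daemon worker thread that keeps the JVM alive
    // until close() stops it. Hypothetical, for illustration only.
    static class FakeCluster implements AutoCloseable {
        final Thread worker = new Thread(() -> {
            try {
                Thread.sleep(Long.MAX_VALUE); // pretend event/I-O loop
            } catch (InterruptedException e) {
                // interrupted by close(): fall through and terminate
            }
        });

        FakeCluster() {
            // threads are non-daemon by default, so the JVM waits on them
            worker.start();
        }

        @Override
        public void close() throws InterruptedException {
            worker.interrupt();
            worker.join(); // wait until the worker has actually stopped
        }
    }

    public static void main(String[] args) throws Exception {
        // try-with-resources calls close() on exit, so no non-daemon
        // thread is left behind and the process can terminate normally
        try (FakeCluster cluster = new FakeCluster()) {
            // ... insert data here ...
        }
        // main() returns and the JVM exits
    }
}
```

With the real driver the same idea applies: close the Cluster (and Session) before main() returns, e.g. in a finally block or try-with-resources.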


On Sat, May 2, 2015 at 2:44 PM Mohammed Guller <moham...@glassbeam.com>
wrote:

>  No, you don’t need to do anything special. Perhaps your application is
> getting stuck somewhere? If you can share your code, someone may be able to
> help.
>
>
>
> Mohammed
>
>
>
> *From:* James Carman [mailto:ja...@carmanconsulting.com]
> *Sent:* Friday, May 1, 2015 5:53 AM
> *To:* user@spark.apache.org
> *Subject:* Exiting "driver" main() method...
>
>
>
> In all the examples, it seems that the Spark application doesn't really do
> anything special in order to exit.  When I run my application, however, the
> spark-submit script just "hangs" there at the end.  Is there something
> special I need to do to get that thing to exit normally?
>