You should check which threads are still active in your app.
Executors.newFixedThreadPool creates non-daemon threads by default, and
any live non-daemon thread will prevent the JVM from exiting.

spark.stop() should have stopped the Spark jobs running in the other
threads, at least. But if something is blocking one of those threads, or
if something somewhere is creating a non-daemon thread that stays alive,
you'll see it in the thread list.
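
For reference, here's a minimal sketch (not from your app, just the
plain JDK Thread API) that prints the live non-daemon threads; you
could run it after spark.stop(), or take a thread dump with
jstack <pid> instead:

'''
// Hedged sketch: list the live non-daemon threads, i.e. the ones
// that can keep the JVM from exiting.
import scala.collection.JavaConverters._

Thread.getAllStackTraces.keySet.asScala
  .filter(t => !t.isDaemon && t.isAlive)
  .foreach(t => println(s"Still alive: ${t.getName} (${t.getState})"))
'''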

Or you can force the JVM to quit with sys.exit.
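
Something like this at the end of your block, assuming nothing else
needs to run afterwards (sys.exit wraps System.exit, which runs
shutdown hooks but does not wait for non-daemon threads):

'''
// Hedged sketch: force the JVM down once Spark has cleaned up.
spark.stop()
sys.exit(0)  // wraps System.exit(0)
'''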

On Tue, Jan 15, 2019 at 1:30 PM Pola Yao <pola....@gmail.com> wrote:
>
> I submitted a Spark job through the ./spark-submit command; the code executed
> successfully, but the application got stuck when trying to quit Spark.
>
> My code snippet:
> '''
> import java.util.concurrent.Executors
>
> import scala.concurrent.{Await, ExecutionContext, Future}
> import scala.concurrent.duration._
>
> import org.apache.spark.sql.SparkSession
>
> val spark = SparkSession.builder.master(...).getOrCreate
>
> val pool = Executors.newFixedThreadPool(3)
> implicit val xc = ExecutionContext.fromExecutorService(pool)
>
> // train1/train2/train3 are functions returning Futures that wrap up
> // data reading, feature engineering and machine learning steps
> val taskList = List(train1, train2, train3)
> val results = Await.result(Future.sequence(taskList), 20.minutes)
>
> println("Shutting down pool and executor service")
> pool.shutdown()
> xc.shutdown()  // redundant but harmless: xc wraps the same pool
>
> println("Exiting spark")
> spark.stop()
> '''
>
> After I submitted the job, from the terminal I could see the code being
> executed and printing "Exiting spark"; however, after printing that line it
> never exited Spark, it just got stuck.
>
> Does anybody know what the reason is? Or how to force it to quit?
>
> Thanks!
>
>


-- 
Marcelo
