Re: Driver Memory taken up by BlockManager

2018-12-14 Thread Davide.Mandrini
Hello, I am facing a similar issue; have you found a solution? Cheers, Davide

Re: Programmatically get status of job (WAITING/RUNNING)

2017-11-08 Thread Davide.Mandrini
In this case, the only way to check the status is via REST calls to the Spark JSON API, accessible at http://<master-host>:<web-ui-port>/json/.
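
For illustration, a minimal sketch of such a call in Scala (the host is a placeholder; the standalone master web UI, on port 8080 by default, serves this endpoint):

    import scala.io.Source

    // Placeholder host: the master web UI (default port 8080) exposes
    // cluster state, including each application's status, as JSON.
    val json = Source.fromURL("http://my-master-host:8080/json/").mkString
    println(json)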

Re: Programmatically get status of job (WAITING/RUNNING)

2017-11-07 Thread Davide.Mandrini
Hello Behroz, you can use a SparkListener to get updates from the underlying process (cf. https://spark.apache.org/docs/2.2.0/api/java/org/apache/spark/scheduler/SparkListener.html). You first need to create your own SparkAppListener class that extends it:
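
A minimal sketch of such a class in Scala, overriding only two of the available callbacks (the full set is listed in the SparkListener API linked above):

    import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd, SparkListenerJobStart}

    // Minimal sketch: react to job start/end events; other callbacks
    // (stages, tasks, executors, ...) can be overridden the same way.
    class SparkAppListener extends SparkListener {
      override def onJobStart(jobStart: SparkListenerJobStart): Unit =
        println(s"Job ${jobStart.jobId} started")
      override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
        println(s"Job ${jobEnd.jobId} ended with result ${jobEnd.jobResult}")
    }

Then register it on the SparkContext before triggering any jobs, e.g. sc.addSparkListener(new SparkAppListener).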

[Spark Streaming] - Stopped worker throws FileNotFoundException

2017-09-10 Thread Davide.Mandrini
I am running a Spark Streaming application on a cluster composed of three nodes, each with one worker and three executors (nine executors in total). I am using Spark standalone mode (version 2.1.1). The application is run with a spark-submit command with the option "--deploy-mode client" and
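
For illustration, a submit command of this shape (all names and values hypothetical) would produce such a topology: with 6 cores free on each worker, --executor-cores 2 lets the standalone master start three 2-core executors per worker, i.e. nine in total:

    spark-submit \
      --master spark://my-master-host:7077 \
      --deploy-mode client \
      --executor-cores 2 \
      --total-executor-cores 18 \
      --class com.example.StreamingApp \
      my-streaming-app.jar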

Re: Spark standalone API...

2017-09-10 Thread Davide.Mandrini
Hello, you might get the information you are looking for from this hidden API: http://<master-host>:<web-ui-port>/json/. Hope it helps, Davide

[Spark Streaming] - Stopped worker throws FileNotFoundException

2017-09-09 Thread Davide.Mandrini
I am running a Spark Streaming application on a cluster composed of three nodes, each with one worker and three executors (nine executors in total). I am using Spark standalone mode (version 2.1.1). The application is run with a spark-submit command with the option "--deploy-mode client" and

Re: Spark standalone API...

2017-09-09 Thread Davide.Mandrini
Hello, you might get the information you are looking for from this hidden API: http://<master-host>:<web-ui-port>/json/. Hope it helps, Davide

[Spark Streaming] Application is stopped after stopping a worker

2017-08-28 Thread Davide.Mandrini
I am running a Spark Streaming application on a cluster composed of three nodes, each with one worker and three executors (nine executors in total). I am using Spark standalone mode. The application is run with a spark-submit command with the option --deploy-mode client. The submit command is