Hi,

I ran a Spark application in local mode with the command:
$SPARK_HOME/bin/spark-submit --driver-memory 1g <class> <jar>
with the master set to local.
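
To clarify "master set to local": the master is either passed as --master local
on the command line or set in the driver code, roughly along these lines (the
app name below is just a placeholder):

    import org.apache.spark.{SparkConf, SparkContext}

    // plain "local" means a single worker thread in the driver JVM
    val conf = new SparkConf()
      .setAppName("MyApp")   // placeholder name
      .setMaster("local")
    val sc = new SparkContext(conf)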

After around 10 minutes of computing it started to slow down significantly:
the next stage took around 50 minutes, and the one after that was only 80%
done after 5 hours, while CPU usage decreased from 160% to almost 0%
(according to the system monitor, where 200% is the max for one core). The
stage that took more than 5 hours was saveAsTextFile on a 50 MB RDD.
During that computation my whole system was significantly slower and less
responsive.
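
For reference, the stage that stalled is essentially just a saveAsTextFile
call, something like the snippet below (the RDD contents and output path are
placeholders; in the real job the RDD comes from earlier transformations and
is about 50 MB, using the same sc as above):

    // writes one part-xxxxx text file per partition under the output directory
    val smallRdd = sc.parallelize(Seq("example line 1", "example line 2"))
    smallRdd.saveAsTextFile("/tmp/spark-output")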

Moreover, when I wanted to interrupt the application I tried Ctrl-C; the
current stage was interrupted, but the program didn't exit. When I then shut
down my computer, the system didn't want to shut down because killing the
Spark app failed (it only shut down after Ctrl-Alt-Del). After the reboot the
system didn't want to start, and I finally had to reinstall it.

The same thing happened when I killed the next app using kill -9: the system
was corrupted and I had to reinstall it.


When I ran this app on slightly smaller data everything was OK. I have Linux
Mint 17, 8 GB RAM, and an Intel Core i7-3630QM (4 x 2.4 GHz).

Do you have any idea why Spark slows down, or how to properly kill a Spark
app run through spark-submit?

Thanks,
Grzegorz
