I'm trying to run a C++ program on a Spark cluster using the rdd.pipe()
operation, but the executors throw: java.lang.IllegalStateException:
Subprocess exited with status 132.
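
For reference, the pipe() call is essentially the following sketch (the
input path and binary name here are placeholders, not my actual job):

    // Each executor launches the external C++ program once per partition,
    // writes the partition's records to the program's stdin, and reads
    // result lines back from its stdout.
    val input = sc.textFile("hdfs:///path/to/input")
    val result = input.pipe("/path/to/my_program")
    result.saveAsTextFile("hdfs:///path/to/output")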

The Spark jar runs totally fine in standalone mode, and the C++ program
runs fine on its own as well. I've also tried another, simpler C++ program,
and it runs on the cluster without any problem.

As I understand it, exit status 132 means the subprocess was killed by
SIGILL (Illegal Instruction): the shell reports 128 plus the signal number,
and SIGILL is signal 4. I don't know how to use this to pinpoint the source
of the error, though.
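
The only lead I have is that SIGILL often comes from an instruction-set
mismatch (e.g. a binary compiled with -march=native on a machine newer than
the worker nodes), so one idea is to use pipe() itself to dump each worker's
CPU flags and compare them against what the binary was compiled for. A rough
sketch, assuming Linux workers with /proc/cpuinfo available:

    // Run a tiny shell command on every executor via pipe() and collect
    // each node's hostname plus its CPU feature flags.
    val probes = sc.parallelize(1 to 100, 100)
    val cpuFlags = probes
      .pipe(Seq("/bin/sh", "-c", "hostname; grep -m1 '^flags' /proc/cpuinfo"))
      .collect()
    cpuFlags.foreach(println)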

I get no further info from the executor logs. I'm posting here hoping that
someone has a suggestion; I've tried other forums but no luck yet.


