You will also need to run 'ldconfig' on each host so that the ld.so.conf file
is re-read and the new entries take effect. You might also need to restart
Spark (the JVM) on each node so the loader picks up the change for those
processes.
--
I can't seem to instantiate a SparkContext. What am I doing wrong? I tried
both using a SparkConf and the two-string constructor, with identical
results. (Note that the project is configured for Eclipse in the pom, but I'm
compiling and running on the command line.)
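
For concreteness, here is a minimal sketch of the two variants I mean; the
master URL ("local[2]") and app name ("Test") are placeholders, not my real
settings:

import org.apache.spark.{SparkConf, SparkContext}

object Main {
  def main(args: Array[String]): Unit = {
    // Variant 1: build a SparkConf and pass it to the constructor.
    // "local[2]" and "Test" are stand-in values.
    val conf = new SparkConf().setMaster("local[2]").setAppName("Test")
    val sc = new SparkContext(conf)

    // Variant 2: the two-string constructor (master, appName).
    // Only one SparkContext may be active per JVM, so this is commented out.
    // val sc = new SparkContext("local[2]", "Test")

    sc.stop()
  }
}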
Here's the exception:
~/workspace/Re