I'm trying to get pyspark running on a shared web host.  I can get into the
pyspark shell but whenever I run a simple command like
sc.parallelize([1,2,3,4]).sum() I get an error that seems to stem from some
kind of permission issue with libsnappyjava.so:

Caused by: java.lang.UnsatisfiedLinkError:
/tmp/snappy-1.1.2-b7abadd6-9b05-4dee-885a-c80434db68e2-libsnappyjava.so:
/tmp/snappy-1.1.2-b7abadd6-9b05-4dee-885a-c80434db68e2-libsnappyjava.so:
failed to map segment from shared object: Operation not permitted

I'm no Linux expert, but I suspect it has something to do with the /tmp
filesystem being mounted noexec. So I tried pointing the TMP, TEMP, and
TMPDIR environment variables at a tmp folder in my own home directory, but I
get the same error, and it still says /tmp/snappy..., not the folder in my
home directory. I then also tried, in pyspark via SparkConf, setting the
spark.local.dir property to my personal tmp folder, and likewise
spark.externalBlockStore.baseDir. But no matter what, the error happens and
always refers to /tmp, never my personal folder.
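For reference, here's roughly what I tried, sketched as shell commands (the $HOME/tmp path is just an example; I'm assuming the directory exists and is writable). One thing I haven't been able to verify is whether the JVM's own java.io.tmpdir, which I believe snappy-java uses when extracting its native library, also needs to be pointed away from /tmp -- the last two --conf lines are my guess at how that would look:

```shell
# Example paths only -- this assumes $HOME/tmp exists on a partition
# that is not mounted noexec.
mkdir -p "$HOME/tmp"
export TMP="$HOME/tmp" TEMP="$HOME/tmp" TMPDIR="$HOME/tmp"

# The Spark properties I set, plus (last two lines) a guess at forcing
# the JVM temp directory, which is where snappy-java seems to extract
# libsnappyjava.so:
pyspark \
  --conf spark.local.dir="$HOME/tmp" \
  --conf spark.externalBlockStore.baseDir="$HOME/tmp" \
  --conf spark.driver.extraJavaOptions="-Djava.io.tmpdir=$HOME/tmp" \
  --conf spark.executor.extraJavaOptions="-Djava.io.tmpdir=$HOME/tmp"
```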

Any help would be greatly appreciated. It all works great on my laptop,
just not on the web host, which is a shared Linux hosting plan, so it isn't
surprising that /tmp would have permission restrictions.

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/ibsnappyjava-so-failed-to-map-segment-from-shared-object-tp25937.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

