Re: libsnappyjava.so: failed to map segment from shared object

2016-01-12 Thread Mikey T.
Thanks!  Setting java.io.tmpdir did the trick.  Sadly, I still ran into an
issue with the amount of RAM pyspark was grabbing.  In fact I got a message
from my web provider warning that I was exceeding the memory limit for my
(entry-level) account.  So I won't be pursuing it further.  Oh well, it was
still good to get past that first issue.  Pyspark runs fine on my laptop, so
for now I'll just have to use it locally.
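
For anyone who lands here later, the idea is to hand java.io.tmpdir to the
driver JVM before the SparkContext starts.  A rough sketch of one way to do
it (the path is made up, and the exact mechanics may vary a bit between
Spark versions):

    import os

    # Example path only: any directory you own that is not mounted noexec.
    tmp_dir = "/home/myuser/tmp"

    # The driver JVM is launched by spark-submit when the SparkContext is
    # created, so the -D flag needs to go on the launch arguments rather
    # than SparkConf.
    os.environ["PYSPARK_SUBMIT_ARGS"] = (
        "--driver-java-options "
        "'-Djava.io.tmpdir={0}' pyspark-shell".format(tmp_dir)
    )

    from pyspark import SparkContext

    sc = SparkContext("local[*]", "snappy-tmpdir-test")
    print(sc.parallelize([1, 2, 3, 4]).sum())  # prints 10, no UnsatisfiedLinkError
    sc.stop()

For the interactive shell the equivalent is starting it as
pyspark --driver-java-options "-Djava.io.tmpdir=/home/myuser/tmp".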

- Mike

On Mon, Jan 11, 2016 at 7:20 PM, Josh Rosen wrote:

> This is due to the snappy-java library; I think that you'll have to
> configure either java.io.tmpdir or org.xerial.snappy.tempdir; see
> https://github.com/xerial/snappy-java/blob/1198363176ad671d933fdaf0938b8b9e609c0d8a/src/main/java/org/xerial/snappy/SnappyLoader.java#L335
>
>
>
> On Mon, Jan 11, 2016 at 7:12 PM, yatinla  wrote:
>
>> I'm trying to get pyspark running on a shared web host.  I can get into
>> the
>> pyspark shell but whenever I run a simple command like
>> sc.parallelize([1,2,3,4]).sum() I get an error that seems to stem from
>> some
>> kind of permission issue with libsnappyjava.so:
>>
>> Caused by: java.lang.UnsatisfiedLinkError:
>> /tmp/snappy-1.1.2-b7abadd6-9b05-4dee-885a-c80434db68e2-libsnappyjava.so:
>> /tmp/snappy-1.1.2-b7abadd6-9b05-4dee-885a-c80434db68e2-libsnappyjava.so:
>> failed to map segment from shared object: Operation not permitted
>>
>> I'm no Linux expert but I suspect it has something to do with a noexec mount
>> option on the /tmp folder?  So I tried setting the TMP, TEMP, and TMPDIR
>> environment variables to a tmp folder in my own home directory, but I get the
>> same error and it still says /tmp/snappy... not the folder in my home
>> directory.  So then I also tried, in pyspark using SparkConf, setting the
>> spark.local.dir property to my personal tmp folder, and the same for
>> spark.externalBlockStore.baseDir.  But no matter what, the error happens and
>> always refers to /tmp, not my personal folder.
>>
>> Any help would be greatly appreciated.  It all works great on my laptop,
>> just not on the web host, which is a shared Linux hosting plan, so it doesn't
>> seem surprising that there would be permission issues with /tmp.
>>
>


Re: libsnappyjava.so: failed to map segment from shared object

2016-01-11 Thread Josh Rosen
This is due to the snappy-java library; I think that you'll have to
configure either java.io.tmpdir or org.xerial.snappy.tempdir; see
https://github.com/xerial/snappy-java/blob/1198363176ad671d933fdaf0938b8b9e609c0d8a/src/main/java/org/xerial/snappy/SnappyLoader.java#L335
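
If you want to confirm what the driver JVM actually ended up with, a quick
check (just a sketch; sc._jvm is an internal Py4J handle, but it's handy for
this) would be:

    from pyspark import SparkContext

    sc = SparkContext("local[*]", "check-tmpdir")
    jvm = sc._jvm  # Py4J view into the driver JVM

    # Per the SnappyLoader code linked above, snappy-java extracts
    # libsnappyjava.so into org.xerial.snappy.tempdir, falling back to
    # java.io.tmpdir when the former is not set.
    print("java.io.tmpdir            = " +
          jvm.java.lang.System.getProperty("java.io.tmpdir"))
    print("org.xerial.snappy.tempdir = " +
          jvm.java.lang.System.getProperty("org.xerial.snappy.tempdir", "<not set>"))

    sc.stop()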



On Mon, Jan 11, 2016 at 7:12 PM, yatinla  wrote:

> I'm trying to get pyspark running on a shared web host.  I can get into the
> pyspark shell but whenever I run a simple command like
> sc.parallelize([1,2,3,4]).sum() I get an error that seems to stem from some
> kind of permission issue with libsnappyjava.so:
>
> Caused by: java.lang.UnsatisfiedLinkError:
> /tmp/snappy-1.1.2-b7abadd6-9b05-4dee-885a-c80434db68e2-libsnappyjava.so:
> /tmp/snappy-1.1.2-b7abadd6-9b05-4dee-885a-c80434db68e2-libsnappyjava.so:
> failed to map segment from shared object: Operation not permitted
>
> I'm no Linux expert but I suspect it has something to do with a noexec mount
> option on the /tmp folder?  So I tried setting the TMP, TEMP, and TMPDIR
> environment variables to a tmp folder in my own home directory, but I get the
> same error and it still says /tmp/snappy... not the folder in my home
> directory.  So then I also tried, in pyspark using SparkConf, setting the
> spark.local.dir property to my personal tmp folder, and the same for
> spark.externalBlockStore.baseDir.  But no matter what, the error happens and
> always refers to /tmp, not my personal folder.
>
> Any help would be greatly appreciated.  It all works great on my laptop,
> just not on the web host, which is a shared Linux hosting plan, so it doesn't
> seem surprising that there would be permission issues with /tmp.
>


libsnappyjava.so: failed to map segment from shared object

2016-01-11 Thread yatinla
I'm trying to get pyspark running on a shared web host.  I can get into the
pyspark shell but whenever I run a simple command like
sc.parallelize([1,2,3,4]).sum() I get an error that seems to stem from some
kind of permission issue with libsnappyjava.so:

Caused by: java.lang.UnsatisfiedLinkError:
/tmp/snappy-1.1.2-b7abadd6-9b05-4dee-885a-c80434db68e2-libsnappyjava.so:
/tmp/snappy-1.1.2-b7abadd6-9b05-4dee-885a-c80434db68e2-libsnappyjava.so:
failed to map segment from shared object: Operation not permitted

I'm no Linux expert but I suspect it has something to do with a noexec mount
option on the /tmp folder?  So I tried setting the TMP, TEMP, and TMPDIR
environment variables to a tmp folder in my own home directory, but I get the
same error and it still says /tmp/snappy... not the folder in my home
directory.  So then I also tried, in pyspark using SparkConf, setting the
spark.local.dir property to my personal tmp folder, and the same for
spark.externalBlockStore.baseDir.  But no matter what, the error happens and
always refers to /tmp, not my personal folder.
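
In case it's relevant, reading /proc/mounts should show whether /tmp really
is mounted noexec (a rough check, nothing Spark-specific):

    # Rough check: print the mount options for /tmp; a "noexec" flag there
    # would explain "failed to map segment from shared object".
    with open("/proc/mounts") as mounts:
        for line in mounts:
            fields = line.split()
            if len(fields) >= 4 and fields[1] == "/tmp":
                print("/tmp mount options: " + fields[3])

If /tmp doesn't show up as its own mount, the options of whatever filesystem
contains it apply instead.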

Any help would be greatly appreciated.  It all works great on my laptop,
just not on the web host, which is a shared Linux hosting plan, so it doesn't
seem surprising that there would be permission issues with /tmp.




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/ibsnappyjava-so-failed-to-map-segment-from-shared-object-tp25937.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org