Unsubscribe

2021-08-30 Thread Junior Alvarez



unsubscribe

2019-01-31 Thread Junior Alvarez
unsubscribe


unsubscribe

2019-01-31 Thread Junior Alvarez
Unsubscribe


RE: How to unsubscribe???

2019-01-17 Thread Junior Alvarez
Hi!

Thanks for your info...:-)

Unfortunately, I have never received the email you’re talking about (with
instructions to double confirm), and it is not in my junk or deleted folders
either.

Who is the sender of that second email? And is there a way for you to send me
that link, so I can double confirm unsubscribing?


B r
/Junior

From: Trevor News 
Sent: 16 January 2019 16:31
To: Junior Alvarez 
Cc: user@spark.apache.org
Subject: Re: How to unsubscribe???

Hi Junior,
After you send an email to user-unsubscr...@spark.apache.org, you should
receive an email with instructions to double confirm. You will be asked to
send another email using the link in that second email. Only when that step is
complete will the unsubscribe take effect.

Please check your Spam, Junk, or Clutter folders. Unless the unsubscribe is
confirmed, the subscription will continue.

Hope that helps.

Trevor
Sent from my iPhone

On Jan 16, 2019, at 12:22 AM, Junior Alvarez wrote:
Hi!

I’ve been sending an unsubscribe mail to this address:
user-unsubscr...@spark.apache.org for months now, and I still don’t manage to
unsubscribe. Why?

B r
/Junior


How to unsubscribe???

2019-01-16 Thread Junior Alvarez


unsubscribe

2018-12-04 Thread Junior Alvarez



RE: spark.lapply

2018-09-27 Thread Junior Alvarez
Around 500 KB each time I call the function (~150 times).

From: Felix Cheung 
Sent: 26 September 2018 14:57
To: Junior Alvarez ; user@spark.apache.org
Subject: Re: spark.lapply

It looks like the native R process was terminated by a buffer overflow. Do you
know how much data is involved?



From: Junior Alvarez 
Sent: Wednesday, September 26, 2018 7:33 AM
To: user@spark.apache.org
Subject: spark.lapply
Subject: spark.lapply

Hi!

I'm using spark.lapply() in SparkR on a Mesos service, and I get the following
crash randomly (the spark.lapply() function is called around 150 times;
sometimes it crashes after 16 calls, other times after 25, and so on. It is
completely random, even though the data used in each call is always the same
across the 150 calls):

...

18/09/26 07:30:42 INFO TaskSetManager: Finished task 129.0 in stage 78.0 (TID 
1192) in 98 ms on 10.255.0.18 (executor 0) (121/143)

18/09/26 07:30:42 WARN TaskSetManager: Lost task 128.0 in stage 78.0 (TID 1191, 
10.255.0.18, executor 0): org.apache.spark.SparkException: R computation failed 
with

 7f327f4dd000-7f327f50 r-xp  08:11 174916727  
/lib/x86_64-linux-gnu/ld-2.19.so

7f327f51c000-7f327f6f2000 rw-p  00:00 0

7f327f6fc000-7f327f6fd000 rw-p  00:00 0

7f327f6fd000-7f327f6ff000 rw-p  00:00 0

7f327f6ff000-7f327f70 r--p 00022000 08:11 174916727  
/lib/x86_64-linux-gnu/ld-2.19.so

7f327f70-7f327f701000 rw-p 00023000 08:11 174916727  
/lib/x86_64-linux-gnu/ld-2.19.so

7f327f701000-7f327f702000 rw-p  00:00 0

7fff6070f000-7fff60767000 rw-p  00:00 0  [stack]

7fff6077f000-7fff60781000 r-xp  00:00 0  [vdso]

ff60-ff601000 r-xp  00:00 0  
[vsyscall]

*** buffer overflow detected ***: /usr/local/lib/R/bin/exec/R terminated

=== Backtrace: =

/lib/x86_64-linux-gnu/libc.so.6(+0x7329f)[0x7f327db9529f]

/lib/x86_64-linux-gnu/libc.so.6(__fortify_fail+0x5c)[0x7f327dc3087c]

/lib/x86_64-linux-gnu/libc.so.6(+0x10d750)[0x7f327dc2f750]

...

If I use the native R lapply() instead, everything of course works fine.

I wonder if this is a known issue, and/or whether there is a way to avoid it
when using SparkR.

B r
/Junior
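
Whatever the root cause of the crash turns out to be, one common way to shrink
what each distributed call has to serialize is to batch the inputs rather than
issuing ~150 separate calls. A minimal sketch of that batching idea in plain
Python — not SparkR API; the function names and batch size here are purely
illustrative:

```python
# Sketch: group inputs into smaller batches so each lapply-style call
# carries a smaller payload. Names and sizes are illustrative only.

def make_batches(items, batch_size):
    """Split `items` into consecutive batches of at most `batch_size`."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

def process(batch):
    # Stand-in for the real per-call work (whatever spark.lapply would run).
    return [x * 2 for x in batch]

if __name__ == "__main__":
    calls = list(range(150))           # ~150 inputs, as in the report above
    batches = make_batches(calls, 25)  # 6 calls instead of 150
    results = [y for batch in batches for y in process(batch)]
    print(len(batches), len(results))
```

The same grouping can be done over the list passed to spark.lapply(), so that
each task processes a chunk instead of a single element.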



spark.lapply

2018-09-26 Thread Junior Alvarez