appreciated.
-Dmitriy
On Thu, Feb 20, 2014 at 7:49 PM, Dmitriy Lyubimov dlie...@gmail.com wrote:
Hello,
I have this weird error coming up that I am really at a loss to explain.
I have defined a custom RDD (reading from a third-party store, a really
simple one). In the same code, same test running
Ah, no worries. It seems another engineer created multiple Spark sessions,
which caused those weird race conditions.
On Thu, Feb 20, 2014 at 7:57 PM, Dmitriy Lyubimov dlie...@gmail.com wrote:
PS the file it is trying to read from seems to be
./indata1392953916265:0+434205 which seems
On Fri, Jan 3, 2014 at 10:28 AM, Sebastian Schelter s...@apache.org wrote:
I wonder if anyone might have a recommendation on a Scala-native
implementation
of SVD.
Mahout has a Scala implementation of an SVD variant called Stochastic SVD:
is updated
with svn repo.
Thanks.
Deb
On Fri, Jan 3, 2014 at 10:43 AM, Dmitriy Lyubimov dlie...@gmail.com wrote:
On Fri, Jan 3, 2014 at 10:28 AM, Sebastian Schelter s...@apache.org wrote:
I wonder if anyone might have a recommendation on a Scala-native
implementation
of SVD.
Mahout
.
Again, this is only true for a smaller portion of the Spark API. Most of the
Spark API doesn't have this problem, so you may well get away with either not
using the problematic calls or pre-serializing objects into byte arrays when
you do use them.
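The pre-serialization workaround mentioned above can be sketched in plain Scala: turn the troublesome object into a byte array on the driver, let the closure capture only the bytes, and deserialize inside the task. This is a minimal illustration using standard Java serialization; the usage lines at the bottom (`trickyObject`, `Tricky`, `process`) are hypothetical names, not anything from the thread.

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

object PreSerialize {
  // Serialize any Serializable object into a byte array on the driver side.
  def serialize(obj: AnyRef): Array[Byte] = {
    val bos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    try oos.writeObject(obj) finally oos.close()
    bos.toByteArray
  }

  // Deserialize inside the task; byte arrays always survive closure serialization.
  def deserialize[T](bytes: Array[Byte]): T = {
    val ois = new ObjectInputStream(new ByteArrayInputStream(bytes))
    try ois.readObject().asInstanceOf[T] finally ois.close()
  }
}

// Usage sketch (hypothetical): capture only `bytes` in the closure.
// val bytes = PreSerialize.serialize(trickyObject)
// rdd.map { x => PreSerialize.deserialize[Tricky](bytes).process(x) }
```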
Thanks,
Ameet
On Mon, Dec 23, 2013 at 5:18 PM, Dmitriy
I guess Bagel-related questions are ignored, possibly because Bagel is
slated for retirement?
On Tue, Dec 17, 2013 at 10:35 AM, Dmitriy Lyubimov dlie...@gmail.com wrote:
Hello,
I have a quick question:
It just recently occurred to me that in Spark group-by is not
shuffle-and-sort
about 200 lines actually.
Matei
On Dec 19, 2013, at 11:46 AM, Dmitriy Lyubimov dlie...@gmail.com wrote:
I guess Bagel-related questions are ignored, possibly because Bagel is
slated for retirement?
On Tue, Dec 17, 2013 at 10:35 AM, Dmitriy Lyubimov dlie...@gmail.com wrote:
Hello,
I have
, no. The driver is a stateful, heavyweight component
and should be run inside the cluster.
On Fri, Nov 22, 2013 at 4:49 PM, Dmitriy Lyubimov dlie...@gmail.com
wrote:
Hello,
as far as I can tell, Spark executors use Akka to connect back to the
driver.
However, if the driver
Hello,
as far as I can tell, Spark executors use Akka to connect back to the
driver.
However, if the driver is behind NAT, this becomes impossible, since the TCP
connections flow from the workers to the driver.
Is there any known way to set up Spark clients behind NAT?
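Since the workers initiate the connections, the usual partial workaround is to advertise an address the workers can actually reach and pin the driver port so it can be forwarded through the NAT. A hedged config-fragment sketch using Spark's standard `spark.driver.host` / `spark.driver.port` properties (the script name and address are placeholders, and this only helps if the forwarded port really is reachable from the workers):

```shell
# Sketch: advertise a worker-reachable address and pin the driver port,
# then forward that port through the NAT gateway. Values are placeholders.
SPARK_JAVA_OPTS="-Dspark.driver.host=gateway.example.com -Dspark.driver.port=51000" \
  ./run-my-driver.sh
```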
thanks. -Dmitriy
contexts in the same JVM. In practice it has always
worked for me in standalone mode and never worked in Mesos mode (the Mesos
backend deadlocks).
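Given how fragile multiple contexts per JVM are, a common defensive pattern is to keep a single lazily initialized SparkContext behind a singleton, so test code cannot accidentally create a second one. A minimal sketch (the object and app names are hypothetical, not from the thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Defensive pattern sketch: exactly one SparkContext per JVM.
// The lazy val guarantees it is created once, on first use.
object SingleContext {
  lazy val sc: SparkContext = {
    val conf = new SparkConf().setAppName("single-context-demo")
    new SparkContext(conf)
  }
}
```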
On Wed, Nov 20, 2013 at 1:39 PM, Dmitriy Lyubimov dlie...@gmail.com wrote:
On Wed, Nov 20, 2013 at 12:56 PM, Matt Cheah mch...@palantir.com wrote:
Our use
SparkContexts in one JVM. Thanks!
Mingyu
From: Dmitriy Lyubimov dlie...@gmail.com
Reply-To: user@spark.incubator.apache.org
Date: Wednesday, November 20, 2013 at 1:42 PM
To: user@spark.incubator.apache.org
Subject: Re: Multiple
I think sort-by-key triggers an action (at least in other tools it does).
On Mon, Sep 23, 2013 at 5:47 PM, Mahdi Namazifar
mahdi.namazi...@gmail.com wrote:
Hi,
I think I might be missing something, but here is what I observe, which is
inconsistent with my understanding of transformation vs
Although, scratch that.
On Mon, Sep 23, 2013 at 6:17 PM, Dmitriy Lyubimov dlie...@gmail.com wrote:
I think sort-by-key triggers an action (at least in other tools it does).
On Mon, Sep 23, 2013 at 5:47 PM, Mahdi Namazifar
mahdi.namazi...@gmail.com wrote:
Hi,
I think I might be missing
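The puzzle in this thread is a known Spark quirk: `sortByKey` is nominally a transformation, but it eagerly runs a small job, because the range partitioner samples the keys to compute partition boundaries. A hedged sketch of where the work happens (requires a live SparkContext, so it is illustrative only):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._ // pre-1.3 style: brings sortByKey onto pair RDDs

// Sketch: sortByKey launches a sampling job for the RangePartitioner even
// before any action is called; the actual sort runs at collect().
def demo(sc: SparkContext): Unit = {
  val pairs = sc.parallelize(Seq((3, "c"), (1, "a"), (2, "b")))
  val sorted = pairs.sortByKey() // a small job (key sampling) appears here
  sorted.collect()               // the full shuffle and sort run here
}
```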
, Dmitriy Lyubimov dlie...@gmail.com wrote:
I don't know if you can (other than trying to rebuild Spark with
_your_ version of Netty, nothing really comes to mind), but the Netty
conflict is a notorious problem (not Spark-specific at all). Spark
uses Akka, which uses Netty, and even if you successfully
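Where rebuilding Spark is off the table, one escape hatch for this kind of dependency clash is shading: renaming the conflicting packages inside your fat jar so the two Netty versions cannot collide. A hypothetical build-config sketch, assuming a modern sbt-assembly (0.14+) that supports shade rules; Akka's Netty 3 lives under `org.jboss.netty`:

```scala
// build.sbt fragment — sketch only, assumes sbt-assembly 0.14+ shading support.
// Renames the Netty 3 classes so they cannot clash with your own Netty version.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("org.jboss.netty.**" -> "shaded.org.jboss.netty.@1").inAll
)
```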