I wanted to ask a basic question about the kinds of algorithms that can be applied to a DStream with Spark Streaming. With Spark it is possible to perform iterative computations on RDDs, as in the gradient descent example:
val points = spark.textFile(...).map(parsePoint).cache()
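A minimal sketch of the iterative pattern behind that line, with plain Scala collections standing in for the cached RDD (the data points, learning rate, and iteration count here are illustrative assumptions, not from the original example):

```scala
// Iterative gradient descent over a cached dataset, mirroring the RDD
// pattern above: a per-point map to compute gradients, then a reduce to
// sum them, repeated for a fixed number of iterations.
object GradientDescentSketch {
  case class Point(x: Array[Double], y: Double)

  def train(points: Seq[Point], iters: Int, lr: Double): Array[Double] = {
    var w = Array.fill(points.head.x.length)(0.0) // weight vector
    for (_ <- 1 to iters) {
      // Per-point logistic-loss gradient, summed across the dataset
      // (a map followed by a reduce in Spark).
      val grad = points.map { p =>
        val dot = w.zip(p.x).map { case (a, b) => a * b }.sum
        val scale = (1.0 / (1.0 + math.exp(-p.y * dot)) - 1.0) * p.y
        p.x.map(_ * scale)
      }.reduce((a, b) => a.zip(b).map { case (u, v) => u + v })
      w = w.zip(grad).map { case (wi, gi) => wi - lr * gi }
    }
    w
  }

  def main(args: Array[String]): Unit = {
    // Two linearly separable points with labels +1 / -1.
    val points = Seq(Point(Array(1.0, 2.0), 1.0), Point(Array(-1.0, -1.5), -1.0))
    println(train(points, iters = 100, lr = 0.1).mkString(","))
  }
}
```

The caching matters because the same `points` collection is traversed on every iteration; in Spark, without `.cache()` the text file would be re-read and re-parsed each pass.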
I was wondering if there was any chance of getting a more distributed word2vec implementation. I seem to be running out of memory due to large local data structures such as
val syn1Global = new Array[Float](vocabSize * vectorSize)
Is there any chance of getting a version where these are all
I am working with a RowMatrix and I noticed in the multiply() method that the local matrix with which it is being multiplied is distributed to all of the rows of the RowMatrix. If this is the case, then is it impossible to multiply a RowMatrix within a map operation? Because this would
I have a RowMatrix on which I want to perform two multiplications. The first is a right multiplication with a local matrix, which is fine. But after that I also wish to right-multiply the transpose of my RowMatrix with a different local matrix. I understand that there is no functionality to be able to transpose a RowMatrix. Am I correct?
Thanks,
Alex
From: Reza Zadeh r...@databricks.com
Sent: Monday, January 12, 2015 1:58 PM
To: Alex Minnaar
Cc: u...@spark.incubator.apache.org
Subject: Re: RowMatrix multiplication
As you mentioned, you can perform
Good idea! Join each element of c with the corresponding row of A, multiply through, then reduce. I'll give this a try.
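The join-multiply-reduce idea works because A^T c is just the sum of the rows of A, each scaled by the matching element of c. A small sketch with plain Scala maps standing in for the keyed RDDs (the matrix and vector values are made up for illustration):

```scala
// Right-multiply the transpose of a row matrix by a vector without ever
// materializing the transpose: "join" each element c(i) with row i of A,
// scale the row, then reduce by summing the scaled rows.
object TransposeMultiplySketch {
  def transposeMultiply(rowsOfA: Map[Long, Array[Double]],
                        c: Map[Long, Double]): Array[Double] =
    rowsOfA.map { case (i, row) => row.map(_ * c(i)) } // join + multiply through
      .reduce((a, b) => a.zip(b).map { case (x, y) => x + y }) // reduce: vector sum

  def main(args: Array[String]): Unit = {
    val a = Map(0L -> Array(1.0, 2.0), 1L -> Array(3.0, 4.0))
    val c = Map(0L -> 10.0, 1L -> 1.0)
    // (10,20) + (3,4) = (13,24)
    println(transposeMultiply(a, c).mkString(",")) // 13.0,24.0
  }
}
```

In Spark the same shape would be a join on the row index followed by a reduce, which avoids any driver-side collect of the matrix.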
Thanks,
Alex
From: Reza Zadeh r...@databricks.com
Sent: Monday, January 12, 2015 3:05 PM
To: Alex Minnaar
Cc: u
I am trying to do the same thing and also wondering what the best strategy is.
Thanks
From: ll duy.huynh@gmail.com
Sent: Wednesday, December 3, 2014 10:28 AM
To: u...@spark.incubator.apache.org
Subject: what is the best way to implement mini batches?
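One common strategy is to shuffle the data once and then iterate over fixed-size slices. The sketch below shows the batching logic on plain Scala collections (the data, batch size, and seed are illustrative; in Spark, one rough analogue is splitting an RDD into weighted parts with randomSplit):

```scala
// Mini-batching by shuffling once and slicing into fixed-size groups.
object MiniBatchSketch {
  def miniBatches[T](data: Vector[T], batchSize: Int, seed: Long): Vector[Vector[T]] = {
    val rng = new scala.util.Random(seed) // fixed seed for reproducibility
    // Shuffle once, then cut into batches of `batchSize` (last may be smaller).
    rng.shuffle(data).grouped(batchSize).toVector
  }

  def main(args: Array[String]): Unit = {
    miniBatches((1 to 10).toVector, batchSize = 3, seed = 42L)
      .foreach(b => println(b.mkString(",")))
  }
}
```

Every element appears in exactly one batch per epoch; re-shuffling between epochs just means calling the function again with a new seed.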
maybe conflicting versions of Akka. Spark depends on a version of Akka that is different from Scala's, and launching a Spark app with the scala command (instead of java) can cause issues.
TD
On Thu, Jul 31, 2014 at 6:30 AM, Alex Minnaar
aminn...@verticalscope.com wrote:
I am eager to solve
method.
I have run into this before and resolved it with an SBT clean followed by an assembly, so maybe you could give that a try.
Let me know if that fixes it,
Andrew
2014-07-29 13:01 GMT-07:00 Alex Minnaar
aminn...@verticalscope.com:
I am trying to run an example Spark standalone app with the following code
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._
object SparkGensimLDA extends App {
val ssc = new StreamingContext("local", "testApp", Seconds(5))
val