Hi all, question on an issue I'm having with a VertexRDD. If I kick off my
spark shell with something like this:
then run:
it will finish and give me the count, but I see a few errors (see below).
This is okay for this small dataset, but when trying with a large dataset it
doesn't finish because
Oh, forgot to note: I'm using the Scala REPL for this.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/graphx-class-not-found-error-tp24253p24254.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
The code below works perfectly in both cluster and local modes,
but when I try to create a graph in cluster mode (it works in local mode)
I get the following error:
Any help appreciated.
Hi, I'm currently using a Pregel message-passing function for my graph in Spark
and GraphX. The problem I have is that the code runs perfectly on Spark 1.0
and finishes in a couple of minutes, but as we have upgraded, I'm now trying to
run the same code on 1.3 and it doesn't finish (left it overnight
Sorry, cut-and-paste error; the resulting dataset I want is this:
({(101,S)=3},piece_of_data_1))
({(101,S)=3},piece_of_data_2))
({(101,S)=1},piece_of_data_3))
({(109,S)=2},piece_of_data_3))
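The actual Pregel function from this thread isn't shown, so for reference here is a minimal, self-contained Pregel call in the shape of the standard GraphX single-source shortest-paths pattern. The graph, edge weights, and source id are all made up for illustration, not the poster's data:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx._

val sc = new SparkContext(
  new SparkConf().setAppName("pregel-sketch").setMaster("local[*]"))

// Hypothetical weighted graph.
val edges = sc.parallelize(Seq(
  Edge(1L, 2L, 1.0), Edge(2L, 3L, 2.0), Edge(1L, 4L, 5.0)))
val graph = Graph.fromEdges(edges, defaultValue = 0.0)

// Start every vertex at infinity except the source, then relax edges
// until no more messages are sent.
val sourceId: VertexId = 1L
val init = graph.mapVertices((id, _) =>
  if (id == sourceId) 0.0 else Double.PositiveInfinity)

val sssp = init.pregel(Double.PositiveInfinity)(
  (id, dist, newDist) => math.min(dist, newDist),   // vertex program
  triplet =>                                        // send message
    if (triplet.srcAttr + triplet.attr < triplet.dstAttr)
      Iterator((triplet.dstId, triplet.srcAttr + triplet.attr))
    else Iterator.empty,
  (a, b) => math.min(a, b))                         // merge messages

sssp.vertices.collect().foreach(println)
sc.stop()
```

If an identical job is dramatically slower after an upgrade, comparing the stage and shuffle metrics in the web UI between the two versions is usually the first diagnostic step.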
Hi all, looking for some help propagating some values along edges. What I want
to achieve (see diagram) is, for each connected part of the graph, to assign an
incrementing value to each of the out-links from the root node. This value
restarts for the next part of the graph, i.e. node 1 has out
I have an RDD I want to filter, and for a single term all works well,
e.g.:
dataRDD.filter(x => x._2 == "apple")
How can I use multiple values? For example, if I wanted to filter my RDD to
take out apples and oranges and pears without using . This could
get long-winded as there may be quite a few. Can
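One common way to avoid chaining many equality comparisons is to test membership in a Set. The sketch below assumes dataRDD holds (key, fruit) pairs, matching the single-term example above; the stand-in data is made up:

```scala
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(
  new SparkConf().setAppName("multi-filter").setMaster("local[*]"))

// Stand-in data; the real dataRDD layout is assumed to be (key, value) pairs.
val dataRDD = sc.parallelize(Seq(
  (1, "apple"), (2, "banana"), (3, "pear"), (4, "orange")))

// Keep a row when its second field is any of the wanted values.
val wanted = Set("apple", "orange", "pear")
val filtered = dataRDD.filter { case (_, fruit) => wanted.contains(fruit) }

filtered.collect().foreach(println)  // (1,apple), (3,pear), (4,orange)
sc.stop()
```

Adding another value is then just one more element in the Set, however many there are.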
Hi all, just wondering if there is a way to extract paths in GraphX. For
example, if I have the graph attached, I would like to return results
along the lines of:
101 - 103
101 - 104 - 108
102 - 105
102 - 106 - 107
http://apache-spark-user-list.1001560.n3.nabble.com/file/n17936/graph.jpg
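GraphX has no built-in "list all paths" operator, but for a forest like the example output above, root-to-leaf paths can be propagated with Pregel. This is only a sketch under the assumption that every vertex has at most one parent; the edge list below is made up to match the example output:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx._

val sc = new SparkContext(
  new SparkConf().setAppName("paths").setMaster("local[*]"))

// Hypothetical edges matching the example (101 -> 103, 101 -> 104 -> 108, ...).
val edges = sc.parallelize(Seq(
  Edge(101L, 103L, 0), Edge(101L, 104L, 0), Edge(104L, 108L, 0),
  Edge(102L, 105L, 0), Edge(102L, 106L, 0), Edge(106L, 107L, 0)))
val graph = Graph.fromEdges(edges, defaultValue = ())

// Roots have no in-neighbours; start each root's path with itself.
val roots = graph.collectNeighborIds(EdgeDirection.In)
  .filter { case (_, parents) => parents.isEmpty }
  .keys.collect().toSet
val init = graph.mapVertices((id, _) =>
  if (roots(id)) List(id) else List.empty[VertexId])

// Push each parent's path down to its children until every vertex
// knows the path from its root.
val withPaths = init.pregel(List.empty[VertexId])(
  (id, path, msg) => if (msg.nonEmpty) msg :+ id else path,
  triplet =>
    if (triplet.srcAttr.nonEmpty && triplet.dstAttr.isEmpty)
      Iterator((triplet.dstId, triplet.srcAttr))
    else Iterator.empty,
  (a, b) => a)  // arbitrary choice if a vertex somehow has two parents

// Leaves (no out-neighbours) now hold the full root-to-leaf paths.
val leaves = withPaths.collectNeighborIds(EdgeDirection.Out)
  .filter { case (_, children) => children.isEmpty }
  .keys.collect().toSet
withPaths.vertices
  .filter { case (id, _) => leaves(id) }
  .collect()
  .foreach { case (_, path) => println(path.mkString(" - ")) }
sc.stop()
```

For a general DAG (vertices with several parents) the merge function would need to collect a set of paths rather than keep one.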
Hi all, how can I tell if my Kryo serializer is actually working? I have a
class which extends Serializable and I have included the following imports:
import com.esotericsoftware.kryo.Kryo
import org.apache.spark.serializer.KryoRegistrator
I also have this class included:
MyRegistrator extends
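A common way to confirm Kryo is actually being used is to turn on spark.kryo.registrationRequired, which makes the job fail fast on any class that gets serialized without a registration. A sketch of the configuration (the class name MyClass is hypothetical, since the original class isn't shown):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.serializer.KryoSerializer

// Hypothetical payload class standing in for the poster's own class.
case class MyClass(id: Long, label: String)

val conf = new SparkConf()
  .setAppName("kryo-check")
  .set("spark.serializer", classOf[KryoSerializer].getName)
  // Fail loudly if anything is serialized without a Kryo registration --
  // the quickest signal that the registrator is (or isn't) being picked up.
  .set("spark.kryo.registrationRequired", "true")
  .registerKryoClasses(Array(classOf[MyClass]))
```

Alternatively, point spark.kryo.registrator at the fully qualified name of the MyRegistrator class mentioned above and keep registrations there.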
I'm currently creating a subgraph using the vertex predicate:
subgraph(vpred = (vid, attr) => attr.split(",")(2) != "999")
but I'm wondering if a subgraph can be created using the edge predicate; if so, a
sample would be great :)
thanks
Dave
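Graph.subgraph does take an edge predicate as well: epred receives an EdgeTriplet, so it can see both endpoint attributes and the edge attribute. A sketch with made-up vertex labels and edge weights:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx._

val sc = new SparkContext(
  new SparkConf().setAppName("subgraph-epred").setMaster("local[*]"))

// Hypothetical graph: vertex attribute is a label, edge attribute a weight.
val vertices = sc.parallelize(Seq((1L, "a"), (2L, "b"), (3L, "c")))
val edges = sc.parallelize(Seq(Edge(1L, 2L, 10), Edge(2L, 3L, 3)))
val graph = Graph(vertices, edges)

// Keep only edges with weight > 5.
val heavy = graph.subgraph(epred = t => t.attr > 5)

// The triplet also exposes t.srcAttr / t.dstAttr, and both predicates
// can be combined in one call.
val mixed = graph.subgraph(
  epred = t => t.attr > 5 && t.srcAttr != t.dstAttr,
  vpred = (vid, attr) => attr != "c")

println(heavy.edges.count())  // 1
sc.stop()
```

Note that vpred still applies implicitly in the sense that subgraph always drops edges whose endpoints were filtered out.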
Hi, looking for a little help with counting the degrees in a graph. Currently
my graph consists of 2 subgraphs and it looks like this:
val vertexArray = Array(
  (1L, (101, "x")),
  (2L, (102, "y")),
  (3L, (103, "y")),
  (4L, (104, "y")),
  (5L, (105, "y")),
  (6L, (106, "x")),
  (7L, (107, "x")),
  (8L, (108, "y"))
)
val edgeArray =
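The edgeArray definition is cut off above, so the edges below are hypothetical; given any edge set, the per-vertex counts come straight from graph.degrees (or inDegrees / outDegrees):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx._

val sc = new SparkContext(
  new SparkConf().setAppName("degrees").setMaster("local[*]"))

val vertexArray = Array(
  (1L, (101, "x")), (2L, (102, "y")), (3L, (103, "y")), (4L, (104, "y")),
  (5L, (105, "y")), (6L, (106, "x")), (7L, (107, "x")), (8L, (108, "y")))

// Hypothetical edges -- the original edgeArray is cut off in the post.
val edgeArray = Array(
  Edge(1L, 2L, 1), Edge(1L, 3L, 1), Edge(2L, 4L, 1),
  Edge(5L, 6L, 1), Edge(6L, 7L, 1), Edge(6L, 8L, 1))

val graph = Graph(sc.parallelize(vertexArray), sc.parallelize(edgeArray))

// degrees counts in-links plus out-links; isolated vertices are omitted.
graph.degrees.collect().sortBy(_._1)
  .foreach { case (id, d) => println(s"vertex $id: degree $d") }
sc.stop()
```

inDegrees and outDegrees have the same VertexRDD[Int] shape, so the same collect/print works for either direction.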
Yes, that's correct: I want the vertex set for each source vertex in the graph.
Which of course leads me on to my next question, which is to add a level to each of
these.
http://apache-spark-user-list.1001560.n3.nabble.com/file/n6383/image1.jpg
For example, the image shows the in- and out-links of the