Hi Andrew,
No, I could not figure out the root cause. This seems to be a non-deterministic
error... I didn't see the same error after rerunning the same program, but I
noticed the same error on a different program.
First I thought that this might be related to SPARK-2878, but @Graham replied
that this looks
I have both SPARK-2878 and SPARK-2893.
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/SPARK-2878-Kryo-serialisation-with-custom-Kryo-registrator-failing-tp7719p8046.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
I am running the code with @rxin's patch in standalone mode. In my case I am
registering org.apache.spark.graphx.GraphKryoRegistrator.
Recently I started to see com.esotericsoftware.kryo.KryoException:
java.io.IOException: failed to uncompress the chunk: PARSING_ERROR. Has
anyone seen this?
The code to generate this bug is here:
https://gist.github.com/npanj/92e949d86d08715bf4bf
(I have also filed this jira ticket:
https://issues.apache.org/jira/browse/SPARK-3190)
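For anyone comparing setups, this is roughly how the registrator was wired in on my side (a minimal sketch, assuming Spark 1.x; the master URL and app name are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch: enable Kryo and register the GraphX classes.
// "local[2]" and the app name are placeholders for illustration.
val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("kryo-parsing-error-repro")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrator", "org.apache.spark.graphx.GraphKryoRegistrator")
val sc = new SparkContext(conf)
```

This is effectively a configuration fragment; it needs a Spark runtime on the classpath to execute.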
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/Graphx-seems-to-be-broken-while
Hi All,
I am getting PARSING_ERROR while running my job on the code checked out at
commit db56f2df1b8027171da1b8d2571d1f2ef1e103b6. I am running this job
on EC2.
Any idea if there is something wrong with my config?
Here is my config:
.set("spark.executor.extraJavaOptions",
--
Hi,
I am trying to set -Xmn to control GC via spark.executor.extraJavaOptions
(as recommended by the tuning guide), but I am getting an error that
spark.executor.extraJavaOptions is not allowed to alter memory settings.
It seems that extraJavaOptions takes just one value, not a list of Java
options.
How can I pass multiple Java options?
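For what it's worth, spark.executor.extraJavaOptions is a single space-separated string, so multiple flags can go in one value; heap size itself has to be set through spark.executor.memory instead. A sketch of what I mean (the flag values are illustrative, and whether -Xmn passes validation may depend on the Spark version):

```properties
# spark-defaults.conf (illustrative values)
spark.executor.memory            4g
spark.executor.extraJavaOptions  -Xmn400m -XX:+PrintGCDetails -XX:+PrintGCTimeStamps
```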
/npanj/appcache/application_1401394632504_0131/spark-local-20140603050956-6728/20/shuffle_0_2_97
(No such file or directory)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
Thanks Ankur. With your fix I see the expected results.
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/Spark-1-0-outerJoinVertices-seems-to-return-null-for-vertex-attributes-when-input-was-partitioned-and-tp6799p6806.html
I am seeing something strange with outerJoinVertices (and triangle count, which
relies on this API).
Here is what I am doing:
1) Created a graph with multiple partitions, i.e. created a graph with
minEdgePartitions (in the API GraphLoader.edgeListFile), where minEdgePartitions
= 1; and use
Correction: in step 4) the predicate is ed.srcAttr != null && ed.dstAttr != null
(used -1 when I changed the attr type to Int).
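To make the setup concrete, here is a minimal sketch of the load-and-join I'm describing (assuming Spark 1.x GraphX; the file path is a placeholder, and the outerJoinVertices here just attaches degrees rather than my actual attributes):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx.{GraphLoader, PartitionStrategy}

// Placeholder path; minEdgePartitions as in step 1 above.
val sc = new SparkContext(
  new SparkConf().setAppName("outerJoin-repro").setMaster("local[2]"))
val graph = GraphLoader.edgeListFile(sc, "edges.txt", minEdgePartitions = 1)
  .partitionBy(PartitionStrategy.RandomVertexCut)

// outerJoinVertices: vertices with no match on the other side arrive as None.
val withDeg = graph.outerJoinVertices(graph.degrees) {
  (vid, oldAttr, degOpt) => degOpt.getOrElse(0)
}
withDeg.vertices.take(5).foreach(println)
```

This sketch needs a Spark runtime to execute, so treat it as an outline of the calls rather than a drop-in program.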
Hi,
For my project I needed to load a graph with edge weights; for this I have
updated GraphLoader.edgeListFile to consider a third column in the input file. I
would like to submit my patch for review so that it can be merged into the
master branch. What is the process for submitting patches?
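In case it helps, the usual Spark flow is a GitHub pull request tied to a JIRA ticket; a sketch of the steps (the fork name, JIRA number, and branch name are placeholders):

```shell
# Placeholders: <your-fork>, SPARK-XXXX
git clone https://github.com/<your-fork>/spark.git
cd spark
git checkout -b SPARK-XXXX-edge-weights
# make the change, run the relevant tests, then:
git commit -am "[SPARK-XXXX] Load edge weights in GraphLoader.edgeListFile"
git push origin SPARK-XXXX-edge-weights
# finally, open a pull request against apache/spark on GitHub,
# referencing the JIRA ticket in the title
```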