Java heap error

2019-05-22 Thread Kumar sp
Hi, I am getting java.lang.OutOfMemoryError: Java heap space. I have increased my driver memory and executor memory, but I am still facing this issue. I am using r4 instances for the driver and the core nodes (16). How can we see which step fails, or whether it is related to GC? Can we pinpoint it to a single place in the code?

Re: Java heap error during matrix multiplication

2017-01-26 Thread Burak Yavuz
Hi, have you tried creating more column blocks? For example: BlockMatrix matrix = cmatrix.toBlockMatrix(100, 100); Is your data randomly spread out, or do you generally have clusters of data points together?
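A minimal Java sketch of that suggestion, assuming the cmatrix from the original post quoted below; the cache() and validate() calls are added here as common precautions and are not part of the reply.

    import org.apache.spark.mllib.linalg.distributed.BlockMatrix;

    // Re-block with 100x100 blocks instead of 100x1000; smaller blocks mean
    // more of them, so each multiply task holds less data in memory at once.
    BlockMatrix matrix = cmatrix.toBlockMatrix(100, 100);
    matrix.cache();      // cache the underlying blocks RDD before multiplying
    matrix.validate();   // sanity-check block dimensions and indices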

Java heap error during matrix multiplication

2017-01-25 Thread Petr Shestov
Hi all! I'm using Spark 2.0.1 with two workers (one executor each) with 20 GB each, and I run the following code:
JavaRDD<MatrixEntry> entries = ...; // filling the data
CoordinateMatrix cmatrix = new CoordinateMatrix(entries.rdd());
BlockMatrix matrix = cmatrix.toBlockMatrix(100, 1000);
BlockMatrix cooc =
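The snippet is cut off at the definition of cooc, and the post never shows its right-hand side. Purely as an illustration (not taken from the post), a co-occurrence style product A^T * A would continue the code above as:

    import org.apache.spark.mllib.linalg.distributed.BlockMatrix;

    // Hypothetical continuation: multiply the matrix by its transpose.
    // `matrix` is the BlockMatrix built above; the original message does not
    // show what `cooc` is actually assigned.
    BlockMatrix cooc = matrix.transpose().multiply(matrix);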

Re: Spark Java Heap Error

2016-09-13 Thread Baktaawar

Re: Spark Java Heap Error

2016-09-13 Thread Baktaawar

Re: Spark Java Heap Error

2016-09-13 Thread neil90
… df.persist(StorageLevel.MEMORY_AND_DISK).
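A minimal Java sketch of that call, assuming an existing SparkSession named spark and a hypothetical input path; in the DataFrame API cache() takes no arguments, so persist() is the method that accepts a storage level.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.storage.StorageLevel;

    // Hypothetical input; the point is the storage level, which lets cached
    // partitions spill to disk instead of triggering a heap error.
    Dataset<Row> df = spark.read().parquet("/path/to/input");
    df.persist(StorageLevel.MEMORY_AND_DISK());
    df.count();   // materializes the cache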

Re: Spark Java Heap Error

2016-09-13 Thread Baktaawar
> Memory is close to half of 16gb available.

Re: Spark Java Heap Error

2016-09-13 Thread neil90
Double check your driver memory in your Spark Web UI; make sure the driver memory is close to half of the 16 GB available.
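A small Java sketch of that check from inside the application (assuming an existing SparkSession named spark); note that spark.driver.memory has to be set before the driver JVM starts, e.g. via spark-submit --driver-memory or spark-defaults.conf, so reading it back alongside the actual max heap is the simplest in-code verification.

    import org.apache.spark.SparkConf;

    // `spark` is an existing SparkSession (assumption).
    SparkConf conf = spark.sparkContext().getConf();
    // The value that was requested (Spark's default is 1g if never set):
    String requested = conf.get("spark.driver.memory", "1g");
    // The heap the driver JVM actually received:
    long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
    System.out.println("spark.driver.memory=" + requested
        + ", driver max heap=" + maxHeapMb + " MB");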

Re: Spark Java Heap Error

2016-09-12 Thread Baktaawar

Re: Spark Java Heap Error

2016-09-09 Thread Baktaawar
> …avaOptions -XX:+PrintGCDetails -Dkey=value > You might need to change your spark.driver.maxResultSize settings if you plan on doing a collect on the entire rdd/dataframe.

Re: Spark Java Heap Error

2016-09-07 Thread neil90
spark.executor.extraJavaOptions -XX:+PrintGCDetails -Dkey=value. You might need to change your spark.driver.maxResultSize setting if you plan on doing a collect on the entire RDD/DataFrame.
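The keys above are real Spark configuration properties; a hedged Java sketch of setting them programmatically (the values are placeholders, and -Dkey=value in the advice is itself just a placeholder for arbitrary system properties):

    import org.apache.spark.SparkConf;

    SparkConf conf = new SparkConf()
        .setAppName("heap-debugging")   // placeholder name
        // Make executor GC activity visible in the executor stdout logs:
        .set("spark.executor.extraJavaOptions", "-XX:+PrintGCDetails")
        // Raise the cap on data returned to the driver (default 1g) if a
        // collect() on the whole RDD/DataFrame is really intended:
        .set("spark.driver.maxResultSize", "4g");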

java heap error

2015-07-15 Thread AlexG

Re: JavaRDD<List<Tuple2>> flatMap Lexicographical Permutations - Java Heap Error

2015-04-30 Thread Dan DeCapria, CivicScience
Thought about it some more, and simplified the problem space for discussions: Given: JavaPairRDD<String, Integer> c1; // c1.count() == 8000. Goal: JavaPairRDD<Tuple2<String,Integer>, Tuple2<String,Integer>> c2; // all lexicographical pairs Where: all lexicographic permutations on c1 ::
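A sketch (not the author's code) of one way to get c2 from c1 as specified: cartesian forms all N*N ordered pairs, and the filter keeps each unordered pair exactly once, leaving (N-1)(N)/2 results. jsc is an existing JavaSparkContext, and the tiny c1 here stands in for the real 8000-element RDD.

    import java.util.Arrays;
    import org.apache.spark.api.java.JavaPairRDD;
    import scala.Tuple2;

    JavaPairRDD<String, Integer> c1 = jsc.parallelizePairs(Arrays.asList(
        new Tuple2<>("a", 1), new Tuple2<>("b", 2), new Tuple2<>("c", 3)));

    // All ordered pairs, then keep only those whose left key sorts strictly
    // before the right key, so each unordered pair appears exactly once.
    JavaPairRDD<Tuple2<String, Integer>, Tuple2<String, Integer>> c2 =
        c1.cartesian(c1)
          .filter(p -> p._1()._1().compareTo(p._2()._1()) < 0);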

Re: JavaRDD<List<Tuple2>> flatMap Lexicographical Permutations - Java Heap Error

2015-04-30 Thread Sean Owen
You fundamentally want (half of) the Cartesian product so I don't think it gets a lot faster to form this. You could implement this on cogroup directly and maybe avoid forming the tuples you will filter out. I'd think more about whether you really need to do this thing, or whether there is
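The cogroup route is not spelled out in the snippet. As a different way to avoid materializing the pairs that the filter would throw away, and given that c1 is only ~8000 rows, one can broadcast it and emit only the surviving pairs. A hypothetical sketch reusing c1 and jsc from the sketch above (assumes the Spark 2.x Java API, where flat-map functions return an Iterator):

    import java.util.ArrayList;
    import java.util.List;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.broadcast.Broadcast;
    import scala.Tuple2;

    // Ship the small RDD's contents to every executor once.
    List<Tuple2<String, Integer>> small = c1.collect();
    Broadcast<List<Tuple2<String, Integer>>> bc = jsc.broadcast(small);

    JavaPairRDD<Tuple2<String, Integer>, Tuple2<String, Integer>> pairs =
        c1.flatMapToPair(left -> {
            List<Tuple2<Tuple2<String, Integer>, Tuple2<String, Integer>>> out = new ArrayList<>();
            for (Tuple2<String, Integer> right : bc.value()) {
                if (left._1().compareTo(right._1()) < 0) {   // keep each unordered pair once
                    out.add(new Tuple2<>(left, right));
                }
            }
            return out.iterator();
        });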

JavaRDD<List<Tuple2>> flatMap Lexicographical Permutations - Java Heap Error

2015-04-30 Thread Dan DeCapria, CivicScience
I am trying to generate all (N-1)(N)/2 lexicographical 2-tuples from a glom() JavaPairRDD<List<Tuple2>>. The construction of these initial Tuple2's JavaPairRDD<AQ,Integer> space is well formed from case classes I provide it (AQ, AQV, AQQ, CT) and is performant; minimized code: SparkConf conf = new