Re: Spark 2.0 Preview After caching query didn't work and can't kill job.

2016-06-21 Thread Gene Pang
Hi, it looks like this is not related to Alluxio. Have you tried running the same job with a different storage backend? Maybe you could increase the Spark JVM heap size to see if that helps your issue. Hope that helps, Gene
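A minimal sketch of the heap-size suggestion above, assuming spark-shell is launched against a standalone cluster; the master URL and memory values are illustrative, not taken from the thread:

    # Driver and executor heap must be set when spark-shell is launched; sizes here are examples only
    ./bin/spark-shell \
      --master spark://spark-master:7077 \
      --driver-memory 8g \
      --executor-memory 4g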

Re: Spark 2.0 Preview After caching query didn't work and can't kill job.

2016-06-15 Thread Chanh Le
Hi everyone, I added more logs for my use case: when I cached all my data (500 million records) and ran a count, I received this: 16/06/16 10:09:25 ERROR TaskSetManager: Total size of serialized results of 27 tasks (1876.7 MB) is bigger than spark.driver.maxResultSize (1024.0 MB) >>> that's weird because I
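A hedged sketch of how that limit could be raised; spark.driver.maxResultSize is a standard Spark setting, but the 4g value is only an example, and pulling fewer results back to the driver is usually the better fix:

    # Set before the job starts, e.g. at spark-shell launch (value illustrative; 0 disables the limit)
    ./bin/spark-shell --conf spark.driver.maxResultSize=4g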

Re: Spark 2.0 Preview After caching query didn't work and can't kill job.

2016-06-15 Thread Chanh Le
Hi Gene, I am using Alluxio 1.1.0 with the Spark 2.0 Preview. I load from Alluxio, cache it, and query; on the second query Spark gets stuck.

Re: Spark 2.0 Preview After caching query didn't work and can't kill job.

2016-06-15 Thread Gene Pang
Hi, Which version of Alluxio are you using? Thanks, Gene

Spark 2.0 Preview After caching query didn't work and can't kill job.

2016-06-14 Thread Chanh Le
I am testing Spark 2.0. I load data from Alluxio and cache it, then query; the first query is fine because it kicks off the cache action, but when I run the query again it gets stuck. I ran this in spark-shell on a cluster of 5 nodes. Has anyone else had this issue?
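A rough sketch of the workflow described above, as it might look in spark-shell against the Spark 2.0 preview; the Alluxio host, port, and data path are hypothetical:

    // Read from Alluxio, cache, then query twice; path and host are hypothetical
    val df = spark.read.parquet("alluxio://alluxio-master:19998/data/events")
    df.cache()    // marks the DataFrame for caching (lazy)
    df.count()    // first action materializes the cache and completes
    df.count()    // the second query is where the reported hang occurs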