From: java8...@hotmail.com
To: user@spark.apache.org
Subject: My first experience with Spark
Date: Thu, 5 Feb 2015 16:03:33 -0500

I am evaluating Spark for our production usage. Our production cluster is
Hadoop 2.2.0 without YARN, so I want to test Spark in a standalone deployment
running alongside Hadoop. What I have in mind is to test a very complex Hive
query, which joins 6 tables, with lots of nested structure ...
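For reference, a minimal sketch of how such a test could be wired up against a
standalone master through the HiveContext API that Spark 1.x shipped at the
time; the master URL, table names, and columns below are made up for
illustration:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object HiveQueryTest {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("HiveQueryTest")
          .setMaster("spark://master-host:7077") // standalone master, not YARN
        val sc = new SparkContext(conf)
        val hive = new HiveContext(sc) // picks up hive-site.xml from the classpath

        // A small two-table join stands in for the 6-table production query.
        val result = hive.sql(
          """SELECT a.id, b.value
            |FROM table_a a
            |JOIN table_b b ON a.id = b.id""".stripMargin)
        result.collect().foreach(println)
        sc.stop()
      }
    }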
... using more time. We plan to make Spark coexist with the Hadoop cluster,
so being able to control its memory usage is important for us. Does Spark
need that much memory?

Thanks,
Yong
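In standalone mode that cap can be set per application; a minimal sketch with
illustrative sizes (spark.executor.memory bounds each executor JVM's heap, and
spark.cores.max limits how many cores the application claims cluster-wide):

    import org.apache.spark.SparkConf

    // Cap this application's footprint on a standalone cluster
    // (the sizes here are illustrative, not recommendations).
    val conf = new SparkConf()
      .setAppName("MemoryCappedQuery")
      .set("spark.executor.memory", "4g") // heap per executor JVM
      .set("spark.cores.max", "8")        // total cores across the cluster

The workers themselves can also be bounded with SPARK_WORKER_MEMORY in
conf/spark-env.sh.
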
Date: Thu, 5 Feb 2015 15:36:48 -0800
Subject: Re: My first experience with Spark
From: deborah.sie...@gmail.com
To: java8