17, 2014 at 1:29 PM, Matt Work Coarr <mattcoarr.w...@gmail.com> wrote:
Thanks Marcelo! This is a huge help!!

Looking at the executor logs (in a vanilla Spark install, I'm finding them
in $SPARK_HOME/work/*)...

It launches the executor, but it looks like the
CoarseGrainedExecutorBackend is having trouble talking to the driver
(exactly what you said!!!).
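For anyone following along, here's a quick way to scan those per-executor logs for the failure. This is just a sketch: it fakes the $SPARK_HOME/work layout in a temp dir with a made-up app id and error line so the commands are runnable anywhere; on a real install you'd point the final grep at your actual $SPARK_HOME.

```shell
# Where standalone-mode executor logs live:
#   $SPARK_HOME/work/<app-id>/<executor-id>/{stdout,stderr}
# Fake the layout so this sketch runs anywhere (app id and the
# error message below are made up for illustration).
SPARK_HOME=$(mktemp -d)
mkdir -p "$SPARK_HOME/work/app-20140717-0001/0"
echo "ERROR CoarseGrainedExecutorBackend: Driver Disassociated" \
  > "$SPARK_HOME/work/app-20140717-0001/0/stderr"

# The actual diagnostic step: scan every executor's stderr for trouble.
grep -iE "error|disassociated" "$SPARK_HOME"/work/*/*/stderr
```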
Have you tried peeking into its log file?
(That error is printed whenever the executors fail to report back to
the driver. Insufficient resources to launch the executor is the most
common cause of that, but not the only one.)
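If it does turn out to be a resource problem, one way to confirm is to cap what the application asks for so it fits comfortably on the worker. A sketch, assuming a standalone master; the master hostname and application jar are placeholders:

```shell
# Sketch: request less than the worker offers (here, well under
# 8 cores / 24GB) so the executor can actually be scheduled.
# "master-host" and "my-app.jar" are placeholders.
spark-submit \
  --master spark://master-host:7077 \
  --executor-memory 4g \
  --total-executor-cores 4 \
  my-app.jar
```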
On Tue, Jul 15, 2014 at 2:43 PM, Matt Work Coarr <mattcoarr.w...@gmail.com> wrote:
Hello spark folks,
I have a simple Spark cluster setup but I can't get jobs to run on it. I
am using the standalone mode.
One master, one slave. Both machines have 32GB ram and 8 cores.
The slave is setup with one worker that has 8 cores and 24GB memory
allocated.
My application requires 2
()
    print "Spark AMI: " + ami
except:
    print >> stderr, "Could not resolve AMI at: " + ami_path
    sys.exit(1)
return ami
Thanks
Best Regards
On Fri, Jun 6, 2014 at 2:14 AM, Matt Work Coarr mattcoarr.w...@gmail.com
wrote:
Thanks Akhil! I'll give that a try!
How would I go about creating a new AMI image that I can use with the
spark-ec2 commands? I can't seem to find any documentation. I'm looking for a
list of steps that I'd need to perform to make an Amazon Linux image ready
to be used by the spark-ec2 tools.
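In case it helps, the snapshot step itself can be done with the AWS CLI once an instance is prepared. This is only a sketch of that one step (the instance id, AMI name, and description are placeholders), and it says nothing about the Spark-specific setup the spark-ec2 scripts expect on the image:

```shell
# Hypothetical sketch: turn a prepared EC2 instance into a reusable AMI.
# The instance id and names below are placeholders.
aws ec2 create-image \
  --instance-id i-0123456789abcdef0 \
  --name "amazon-linux-spark-ec2" \
  --description "Amazon Linux prepped for the spark-ec2 tools"
```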
I've been reading through the spark
Hi, I'm attempting to run spark-ec2 launch on AWS. My AWS instances
would be in our EC2 VPC (which seems to be causing a problem).
The two security groups MyClusterName-master and MyClusterName-slaves have
already been setup with the same ports open as the security group that
spark-ec2 tries to
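For reference, when reusing existing VPC security groups, the ports a standalone cluster commonly needs can be opened by hand with the AWS CLI. A hedged sketch: the group id and CIDR are placeholders, and the port list (22 for SSH, 7077 for the master, 8080/8081 for the web UIs) may not match everything spark-ec2 opens for your version:

```shell
# Hypothetical sketch: open common standalone-Spark ports in an
# existing VPC security group. Group id and CIDR are placeholders;
# verify against what spark-ec2 actually configures.
for port in 22 7077 8080 8081; do
  aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp --port "$port" --cidr 10.0.0.0/16
done
```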