Hi,
I am running Spark on Amazon EMR with a 12-node cluster: 1 master and
11 core nodes.
However, the cluster is not being utilized to its full capacity.
Would you have suggestions for altering the Spark cluster settings so that the
cluster is used to its maximum capacity?
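One setting worth checking on EMR (a sketch, assuming it is available in your EMR release) is the `maximizeResourceAllocation` property of the `spark` configuration classification, which tells EMR to derive executor cores and memory from the core nodes' hardware when the cluster is created:

```json
[
  {
    "Classification": "spark",
    "Properties": {
      "maximizeResourceAllocation": "true"
    }
  }
]
```

This is supplied as a configuration JSON when launching the cluster; for an already-running cluster you would instead set `spark.executor.instances`, `spark.executor.cores`, and `spark.executor.memory` explicitly at submit time.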
Resource Manager:
Hi,
I am using Amazon EMR to run Spark and would like to reproduce something
similar to the graph at the end of this link:
https://docs.cloud.databricks.com/docs/latest/featured_notebooks/Wikipedia%20Clickstream%20Data.html
Can someone help me with how to import the d3 library in Spark? I am
Hi,
I am trying to understand the best parameter settings for processing a 12.5 GB
file with my Spark cluster. I am using a 3-node cluster, with 8 cores and 30
GiB of RAM on each node.
I used Cloudera's "top 5 mistakes" article and tried the following
configurations:
spark.executor.instances
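For a cluster of this shape, the sizing arithmetic from that guidance can be sketched as follows. This is only a rough calculation under the usual assumptions (reserve 1 core and 1 GiB per node for the OS and Hadoop daemons, target about 5 cores per executor, reserve one executor slot for the YARN application master, and allow roughly 7% of executor memory for YARN overhead); the exact numbers for your workload may differ.

```python
# Rough executor sizing for a 3-node cluster with 8 cores and 30 GiB RAM per node,
# following the common YARN sizing rules of thumb (assumptions, not EMR defaults).

NODES = 3
CORES_PER_NODE = 8
RAM_PER_NODE_GIB = 30

usable_cores = CORES_PER_NODE - 1                # leave 1 core per node for OS/daemons -> 7
cores_per_executor = 5                           # rule of thumb for good HDFS throughput
executors_per_node = usable_cores // cores_per_executor      # 7 // 5 = 1
total_executors = NODES * executors_per_node - 1             # reserve 1 slot for the AM -> 2

usable_mem = RAM_PER_NODE_GIB - 1                # leave 1 GiB per node for OS/daemons -> 29
mem_per_executor_gib = usable_mem // executors_per_node      # 29 GiB per executor
executor_memory_gib = int(mem_per_executor_gib * 0.93)       # ~7% YARN overhead -> 26

print(total_executors, cores_per_executor, executor_memory_gib)
```

Which would correspond to something like `--num-executors 2 --executor-cores 5 --executor-memory 26G` on the `spark-submit` line.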
Hi,
Here is a link that might help:
https://databricks.com/blog/2016/01/25/deep-learning-with-spark-and-tensorflow.html
Thanks
Pooja
From: Nick Pentreath [mailto:nick.pentre...@gmail.com]
Sent: Wednesday, February 24, 2016 1:11 AM
To: user@spark.apache.org
Subject: Re: About Tensor