Hello Arun,
Thank you for the descriptive response.
And thank you for providing the sample file too. It certainly is a great
help.
Sincerely,
Ashish
On Mon, Jul 13, 2015 at 10:30 PM, Arun Verma arun.verma...@gmail.com
wrote:
PFA sample file
On Mon, Jul 13, 2015 at 7:37 PM, Arun Verma
Many thanks for your response.
Regards,
Ashish
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Is-it-possible-to-change-the-default-port-number-7077-for-spark-tp23774p23797.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
PFA sample file
On Mon, Jul 13, 2015 at 7:37 PM, Arun Verma arun.verma...@gmail.com wrote:
Hi,
Yes, it is. To do it, follow these steps:
1. cd spark/installation/path/.../conf
2. cp spark-env.sh.template spark-env.sh
3. vi spark-env.sh
4. Set SPARK_MASTER_PORT=9000 (or any other available port)
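The steps above can be sketched as a small shell session. This runs in a temporary directory standing in for Spark's real conf/ directory, and 9000 is just an example port:

```shell
# Stand-in for the Spark conf/ directory (the real one lives under
# your Spark installation path).
CONF=$(mktemp -d)
echo '# Template options' > "$CONF/spark-env.sh.template"

# Step 2: copy the template to the active config file.
cp "$CONF/spark-env.sh.template" "$CONF/spark-env.sh"

# Steps 3-4: set the master port (here via echo instead of an editor).
echo 'export SPARK_MASTER_PORT=9000' >> "$CONF/spark-env.sh"

# Confirm the setting was written.
grep SPARK_MASTER_PORT "$CONF/spark-env.sh"
```

After restarting the master, it should listen on the new port instead of the default 7077.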
PFA sample file. I hope this will help.
On Mon, Jul 13, 2015 at 7:24 PM, ashishdutt
Q1: You can change the port number of the master in the file
conf/spark-defaults.conf. I don't know what the impact on a Cloudera
distro would be, though.
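For illustration, a minimal sketch of what that entry might look like, assuming the master runs on a host named spark-master (the hostname and port here are examples, not values from this cluster):

```
# conf/spark-defaults.conf
# Point applications at the master's URL (host and port are examples).
spark.master    spark://spark-master:7456
```

Note that this property tells applications where to find the master; the port the master itself binds to is set via SPARK_MASTER_PORT in spark-env.sh.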
Q2: Yes: a Spark worker needs to be present on each node that you want to
make available to the driver.
Q3: You can submit an application
Hello all,
In my lab a colleague installed and configured Spark 1.3.0 on a 4-node
cluster in a CDH 5.4 environment. The default port number for our Spark
configuration is 7456. I have been trying to SSH to spark-master using
this port number, but it fails every time with a JVM timed-out error
SSH by default runs on port 22. 7456 is the port where the master is
listening, so any Spark app should be able to connect to the master using
that port.
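To make the distinction concrete, here is a sketch of how an application would address the master on that port. The hostname spark-master is an assumption; substitute your master node's actual address. The spark-submit invocation is shown as a comment since it requires a running cluster:

```shell
# Hypothetical values: adjust the host to your spark-master's address.
MASTER_HOST=spark-master
MASTER_PORT=7456
MASTER_URL="spark://${MASTER_HOST}:${MASTER_PORT}"
echo "$MASTER_URL"

# With Spark installed, an application would connect to the master via:
#   spark-submit --master "$MASTER_URL" --class MyApp my-app.jar
```

SSH, by contrast, would still target port 22 on the same host (e.g. ssh user@spark-master), independently of the Spark master port.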
On 11 Jul 2015 13:50, ashishdutt ashish.du...@gmail.com wrote: