Hi All,
Would some expert be able to help me with this issue?
I would appreciate your kind help very much!
Thank you!
Zhiliang
On Sunday, September 27, 2015 7:40 PM, Zhiliang Zhu wrote:
Hi Alexis, Gavin,
Thanks very much for your kind comment.
It is working; we are doing the same thing every day. But the remote server
needs to be able to talk with the ResourceManager.
If you are using spark-submit, you will also need to specify the Hadoop conf
directory in your env variables. Spark relies on that to locate the
cluster's ResourceManager.
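For example, a minimal sketch of what that looks like on the remote machine. The conf path, application class, and jar name below are placeholder assumptions, not taken from your setup:

```shell
# Assumed path: wherever you copied the cluster's conf files
# (the directory must contain yarn-site.xml and core-site.xml).
export HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop
export YARN_CONF_DIR="$HADOOP_CONF_DIR"

# spark-submit reads yarn-site.xml from HADOOP_CONF_DIR to find the
# ResourceManager, so no master host needs to be hard-coded.
# (Commented out here because it needs a live cluster;
#  com.example.MyApp and app.jar are placeholders.)
# spark-submit --master yarn --deploy-mode cluster \
#   --class com.example.MyApp app.jar
```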
Hi All,
I would like to submit a Spark job from another remote machine outside the
cluster. I also copied the hadoop/spark conf files onto the remote machine;
a Hadoop job can then be submitted, but a Spark job cannot.
In spark-env.sh, it may be because SPARK_LOCAL_IP is not properly set, or
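For reference, a minimal spark-env.sh sketch for the remote machine; the IP address and conf path here are placeholder assumptions:

```shell
# spark-env.sh (sketch). 192.0.2.10 stands in for the remote machine's
# address as seen from the cluster -- replace with your real one.
export SPARK_LOCAL_IP=192.0.2.10
# Point Spark at the conf files copied from the cluster (assumed path).
export HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop
```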
Print out your env variables and check them first.
Sent from my iPhone
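For example, one quick way to check (a sketch; it prints `<unset>` for anything missing):

```shell
# List any Hadoop/Spark/YARN-related variables currently set
# (|| true keeps the script going when grep finds nothing).
env | grep -iE 'hadoop|spark|yarn' || true

# Check the specific variables spark-submit depends on:
echo "HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-<unset>}"
echo "SPARK_LOCAL_IP=${SPARK_LOCAL_IP:-<unset>}"
```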
> On Sep 25, 2015, at 18:43, Zhiliang Zhu wrote:
>
> Hi All,
>
> I would like to submit a Spark job from another remote machine outside the
> cluster,
> I also copied the hadoop/spark conf files under
Hi Yue,
Thanks very much for your kind reply.
I would like to submit a Spark job remotely from another machine outside the
cluster, and the job will run on YARN, similar to how a Hadoop job is already
done. Could you confirm that this would work for Spark?
Do you mean that I should print those variables