such file or directory: 'spark-submit': 'spark-submit'
I think Airflow tries to run the Spark job on its own. How can I configure it
so that it runs the Spark code on the Spark master?
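That error usually means spark-submit is not on the PATH of the machine where
Airflow executes the task. Independent of that, the cluster can be targeted
from the application itself; a minimal Scala sketch, assuming a standalone
master at the placeholder URL spark://master:7077:

import org.apache.spark.sql.SparkSession

object SubmitToMaster {
  def main(args: Array[String]): Unit = {
    // Point the driver at the standalone master explicitly, so the job
    // runs on the cluster instead of locally. "spark://master:7077" is
    // an assumed host/port for your standalone master.
    val spark = SparkSession.builder()
      .appName("airflow-triggered-job")
      .master("spark://master:7077")
      .getOrCreate()

    spark.range(10).count() // trivial action to verify the cluster is used
    spark.stop()
  }
}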
--
Uğur Sopaoğlu
We are trying to create a cluster consisting of 4 machines. The cluster will
be used by multiple users. How can we configure it so that users can submit
jobs from their personal computers, and is there a free tool you can suggest
to streamline the procedure?
--
Uğur Sopaoğlu
Dear Hemant,
I have built a Spark cluster using Docker containers. Can I use Apache Livy to
submit a job to the master node?
hemant singh wrote (11 Jun 2018 13:55):
> You can explore Livy https://dzone.com/articles/quick-start-with-apache-livy
>
>> On Mon, Jun 11, 2018 at 3:35 PM,
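Livy's REST API also answers the multi-user question above, since any machine
that can reach the Livy server can submit jobs over plain HTTP. A minimal
Scala sketch of submitting a batch, assuming Livy listens on its default port
8998 at a placeholder host livy-host and the jar is already on HDFS:

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object LivySubmit {
  def main(args: Array[String]): Unit = {
    // Batch job description for Livy's POST /batches endpoint.
    // The jar path and class name below are placeholders.
    val payload =
      """{"file": "hdfs:///jobs/my-app.jar", "className": "com.example.MyApp"}"""

    val request = HttpRequest.newBuilder()
      .uri(URI.create("http://livy-host:8998/batches")) // assumed Livy endpoint
      .header("Content-Type", "application/json")
      .POST(HttpRequest.BodyPublishers.ofString(payload))
      .build()

    val response = HttpClient.newHttpClient()
      .send(request, HttpResponse.BodyHandlers.ofString())

    // Livy replies with a JSON description of the new batch session.
    println(response.body())
  }
}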
part-0,
part-1. I want to collect all of them into one file.
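Each part-N file corresponds to one partition of the RDD, so the usual fix is
to reduce to a single partition before saving. A minimal sketch, reusing the
word-count pipeline quoted further down in this digest (the output path is a
placeholder); note that coalesce(1) funnels all data through a single task, so
it only suits small outputs:

// One output file per partition, so collapse to a single partition first.
val counts = sc.textFile("Sample.txt")
  .flatMap(line => line.split(" "))
  .map(word => (word, 1))
  .reduceByKey(_ + _)

counts.coalesce(1).saveAsTextFile("hdfs://master:8020/user/abc-single")

Alternatively, hdfs dfs -getmerge can concatenate the part files into one
local file after the job finishes.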
2017-10-20 16:43 GMT+03:00 Marco Mistroni <mmistr...@gmail.com>:
> Hi
> Could you just create an RDD/DF out of what you want to save and store it
> in HDFS?
> Hth
>
> On Oct 20, 2017 9:44 AM, "
Hi all,
In the word count example,
val textFile = sc.textFile("Sample.txt")
val counts = textFile.flatMap(line => line.split(" "))
  .map(word => (word, 1))
  .reduceByKey(_ + _)
counts.saveAsTextFile("hdfs://master:8020/user/abc")
I want to write collection of
Hello,
I have a very simple problem. Whenever I run a Spark job, I must copy the jar
file to all worker nodes. Is there an easier way to do this?
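For what it's worth, spark-submit already ships the application jar from the
driver to the executors in standalone mode, so manual copying should not be
needed; extra dependencies can be distributed with addJar. A minimal sketch,
with placeholder paths:

import org.apache.spark.sql.SparkSession

object JarShipping {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("jar-shipping-example")
      .getOrCreate()

    // Ship an extra dependency to every executor for this job; the path
    // may also be an hdfs:// URI so workers fetch it themselves.
    // "hdfs:///libs/extra-dep.jar" is a placeholder.
    spark.sparkContext.addJar("hdfs:///libs/extra-dep.jar")

    spark.range(100).count()
    spark.stop()
  }
}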
--
Uğur Sopaoğlu
Thanks
--
Uğur Sopaoğlu