Thanks Ted,
that helped. It turned out I had formatted the server name incorrectly:
I had to add spark:// in front of it.
Cheers,
Andrejs
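(For anyone hitting the same thing: a Spark standalone master URL must carry the spark:// scheme, e.g. spark://myServerName:7077, where 7077 is the default standalone master port. A bare host name has no scheme, which is one quick way to catch the mistake. A minimal sketch of such a check, using only java.net.URI; the host name is the one from the thread:)

```java
import java.net.URI;

public class MasterUrlCheck {
    // Returns true if the master URL carries an explicit scheme
    // (e.g. spark://host:port). A bare host name like "myServerName"
    // has no scheme and is rejected.
    static boolean hasScheme(String master) {
        return URI.create(master).getScheme() != null;
    }

    public static void main(String[] args) {
        System.out.println(hasScheme("myServerName"));              // false
        System.out.println(hasScheme("spark://myServerName:7077")); // true
    }
}
```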
On 11/11/15 14:26, Ted Yu wrote:
Please take a look
at launcher/src/test/java/org/apache/spark/launcher/SparkLauncherSuite.java
to see how app.getInputStream() and app.getErrorStream() are handled.
In the master branch, the suite is located
at core/src/test/java/org/apache/spark/launcher/SparkLauncherSuite.java
FYI
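(The underlying pattern is the same as for any child process: drain the launched app's stdout and stderr on separate threads, otherwise a full pipe buffer can silently block the child and you see neither output nor errors. A minimal sketch of that pattern with a plain ProcessBuilder, not the actual Spark test code; with SparkLauncher you would pass app.getInputStream() and app.getErrorStream() instead:)

```java
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;

public class DrainStreams {
    // Reads a stream to exhaustion on its own thread so the child
    // process never blocks on a full stdout/stderr pipe buffer.
    static Thread drain(InputStream in, StringBuilder sink) {
        Thread t = new Thread(() -> {
            try (BufferedReader r =
                     new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = r.readLine()) != null) {
                    sink.append(line).append('\n');
                }
            } catch (Exception ignored) {
                // stream closed when the child exits
            }
        });
        t.start();
        return t;
    }

    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder("echo", "hello from child").start();
        StringBuilder out = new StringBuilder();
        StringBuilder err = new StringBuilder();
        Thread tOut = drain(p.getInputStream(), out);
        Thread tErr = drain(p.getErrorStream(), err);
        p.waitFor();
        tOut.join();
        tErr.join();
        System.out.print(out);
        System.err.print(err);
    }
}
```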
On Wed, Nov 11, 2015 at 5:57 AM, Andrejs
<andrejs.ab...@insight-centre.org> wrote:
Hi all,
I'm trying to call a Python script from a Scala application; part of
my code is below. My problem is that it doesn't work, but it also
doesn't produce any error message, so I can't debug it.
val spark = new SparkLauncher()
  .setSparkHome("/home/user/spark-1.4.1-bin-hadoop2.6")
  .setAppResource("/home/user/MyCode/forSpark/wordcount.py")
  .addPyFile("/home/andabe/MyCode/forSpark/wordcount.py")
  .setMaster("myServerName")
  .setAppName("pytho2word")
  .launch();
println("finishing")
spark.waitFor();
println("finished")
Any help is appreciated.
Cheers,
Andrejs