Hi Akhil and all
My previous code has some problems: all the executors are looping and
running the same command, which is not what I expected. Previous code:
val shellcompare = List("run", "sort.sh")
val shellcompareRDD = sc.makeRDD(shellcompare)
val result = List("aggregate", "result")
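If the intent is for different executors to run different commands (rather than every task looping over the same one), one option is to parallelize the command list itself, so each task picks up one entry. A minimal sketch, assuming an existing SparkContext `sc`; the script names here are illustrative, not the actual ones:

```scala
import scala.sys.process._

// One command string per RDD element. With numSlices equal to the list
// size, each task handles exactly one command, so different executors
// can run different scripts concurrently instead of repeating one.
val commands = List("sh run.sh", "sh sort.sh", "sh aggregate.sh")
val cmdRDD = sc.makeRDD(commands, commands.length)
val exitCodes = cmdRDD.map(cmd => (cmd, cmd.!)).collect()  // (command, exit code) pairs
```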
If you open up the driver UI (running on 4040), you can see multiple tasks
per stage which will be happening concurrently. If it is a single task, and
you want to increase the parallelism, then you can simply do a re-partition.
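For reference, a minimal sketch of what that re-partition looks like, assuming an existing SparkContext `sc` (the numbers are illustrative):

```scala
// Start with few partitions -> few concurrent tasks per stage.
val rdd = sc.parallelize(1 to 1000, 2)

// repartition shuffles the data into more partitions, so more tasks
// can run concurrently across the executors.
val wider = rdd.repartition(8)
println(wider.partitions.size)  // 8
```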
Thanks
Best Regards
On Tue, May 26, 2015 at 8:27 AM,
Thanks Akhil, I checked the job UI again; my app is running
concurrently on all the executors. But some of the tasks hit an I/O exception. I
will continue investigating this.
java.io.IOException: Failed to create local dir in
Hi,
You can use pipe operator, if you are running shell script/perl script on
some data. More information on my blog
http://blog.madhukaraphatak.com/pipe-in-spark/.
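For example, a minimal sketch of the pipe operator, assuming an existing SparkContext `sc` (the shell command here is illustrative):

```scala
// Each partition's elements are written to the command's stdin, one per
// line; the command's stdout lines become the elements of the result RDD.
val nums = sc.parallelize(Seq("3", "1", "2"), 1)
val sorted = nums.pipe("sort -n")  // runs the shell command on each executor
sorted.collect().foreach(println)
```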
Regards,
Madhukara Phatak
http://datamantra.io/
On Mon, May 25, 2015 at 8:02 AM, luohui20...@sina.com wrote:
Thanks Akhil,
thanks, madhu and Akhil
I modified my code as below; however, I think it is not so distributed. Do
you guys have a better idea to run this app more efficiently and in a more distributed way?
So I add some comments with my understanding:
import org.apache.spark._
import www.celloud.com.model._
object GeneCompare3 {
Can you tell us what exactly you are trying to achieve?
Thanks
Best Regards
On Mon, May 25, 2015 at 5:00 PM, luohui20...@sina.com wrote:
thanks, madhu and Akhil
I modified my code as below; however, I think it is not so distributed.
Do you guys have a better idea to run this app more
I am currently trying to run some shell scripts in my Spark app, hoping it runs more
concurrently in my Spark cluster. However, I am not sure whether my code will
run concurrently on my executors. Diving into my code, you can see that I am
trying to
1. split both db and sample into 21 small files. That
You mean you want to execute some shell commands from Spark? Here's
something I tried a while back. https://github.com/akhld/spark-exploit
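In the same spirit, a minimal sketch of running a shell command from the executors rather than the driver, assuming an existing SparkContext `sc` (the command and partition count are illustrative):

```scala
import scala.sys.process._

// mapPartitions runs once per partition, on whichever executor holds it,
// so the external command executes across the cluster rather than only
// on the driver. Here each partition reports its executor's hostname.
val hosts = sc.parallelize(1 to 4, 4).mapPartitions { _ =>
  Iterator(Seq("sh", "-c", "hostname").!!.trim)  // run command, capture stdout
}
hosts.collect().distinct.foreach(println)
```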
Thanks
Best Regards
On Sun, May 24, 2015 at 4:53 PM, luohui20...@sina.com wrote:
hello there
I am trying to run an app in which part of it needs to
Thanks Akhil,
your code is a big help to me, because a Perl script is exactly the
thing I want to try running in Spark. I will give it a try.
Thanks & best regards!
San.Luo
----- Original Message -----
From: Akhil Das ak...@sigmoidanalytics.com
To: 罗辉