That’s a great alternative!
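For reference, the property Tim mentions below can be passed straight to spark-submit. This is a sketch only: the image name, dispatcher address, and jar URL are hypothetical placeholders, not values from this thread.

```shell
# Sketch: run both driver and executors inside a Docker image on Mesos.
# Image name, dispatcher host, and jar URL are placeholders.
spark-submit \
  --master mesos://spark-dispatcher.example.com:7077 \
  --deploy-mode cluster \
  --conf spark.mesos.executor.docker.image=example/spark:1.6.0 \
  --class org.apache.spark.examples.SparkPi \
  http://example.com/jars/spark-examples.jar
```

With spark.mesos.executor.docker.image set, the tasks start inside the container instead of first downloading the Spark artifacts into the sandbox.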

Kind regards,

Radek Gruchalski

ra...@gruchalski.com
de.linkedin.com/in/radgruchalski/

Confidentiality:
This communication is intended for the above-named person and may be 
confidential and/or legally privileged.
If it has come to you in error you must take no action based on it, nor must 
you copy or show it to anyone; please delete/destroy and inform the sender 
immediately.



On Tuesday, 15 March 2016 at 17:19, Timothy Chen wrote:

> You can launch the driver and executor in docker containers as well by 
> setting spark.mesos.executor.docker.image to the image you want to use to 
> launch them.
>  
> Tim
>  
> On Mar 15, 2016, at 8:49 AM, Radoslaw Gruchalski <ra...@gruchalski.com> wrote:
>  
> > Pradeep,
> >  
> > You can mount a Spark directory as a volume, but that means you have to 
> > have Spark deployed on every agent.
> >  
> > Another option is to place Spark in HDFS, assuming you have HDFS 
> > available, but that too will download a copy to the sandbox.
> >  
> > I'd prefer the former.
> >  
> > _____________________________
> > From: Pradeep Chhetri <pradeep.chhetr...@gmail.com>
> > Sent: Tuesday, March 15, 2016 4:41 pm
> > Subject: Apache Spark Over Mesos
> > To: <user@mesos.apache.org>
> >  
> >  
> > Hello,  
> >  
> > I am able to run Apache Spark over Mesos. It's quite simple to run the 
> > Spark Dispatcher over Marathon and have it run the Spark Executor (which 
> > I guess can also be called the Spark Driver) as a Docker container.
> >  
> > I have a query regarding this:  
> >  
> > All Spark tasks are spawned by first downloading the Spark artifacts. I 
> > was wondering whether there is some way to start them as Docker 
> > containers too, which would save the time spent downloading the Spark 
> > artifacts. I am running Spark in fine-grained mode.
> >  
> > I have attached a screenshot of a sample job  
> >  
> > <Screen Shot 2016-03-15 at 15.15.06.png>  
> > Thanks,  
> >  
> > --  
> > Pradeep Chhetri  
> >  

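The volume-mount approach Radoslaw describes above maps onto the spark.mesos.executor.docker.volumes property. A sketch, assuming Spark is pre-deployed at the same path (here the hypothetical /opt/spark) on every agent:

```shell
# Sketch: mount a Spark distribution that already exists on every Mesos
# agent into the executor container (format: host_path:container_path:mode).
# Image name, paths, and jar URL are placeholders.
spark-submit \
  --conf spark.mesos.executor.docker.image=example/spark:1.6.0 \
  --conf spark.mesos.executor.docker.volumes=/opt/spark:/opt/spark:ro \
  --class org.apache.spark.examples.SparkPi \
  http://example.com/jars/spark-examples.jar
```

The HDFS alternative he mentions corresponds to pointing spark.executor.uri at an hdfs:// location of the Spark tarball, which, as noted, the agent still fetches into each sandbox.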