seems to be working
>
>
> Please help
>
>
> Thanks
>
> Sathish
>
> On Thu, Jan 28, 2016 at 6:52 PM Mao Geng <m...@sumologic.com> wrote:
>
>> From my limited knowledge, only limited options such as network mode,
>> volumes, and portmaps can be passed through. Se
> Any option to pass docker run
> parameters from spark?
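For reference, the pass-through options Mao mentions correspond to Spark-on-Mesos configuration properties. A minimal sketch of how they would be supplied to spark-submit; the image name, paths, and ports below are hypothetical placeholders, not values from this thread:

```shell
# Sketch: passing Docker options through Spark's Mesos configuration.
# All concrete values here are illustrative assumptions.
spark-submit \
  --master mesos://zk://master.example.com:2181/mesos \
  --conf spark.mesos.executor.docker.image=example/spark-executor:latest \
  --conf "spark.mesos.executor.docker.volumes=/host/data:/container/data:ro" \
  --conf "spark.mesos.executor.docker.portmaps=8080:8080:tcp" \
  my_job.py
```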
> On Thu, Jan 28, 2016 at 12:26 PM Mao Geng <m...@sumologic.com> wrote:
>
>> Sathish,
>>
>> I guess the mesos resources are not enough to run your job. You might
>> want to check the mesos log to figure out
> On Wed, Jan 27, 2016 at 9:58 AM Sathish Kumaran Vairavelu <
> vsathishkuma...@gmail.com> wrote:
>
>> Thanks a lot for your info! I will try this today.
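One way to act on Mao's suggestion above is to compare what the Mesos master is offering against what the job asks for, and cap the job if it doesn't fit. A sketch, assuming a hypothetical master host and illustrative resource values:

```shell
# Inspect the resources the Mesos master reports (hostname is a
# hypothetical placeholder; state.json was the endpoint in this era).
curl -s http://mesos-master.example.com:5050/master/state.json \
  | python -m json.tool | grep -A 3 '"resources"'

# If offers are smaller than the job's demands, cap the job so it fits
# within an offer (values are illustrative):
spark-submit \
  --conf spark.cores.max=4 \
  --conf spark.executor.memory=2g \
  my_job.py
```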
>> On Wed, Jan 27, 2016 at 9:29 AM Mao Geng <m...@sumologic.com> wrote:
>>
>>> Hi Sathish,
>>>
>>> The d
> The EC2
> instance that I am using has the AWS profile/IAM included. Should we build
> the docker image with any AWS profile settings or --net=host docker option
> takes care of it?
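On the --net=host question: with host networking the container shares the host's network namespace, so the EC2 instance metadata service that serves IAM instance-profile credentials is reachable from inside the container without baking credentials into the image. A quick sanity check, assuming a hypothetical image name:

```shell
# With --net=host, the container can reach the instance metadata
# service at 169.254.169.254, so instance-profile credentials work
# inside the container. Image name below is a placeholder.
docker run --rm --net=host example/spark-executor:latest \
  curl -s http://169.254.169.254/latest/meta-data/iam/security-credentials/
```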
>
> Please help
>
>
> Thanks
>
> Sathish
>
>> On Tue, Jan 26, 2016 at 9:04
sSchedulerBackend.scala
but didn't understand it well...
I'd appreciate it if anyone could shed some light on this.
Thanks,
Mao Geng
>
> Can you try --jars to include those jars?
>
> Best Regards,
>
> Jerry
>
> Sent from my iPhone
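Jerry's --jars suggestion would look roughly like this on the command line; the jar paths and class name are hypothetical placeholders:

```shell
# Sketch: shipping dependency jars to executors with --jars
# (comma-separated list; all names here are illustrative).
spark-submit \
  --master mesos://master.example.com:5050 \
  --class com.example.MyJob \
  --jars /path/to/dep-one.jar,/path/to/dep-two.jar \
  my-job.jar
```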
>
> On 26 Jan, 2016, at 7:02 pm, Mao Geng <m...@sumologic.com> wrote:
>
> Hi there,
>
> I am trying to run Spark on Mesos using a Docker image as