Hi Michael,

I did so, but that's not exactly the problem: my driver has all its dependencies packaged, and only the executors fetch the tgz via spark.executor.uri. The strange thing is that I do see the org.apache.mesos:mesos-1.0.0-shaded-protobuf dependency packaged in the final dist of my app… So everything should work in theory.
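For reference, the executors point at the distribution with something like this in spark-defaults.conf (the URL here is a placeholder for wherever the tgz is actually hosted):

  spark.executor.uri  http://some-host/path/spark-2.1.0-bin-hadoop2.7.tgz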
 





On Tue, Jan 10, 2017 7:22 PM, Michael Gummelt <mgumm...@mesosphere.io> wrote:
Just build with -Pmesos 
http://spark.apache.org/docs/latest/building-spark.html#building-with-mesos-support
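For example, per that page:

  ./build/mvn -Pmesos -DskipTests clean package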

On Tue, Jan 10, 2017 at 8:56 AM, Olivier Girardot <o.girar...@lateral-thoughts.com> wrote:
I had the same problem; I added spark-mesos as a dependency and now I get:
[2017-01-10 17:45:16,575] {bash_operator.py:77} INFO - Exception in thread "main" java.lang.NoClassDefFoundError: Could not initialize class org.apache.mesos.MesosSchedulerDriver
[2017-01-10 17:45:16,576] {bash_operator.py:77} INFO - at org.apache.spark.scheduler.cluster.mesos.MesosSchedulerUtils$class.createSchedulerDriver(MesosSchedulerUtils.scala:105)
[2017-01-10 17:45:16,576] {bash_operator.py:77} INFO - at org.apache.spark.scheduler.cluster.mesos.MesosCoarseGrainedSchedulerBackend.createSchedulerDriver(MesosCoarseGrainedSchedulerBackend.scala:48)
[2017-01-10 17:45:16,576] {bash_operator.py:77} INFO - at org.apache.spark.scheduler.cluster.mesos.MesosCoarseGrainedSchedulerBackend.start(MesosCoarseGrainedSchedulerBackend.scala:155)
[2017-01-10 17:45:16,577] {bash_operator.py:77} INFO - at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156)
[2017-01-10 17:45:16,577] {bash_operator.py:77} INFO - at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
[2017-01-10 17:45:16,577] {bash_operator.py:77} INFO - at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2313)
[2017-01-10 17:45:16,577] {bash_operator.py:77} INFO - at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
[2017-01-10 17:45:16,577] {bash_operator.py:77} INFO - at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
[2017-01-10 17:45:16,578] {bash_operator.py:77} INFO - at scala.Option.getOrElse(Option.scala:121)
[2017-01-10 17:45:16,578] {bash_operator.py:77} INFO - at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
Is there any other dependency to add for Spark 2.1.0?
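For reference, what I added looks like this in sbt (assuming a Scala 2.11 build):

  libraryDependencies += "org.apache.spark" %% "spark-mesos" % "2.1.0"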

 





On Tue, Jan 10, 2017 1:26 AM, Abhishek Bhandari <abhi10...@gmail.com> wrote:
Glad that you found it.
On Mon, Jan 9, 2017 at 3:29 PM, Richard Siebeling <rsiebel...@gmail.com> wrote:
Probably found it: it turns out that Mesos support now has to be explicitly enabled when building Spark. I assumed I could use the old build command that I used for building Spark 2.0.0 and didn't see the two lines added in the documentation. Maybe these kinds of changes could be noted in the changelog, under changes of behaviour or changes in the build process or something like that.

kind regards,
Richard
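For anyone hitting the same thing, the difference is just the new profile; any other flags are whatever you normally build with:

  # Spark 2.0.x: Mesos support was compiled in by default
  ./build/mvn -DskipTests clean package
  # Spark 2.1.0: Mesos support needs an explicit profile
  ./build/mvn -Pmesos -DskipTests clean package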

On 9 January 2017 at 22:55, Richard Siebeling <rsiebel...@gmail.com> wrote:
Hi,
I'm setting up Apache Spark 2.1.0 on Mesos and I am getting a "Could not parse Master URL: 'mesos://xx.xx.xxx.xxx:5050'" error. Mesos itself is running fine (both the master and the slave; it's a single-machine configuration).

I really don't understand why this is happening, since the same configuration with Spark 2.0.0 runs fine within Vagrant. Could someone please help?

thanks in advance,
Richard
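For reference, I'm launching more or less like this (the class and jar names here are placeholders):

  ./bin/spark-submit \
    --master mesos://xx.xx.xxx.xxx:5050 \
    --class com.example.Main \
    my-app.jar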






-- 
Abhishek J Bhandari
Mobile No. +1 510 493 6205 (USA)
Mobile No. +91 96387 93021 (IND)
R & D Department
Valent Software Inc. CA
Email: abhis...@valent-software.com

Olivier Girardot | Associé
o.girar...@lateral-thoughts.com
+33 6 24 09 17 94
 


-- 
Michael Gummelt
Software Engineer
Mesosphere


