You'll need to build SparkR to match the Spark version deployed on the cluster. You can do that by changing the Spark version in SparkR's build.sbt [1]. If you are using the Maven build, you'll need to edit pom.xml instead.
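For example, if the cluster is running Spark 1.0.1, the spark-core dependency in build.sbt would be pointed at that version before rebuilding the package. A rough sketch of what the relevant sbt lines might look like (the version here is just an example, and the actual setting names and layout in SparkR-pkg's build.sbt may differ):

    // Match the Spark version running on your cluster (1.0.1 is only an example)
    val sparkVersion = "1.0.1"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % sparkVersion
    )

The Maven build works the same way: set the Spark version in pom.xml so the rebuilt SparkR package links against the same Spark jars the cluster is running.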
Thanks
Shivaram

[1] https://github.com/amplab-extras/SparkR-pkg/blob/master/pkg/src/build.sbt#L20

On Mon, Jul 14, 2014 at 6:19 PM, cjwang <c...@cjwang.us> wrote:
> I tried installing the latest Spark 1.0.1 and SparkR couldn't find the
> master either. I restarted with Spark 0.9.1 and SparkR was able to find
> the master. So, there seemed to be something that changed after Spark 1.0.0.