Hi everyone,
I've been experimenting with Spark and am somewhat of a newbie. I was
wondering whether there is any way to use a custom cluster manager
implementation with Spark. As I understand it, the built-in cluster modes
currently supported are standalone, Mesos, and YARN.
I am trying to disable eviction warnings when using sbt, such as these:
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn]  * com.typesafe.sbt:sbt-git:0.6.1 -> 0.6.2
[warn]  * com.typesafe.sbt:sbt-site:0.7.0
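For reference, sbt 0.13.6 and later expose an `evictionWarningOptions` setting that controls these warnings; a minimal sketch of silencing them in `build.sbt` (untested here, and the exact setting/API may differ across sbt versions):

```scala
// build.sbt — sketch only (sbt 0.13.6+): silence eviction warnings
// by replacing the default options with an empty configuration.
evictionWarningOptions in update := EvictionWarningOptions.empty
```

Note this hides the warnings rather than resolving the underlying version conflicts, so it is worth checking the evicted versions are actually compatible first.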
FYI: Spark 1.2.1rc2 does not work on Windows!
On creating a Spark context, I get the following log output on my Windows
machine:
INFO org.apache.spark.SparkEnv:59 - Registering BlockManagerMaster
ERROR org.apache.spark.util.Utils:75 - Failed to create local root dir in
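The failure above is Spark trying to create its local scratch directory. Assuming the root cause is how the default temp-directory path is resolved on Windows, one workaround sometimes suggested is to point `spark.local.dir` at a known-writable directory; a sketch only, not a confirmed fix (the path is an example):

```properties
# conf/spark-defaults.conf — sketch only; C:/tmp/spark is an example
# path, use any directory that exists and is writable by your user.
spark.local.dir    C:/tmp/spark
```

The same property can also be set programmatically on the SparkConf before creating the context.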
This looks like a pretty serious problem, thanks! Glad people are testing on
Windows.
Matei
On Jan 31, 2015, at 11:57 AM, MartinWeindel martin.wein...@gmail.com wrote:
> FYI: Spark 1.2.1rc2 does not work on Windows!
> On creating a Spark context you get following log output on my Windows
Hi,
I'm setting up a dev environment with Intellij IDEA 14. I selected profile
scala-2.10, maven-3, hadoop 2.4, hive, hive 0.13.1. The compilation passed.
But when I try to run LogQuery in the examples, I hit the issue below:
Connected to the target VM, address: '127.0.0.1:37182', transport: 'socket'
Have you read / followed this?
https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-BuildingSparkinIntelliJIDEA
Cheers
On Sat, Jan 31, 2015 at 8:01 PM, Yafeng Guo daniel.yafeng@gmail.com
wrote:
> Hi,
> I'm setting up a dev environment with Intellij
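The wiki page above recommends building once on the command line so that generated sources exist before the project is opened in IDEA. A hedged sketch of a matching Maven invocation, using the profiles named in the original message (profile names may differ between Spark versions, so check the wiki for your branch):

```shell
# Sketch only: build with the profiles mentioned above so generated
# sources exist before importing the project into IntelliJ IDEA.
mvn -Phadoop-2.4 -Phive -Phive-0.13.1 -DskipTests clean package
```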
Do we have any open JIRA issues to add automated testing on Windows to
Jenkins? I assume that's something we want to do.
On Sat Jan 31 2015 at 10:37:42 PM Matei Zaharia matei.zaha...@gmail.com
wrote:
> This looks like a pretty serious problem, thanks! Glad people are testing
> on Windows.
> Matei