You can just use the Maven build for now, even for Spark 1.0.0.
Matei
On Jun 2, 2014, at 5:30 PM, Mohit Nayak wrote:
Hey,
Yup that fixed it. Thanks so much!
Is this the only solution, or could this be resolved in future versions of Spark?
On Mon, Jun 2, 2014 at 5:14 PM, Sean Owen wrote:
If it's the SBT build, I suspect you are hitting
https://issues.apache.org/jira/browse/SPARK-1949
Can you try to apply the excludes you see at
https://github.com/apache/spark/pull/906/files to your build to see if
it resolves it?
If so, I think this would be a helpful change to commit.
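The PR linked above has the actual exclude list; purely as a hedged sketch of what such SBT excludes look like (the organization/name pairs below are my assumptions about where duplicate servlet classes typically arrive via hadoop-core's transitive dependencies, not the PR's exact list):

```scala
// Illustrative SBT fragment (not the exact fix from the PR): drop the
// servlet-api copies that hadoop-core pulls in transitively, so only one
// copy of javax.servlet.* ends up on the test classpath.
val hadoopCore = "org.apache.hadoop" % "hadoop-core" % V.hadoop % "provided" excludeAll(
  ExclusionRule(organization = "javax.servlet"),
  ExclusionRule(organization = "org.mortbay.jetty", name = "servlet-api"),
  ExclusionRule(organization = "org.mortbay.jetty", name = "servlet-api-2.5")
)
```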
On Tue, Jun 3, 2014, Mohit Nayak wrote:
Hey,
Thanks for the reply.
I am using SBT. Here is a list of my dependencies:
val sparkCore = "org.apache.spark" % "spark-core_2.10" % V.spark
val hadoopCore = "org.apache.hadoop" % "hadoop-core" % V.hadoop % "provided"
val jodaTime = "com.github.nscala-time" %% "
This ultimately means you have a couple copies of the servlet APIs in
the build. What is your build like (SBT? Maven?) and what exactly are
you depending on?
On Tue, Jun 3, 2014 at 12:21 AM, Mohit Nayak wrote:
Hi,
I've upgraded to Spark 1.0.0 and am no longer able to run any tests. They all throw a
*java.lang.SecurityException: class
"javax.servlet.FilterRegistration"'s signer information does not match
signer information of other classes in the same package*
I'm using Hadoop-core 1.0.4 and running this locally.
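To see exactly which jars are supplying the conflicting class, a quick classpath scan can help. Below is a hypothetical standalone Scala sketch (the cache directories it scans, `~/.ivy2` and `~/.m2`, are assumptions; adjust to wherever your build keeps its jars):

```scala
// Hypothetical diagnostic: walk the local dependency caches and report
// every jar that bundles javax.servlet.FilterRegistration. Two or more
// hits means duplicate (and differently signed) servlet API jars are
// landing on the test classpath, which triggers the SecurityException.
import java.io.File
import java.util.zip.ZipFile

object FindServletJars {
  val Target = "javax/servlet/FilterRegistration.class"

  def jarsContaining(root: File): Seq[File] = {
    // Recursively collect *.jar files under root.
    def walk(f: File): Seq[File] =
      if (f.isDirectory) Option(f.listFiles).toSeq.flatten.flatMap(walk)
      else if (f.getName.endsWith(".jar")) Seq(f)
      else Seq.empty
    // Keep only the jars that actually contain the target class entry.
    walk(root).filter { jar =>
      try {
        val zf = new ZipFile(jar)
        try zf.getEntry(Target) != null
        finally zf.close()
      } catch { case _: Exception => false } // skip unreadable/corrupt jars
    }
  }

  def main(args: Array[String]): Unit = {
    val home = new File(sys.props("user.home"))
    Seq(new File(home, ".ivy2"), new File(home, ".m2"))
      .filter(_.exists)
      .flatMap(jarsContaining)
      .foreach(println)
  }
}
```

If this prints more than one jar, excluding all but one of them from the build is the usual cure.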