[error] (mllib/*:credentials) org.xml.sax.SAXParseException; lineNumber: 4;
columnNumber: 57; Element type "settings" must be followed by either
attribute specifications, ">" or "/>".
[error] (core/*:credentials) org.xml.sax.SAXParseException; lineNumber: 4;
columnNumber: 57; Element type "settings" must be followed by either
attribute specifications, ">" or "/>".
[error] Total time: 0 s, completed Jul 19, 2014 6:09:24 PM
On Sat, Jul 19, 2014 at 11:02 AM, Debasish Das wrote:
Hi,
Is sbt still used for master compilation? I could compile for
2.3.0-cdh5.0.2 using maven following the instructions from the website:
http://spark.apache.org/docs/latest/building-with-maven.html
But when I am trying to use sbt for local testing I am getting
some weird errors...
See here for a similar issue:
http://mail-archives.apache.org/mod_mbox/spark-user/201401.mbox/%3CCALNFXi2hBSyCkPpnBJBYJnPv3dSLNw8VpL_6caEn3yfXCykO=w...@mail.gmail.com%3E
On Apr 6, 2014 4:10 PM, "Sean Owen" wrote:
scala.None certainly isn't new in 2.10.4; it's ancient:
http://www.scala-lang.org/api/2.10.3/index.html#scala.None$
Surely this is some other problem?
On Sun, Apr 6, 2014 at 6:46 PM, Koert Kuipers wrote:
i suggest we stick to 2.10.3, since otherwise it seems that (surprisingly)
you force everyone to upgrade
On Sun, Apr 6, 2014 at 1:46 PM, Koert Kuipers wrote:
also, i thought scala 2.10 was binary compatible, but that does not seem to
be the case. the spark artifacts for scala 2.10.4 don't work for me, since we
are still on scala 2.10.3, but when i recompiled and published spark with
scala 2.10.3 everything was fine again.
errors i see:
java.lang.ClassNotFoundException ...
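For context, every 2.10.x release shares the _2.10 binary suffix, so sbt's
%% operator resolves the same artifact whether the build runs 2.10.3 or
2.10.4 -- which is why the incompatibility above is surprising. A minimal
sketch of pinning the compiler release in build.sbt (the artifact name and
versions here are illustrative, not taken from this thread):

// Pin the exact Scala release; %% still appends the _2.10 binary suffix
// when resolving the published spark artifact.
scalaVersion := "2.10.3"
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"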
patrick,
this has happened before, that a commit introduced java 7 code/dependencies
and your build didn't fail, i think it was when reynold upgraded to jetty 9.
must be that your entire build infrastructure runs java 7...
On Sat, Apr 5, 2014 at 6:06 PM, Patrick Wendell wrote:
Yeah spark builds are fine...
For solvers we are planning to use breeze optimization since it has most of
the core functions we will need and we can enhance it further (QP solver
for example).
Right now sparse kmeans in spark mllib uses breeze and that might not even
need this line of code. But ...
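For reference, a minimal sketch of the breeze.optimize API being discussed,
modeled on Breeze's documented LBFGS example; the objective function and
dimensions are made up for illustration:

import breeze.linalg.DenseVector
import breeze.optimize.{DiffFunction, LBFGS}

// minimize ||x - 3||^2; calculate returns (value, gradient)
val f = new DiffFunction[DenseVector[Double]] {
  def calculate(x: DenseVector[Double]) = {
    val d = x - 3.0
    (d dot d, d * 2.0)
  }
}

val lbfgs = new LBFGS[DenseVector[Double]](maxIter = 100, m = 3)
val xmin = lbfgs.minimize(f, DenseVector.zeros[Double](5))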
That's a Breeze question, no? You should not need to compile Breeze
yourself to compile Spark -- why do that?
That method indeed only exists in Java 7. But Breeze seems to target
Java 6 as expected:
https://github.com/scalanlp/breeze/blob/master/build.sbt#L59
I see this particular line of code ...
that's confusing. it seems to me the breeze dependency has been compiled
with java 6, since the mllib tests passed fine for me with java 6.
On Sun, Apr 6, 2014 at 12:00 PM, Debasish Das wrote:
Hi Koert,
How do I specify that in sbt?
Is this the correct way?
javacOptions ++= Seq("-target", "1.6", "-source", "1.6")
Breeze project for example compiles fine with jdk7, fails with jdk6, and
the function it fails on:
[error] /home/debasish/github/breeze/src/main/scala/breeze/util/package
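For reference, a minimal sketch of those settings assuming sbt 0.13 and
Scala 2.10; note that scalac takes its own flag, and that "-target 1.6"
alone does not prevent calls to Java-7-only APIs:

// build.sbt sketch: emit Java 6 bytecode from both compilers.
javacOptions ++= Seq("-source", "1.6", "-target", "1.6")
scalacOptions += "-target:jvm-1.6"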
classes compiled with java7 run fine on java6 if you specified "-target
1.6". however, if that's the case, generally you should also be able to
compile it with java 6 just fine.
something compiled with java7 with "-target 1.7" will not run on java 6.
On Sat, Apr 5, 2014 at 9:10 PM, Debasish Das wrote:
With jdk7 I could compile it fine:
java version "1.7.0_51"
Java(TM) SE Runtime Environment (build 1.7.0_51-b13)
Java HotSpot(TM) 64-Bit Server VM (build 24.51-b03, mixed mode)
What happens if, say, I take the jar and try to deploy it on the ancient
CentOS 6 default JVM on the cluster?
java -version
java versi...
Will do. I'm just finishing a recompile to check for anything else like this.
The reason is that the tests run with Java 7 (like lots of us do, including
me), so they used the Java 7 classpath and found the class.
It's possible to use Java 7 with the Java 6 -bootclasspath. Or just
use Java 6.
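A sketch of what that bootclasspath setup might look like in sbt; the
rt.jar path is an assumption and should point at a local JDK 6 install:

// Compile on JDK 7 but resolve core-library symbols against Java 6's
// rt.jar, so Java-7-only APIs fail at compile time instead of at runtime.
javacOptions ++= Seq(
  "-source", "1.6",
  "-target", "1.6",
  "-bootclasspath", "/usr/lib/jvm/java-6-openjdk/jre/lib/rt.jar"
)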
@patrick our cluster still has java6 deployed...and I compiled using jdk6...
Sean is looking into it...this api is in java7 but not java6...
On Sat, Apr 5, 2014 at 3:06 PM, Patrick Wendell wrote:
If you want to submit a hot fix for this issue specifically please do. I'm
not sure why it didn't fail our build...
On Sat, Apr 5, 2014 at 2:30 PM, Debasish Das wrote:
I verified this is happening for both CDH4.5 and 1.0.4...My deploy
environment is Java 6...so Java 7 compilation is not going to help...
Is this the PR which caused it?
Andre Schumacher, commit fbebaed, "Spark parquet improvements": A few
improvements to the Parquet support for SQL queries: - Instea...
I can compile with Java 7...let me try that...
On Sat, Apr 5, 2014 at 2:19 PM, Sean Owen wrote:
That method was added in Java 7. The project is on Java 6, so I think
this was just an inadvertent error in a recent PR (it was the 'Spark
parquet improvements' one).
I'll open a hot-fix PR after looking for other stuff like this that
might have snuck in.
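Illustration only, since the thread doesn't name the offending method:
java.nio.file is a typical example of an API that exists in Java 7 but not
Java 6, which is exactly the failure mode described above.

// Hypothetical example: java.nio.file.Files was added in Java 7, so this
// compiles under JDK 7 but breaks a build against the Java 6 classpath.
import java.nio.file.Files
val tmpDir = Files.createTempDirectory("parquet-test")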
I am synced with apache/spark master but getting an error in spark/sql
compilation...
Is the master broken?
[info] Compiling 34 Scala sources to
/home/debasish/spark_deploy/sql/core/target/scala-2.10/classes...
[error]
/home/debasish/spark_deploy/sql/core/src/main/scala/org/apache/spark/sql/parquet