Hey All,
Just an update. Josh, Andrew, and others are working to reproduce
SPARK-4498 and fix it. Other than that issue no serious regressions
have been reported so far. If we are able to get a fix in for that
soon, we'll likely cut another RC with the patch.
Continued testing of RC1 is definitely appreciated.
Oops, my previous response wasn't sent properly to the dev list. Here you go,
for archiving.
Yes, you can. Scala classes are compiled down to ordinary classes in JVM
bytecode. Take a look at this: https://twitter.github.io/scala_school/java.html
Note that questions like this are not exactly what this dev list is for.
Yes, they are compiled to classes in JVM bytecode just the same. You
may find the generated code from Scala looks a bit strange and uses
Scala-specific classes, but it's certainly possible to treat them like
other Java classes.
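As a rough sketch of what this looks like in practice: suppose a Scala class `class Counter { def increment(n: Int): Int }` has been compiled; from Java's point of view the result is just a class that can be subclassed and overridden. The `Counter` and `LoggingCounter` classes below are hypothetical stand-ins (the "Scala" class is written here in equivalent Java so the example is self-contained), not code from the Spark tree:

```java
// Hypothetical stand-in for a compiled Scala class:
//   class Counter { private var count = 0; def increment(n: Int): Int = { count += n; count } }
// After compilation, Java sees an ordinary class with an ordinary method.
class Counter {
    private int count = 0;
    public int increment(int n) { count += n; return count; }
}

// A plain Java subclass can extend it and override the method, because
// Scala methods are public and non-final by default.
class LoggingCounter extends Counter {
    @Override
    public int increment(int n) {
        int result = super.increment(n);
        System.out.println("count is now " + result);
        return result;
    }
}

public class Interop {
    public static void main(String[] args) {
        Counter c = new LoggingCounter(); // polymorphism works as usual
        c.increment(2); // prints "count is now 2"
        c.increment(3); // prints "count is now 5"
    }
}
```

The caveats in the thread still apply: traits, companion objects, and default arguments compile to Scala-specific shapes that look stranger from Java than a plain class like this one.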
On Tue, Dec 2, 2014 at 5:22 AM, Niranda Perera wrote:
> Hi,
>
> Can t
I'm having no problems with the build or zinc on my Mac. I use zinc
from "brew install zinc".
On Tue, Dec 2, 2014 at 3:02 AM, Stephen Boesch wrote:
> Mac as well. Just found the problem: I had created an alias to zinc a
> couple of months back. Apparently that is not happy with the build anymore.
Hi,
Can the Scala classes in the Spark source code be inherited (and other OOP
concepts applied) in Java classes?
I want to customize some part of the code, but I would like to do it in a
Java environment.
Rgds
--
*Niranda Perera*
Software Engineer, WSO2 Inc.
Mobile: +94-71-554-8430
Twitter: @n1r44
Hello,
I'm running Spark on a cluster and I want to monitor how many nodes/cores
are active at different (specific) points of the program.
Is there any way to do this?
Thanks,
Isca
I used the following for brew:
http://repo.typesafe.com/typesafe/zinc/com/typesafe/zinc/dist/0.3.0/zinc-0.3.0.tgz
After starting zinc, I issued the same mvn command but didn't encounter the
error you saw.
FYI
On Mon, Dec 1, 2014 at 8:18 PM, Stephen Boesch wrote:
> The zinc src zip for 0.3.5.3
The zinc src zip for 0.3.5.3 was downloaded and exploded. Then I ran
sbt dist/create. Zinc is being launched from
dist/target/zinc-0.3.5.3/bin/zinc
2014-12-01 20:12 GMT-08:00 Ted Yu :
> I use zinc 0.2.0 and started zinc with the same command shown below.
>
> I don't observe such error.
I use zinc 0.2.0 and started zinc with the same command shown below.
I don't observe such an error.
How did you install zinc-0.3.5.3?
Cheers
On Mon, Dec 1, 2014 at 8:00 PM, Stephen Boesch wrote:
>
> Can anyone assist with how to run zinc with the latest Maven build?
>
> I am starting zinc a
Can anyone assist with how to run zinc with the latest Maven build?
I am starting zinc as follows:
/shared/zinc-0.3.5.3/dist/target/zinc-0.3.5.3/bin/zinc -scala-home
$SCALA_HOME -nailed -start
The pertinent env vars are:
19:58:11/lib $echo $SCALA_HOME
/shared/scala
19:58:14/lib $which scal
Mac as well. Just found the problem: I had created an alias to zinc a
couple of months back. Apparently that is not happy with the build anymore.
No problem now that the issue has been isolated - just need to fix my zinc
alias.
2014-12-01 18:55 GMT-08:00 Ted Yu :
> I tried the same command on M
I tried the same command on MacBook and didn't experience the same error.
Which OS are you using?
Cheers
On Mon, Dec 1, 2014 at 6:42 PM, Stephen Boesch wrote:
> It seems there are some additional settings required to build Spark now.
> This should be a snap for most of you out there to spot what I am missing.
I already tried the solutions they provided; they did not work out.
On 12/2/14 8:17 AM, Dinesh J. Weerakkody wrote:
Hi Lochana,
can you please go through this mail thread [1]? I haven't tried it, but it
may be useful.
[1]
http://apache-spark-user-list.1001560.n3.nabble.com/Packaging-a-spark-job-using-maven-td5615.html
Hi Lochana,
can you please go through this mail thread [1]? I haven't tried it, but it
may be useful.
[1]
http://apache-spark-user-list.1001560.n3.nabble.com/Packaging-a-spark-job-using-maven-td5615.html
On Mon, Dec 1, 2014 at 4:28 PM, Lochana Menikarachchi
wrote:
> I have spark core and mllib as dep
It seems there are some additional settings required to build Spark now.
This should be a snap for most of you out there to spot what I am missing.
Here is the command line I have traditionally used:
mvn -Pyarn -Phadoop-2.3 -Phive install compile package -DskipTests
That command line is, however, now failing.
I'll send out a reminder next week, but I wanted to give a heads up: I'll
be bringing down the entire Jenkins infrastructure for reboots and system
updates.
please let me know if there are any conflicts with this, thanks!
shane
+0.9 from me. Tested it on Mac and Windows (someone has to do it) and while
things work, I noticed a few recent scripts don't have Windows equivalents,
namely https://issues.apache.org/jira/browse/SPARK-4683 and
https://issues.apache.org/jira/browse/SPARK-4684. The first one at least would
be g
No, it should support any data source that has a schema and can produce
rows.
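As a toy sketch of that idea (the `Relation` interface below is an illustrative stand-in, not the actual Spark sources API): a relation only needs to be able to describe its schema and produce rows, and anything that can do both qualifies as a source.

```java
import java.util.Arrays;
import java.util.List;

// Illustrative stand-in for the concept of a "relation": it must expose
// a schema and be able to produce rows. Not the real Spark API.
interface Relation {
    List<String> schema();        // column names (types elided for brevity)
    List<Object[]> buildScan();   // produce the rows
}

// Any backing store works: a CSV file, a REST endpoint, an in-memory
// list -- it does not have to be an RDBMS.
class InMemoryRelation implements Relation {
    public List<String> schema() {
        return Arrays.asList("id", "name");
    }
    public List<Object[]> buildScan() {
        return Arrays.asList(
            new Object[] {1, "alpha"},
            new Object[] {2, "beta"});
    }
}

public class SourceSketch {
    public static void main(String[] args) {
        Relation r = new InMemoryRelation();
        System.out.println(r.schema());           // prints [id, name]
        System.out.println(r.buildScan().size()); // prints 2
    }
}
```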
On Mon, Dec 1, 2014 at 1:34 AM, Niranda Perera wrote:
> Hi Michael,
>
> About this new data source API, what type of data sources would it
> support? Does it have to be RDBMS necessarily?
>
> Cheers
>
> On Sat, Nov 29,
Hi everyone,
There’s an open bug report related to Spark standalone which could be a
potential release-blocker (pending investigation / a bug fix):
https://issues.apache.org/jira/browse/SPARK-4498. This issue seems
non-deterministic and only affects long-running Spark standalone deployments, so
The inaugural Spark Summit East (spark-summit.org/east), an event to bring
the Apache Spark community together, will be in New York City on March 18,
2015. The call for submissions is currently open, but will close this
Friday December 5, at 11:59pm PST. The summit is looking for talks that
will
I have spark-core and mllib as dependencies for a Spark-based OSGi
service. When I call the model-building method through a unit test
(without OSGi) it works OK. When I call it through the OSGi service,
nothing happens. I tried adding the Spark assembly jar. Now it throws the
following error:
An err
Hi Michael,
About this new data source API, what type of data sources would it support?
Does it have to be RDBMS necessarily?
Cheers
On Sat, Nov 29, 2014 at 12:57 AM, Michael Armbrust
wrote:
> You probably don't need to create a new kind of SchemaRDD. Instead I'd
> suggest taking a look at th
+1 (non-binding)
built from source
fired up a spark-shell against YARN cluster
ran some jobs using parallelize
ran some jobs that read files
clicked around the web UI
On Sun, Nov 30, 2014 at 1:10 AM, GuoQiang Li wrote:
> +1 (non-binding)