I remember seeing this too, but it seemed to be transient. Try
compiling again. In my case I recall that IJ was still reimporting
some modules when I tried to build. I don't see this error in general.
On Thu, Jan 8, 2015 at 10:38 PM, Bill Bejeck bbej...@gmail.com wrote:
I was having the same
That worked, thx
On Thu, Jan 8, 2015 at 6:17 PM, Sean Owen so...@cloudera.com wrote:
I remember seeing this too, but it seemed to be transient. Try
compiling again. In my case I recall that IJ was still reimporting
some modules when I tried to build. I don't see this error in general.
On
Hi, TD and other streaming developers,
While looking at the implementation of the actor-based receiver
(ActorReceiver.scala), I found several messages that are not
mentioned in the documentation:
case props: Props =>
  val worker = context.actorOf(props)
  logInfo("Started receiver worker at:
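For readers following along: in Spark 1.x, the supervisor pattern behind those messages was exposed through StreamingContext.actorStream together with the ActorHelper trait. A minimal sketch, assuming that era's API (EchoReceiver and the receiver name are made-up examples, not anything from ActorReceiver.scala itself):

```scala
import akka.actor.{Actor, Props}
import org.apache.spark.streaming.receiver.ActorHelper

// Hypothetical actor that forwards the strings it receives into Spark.
// ActorHelper.store() hands data to the enclosing ActorReceiver, whose
// internal Props/worker messages are the ones discussed above.
class EchoReceiver extends Actor with ActorHelper {
  def receive = {
    case s: String => store(s) // push one record into the stream
  }
}

// Wiring it up (ssc is an existing StreamingContext):
// val lines = ssc.actorStream[String](Props[EchoReceiver], "echo-receiver")
```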
Side question: Should this section
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-IDESetup
in
the wiki link to Useful Developer Tools
https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools?
On Thu Jan 08 2015 at 6:19:55 PM Sean Owen
Hi Folks,
Apologies for cross posting :(
As some of you may already know, @ApacheCon NA 2015 is happening in Austin,
TX April 13th-16th.
This email is specifically written to attract all folks interested in
Science and Healthcare... this is an official call to arms! I am aware that
there are
Nick - yes. Do you mind moving it? I should have put it in the
Contributing to Spark page.
On Thu, Jan 8, 2015 at 3:22 PM, Nicholas Chammas
nicholas.cham...@gmail.com wrote:
Side question: Should this section
Actually I went ahead and did it.
On Thu, Jan 8, 2015 at 10:25 PM, Patrick Wendell pwend...@gmail.com wrote:
Nick - yes. Do you mind moving it? I should have put it in the
Contributing to Spark page.
On Thu, Jan 8, 2015 at 3:22 PM, Nicholas Chammas
nicholas.cham...@gmail.com wrote:
Side
I believe you're running into an erasure issue which we found in
DecisionTree too. Check out:
https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/mllib/tree/RandomForest.scala#L134
That retags RDDs which were created from Java to prevent the exception
you're running
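The underlying problem can be reproduced without Spark: a collection built behind an erased type parameter is backed by an `Array[Object]` at runtime, and casting it to a primitive-specialized array fails. A minimal sketch of the idea (`retagCopy` is a made-up helper for illustration; Spark's private `RDD.retag` re-creates the RDD under the correct ClassTag rather than copying an array):

```scala
import scala.reflect.ClassTag
import scala.util.Try

// Data that arrived through a Java API has lost its element type:
// the backing array is Object[], not double[].
val fromJava: Array[AnyRef] =
  Array[AnyRef](java.lang.Double.valueOf(1.0), java.lang.Double.valueOf(2.0))

// Casting the container directly throws ClassCastException at runtime.
val directCast = Try(fromJava.asInstanceOf[Array[Double]].apply(0))
assert(directCast.isFailure)

// "Retagging" rebuilds the collection under a precise ClassTag instead,
// analogous to what RandomForest does with input.retag(classOf[LabeledPoint]).
def retagCopy[T: ClassTag](xs: Array[AnyRef]): Array[T] =
  xs.map(_.asInstanceOf[T]) // ClassTag lets map allocate a real Array[T]

assert(retagCopy[Double](fromJava).sum == 3.0)
```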
I was having the same issue, and that helped. But now I get the following
compilation error when trying to run a test from within IntelliJ (v14):
/Users/bbejeck/dev/github_clones/bbejeck-spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/dsl/package.scala
Error:(308, 109) polymorphic
Could one of the admins take a look at PR 3872 (JIRA 3299) submitted on 1/1
Hi devs,
I'd like to ask if anybody has experience using IntelliJ 14 to step
into Spark code. Whatever I try, I get a compilation error:
Error:scalac: bad option: -P:/home/jakub/.m2/repository/org/scalamacros/
paradise_2.10.4/2.0.1/paradise_2.10.4-2.0.1.jar
Project is set up by Patrick's
Hi Andrew,
Patrick Wendell and Andrew Or have committed previous patches related to
Mesos. Maybe they would be good committers to look at it?
RJ
On Mon, Jan 5, 2015 at 6:40 PM, Andrew Ash and...@andrewash.com wrote:
Hi Spark devs,
I'm interested in having a committer look at a PR [1] for
I don't think this makes sense. Teradata is a standard (if parallel) RDBMS,
while Spark is used for non-relational workloads.
What could make sense is to deploy Spark on Teradata Aster. Aster is a
database cluster that can call external programs via the STREAM operator.
That said Spark/Scala app
Here it is:
[centos] $
/home/jenkins/tools/hudson.tasks.Maven_MavenInstallation/Maven_3.0.5/bin/mvn
-DHADOOP_PROFILE=hadoop-2.4 -Dlabel=centos -DskipTests -Phadoop-2.4
-Pyarn -Phive clean package
You can find the above in
Very interesting approach. Thanks for sharing it!
On Thu, Jan 8, 2015 at 5:30 PM, Enno Shioji eshi...@gmail.com wrote:
FYI I found this approach by Ooyala.
/** Instrumentation for Spark based on accumulators.
*
* Usage:
* val instrumentation = new
Depending on your use cases. If the use case is to extract small amount of
data out of teradata, then you can use the JdbcRDD and soon a jdbc input
source based on the new Spark SQL external data source API.
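To make the JdbcRDD suggestion concrete, here is a hedged sketch against a generic JDBC source. The URL, table, and column names are placeholders; this assumes the `org.apache.spark.rdd.JdbcRDD` API of that era, whose query must contain exactly two `?` bind parameters for the partition bounds:

```scala
import java.sql.DriverManager
import org.apache.spark.SparkContext
import org.apache.spark.rdd.JdbcRDD

// sc is an existing SparkContext; the JDBC URL and query are placeholders.
def loadRows(sc: SparkContext): JdbcRDD[(Long, String)] =
  new JdbcRDD(
    sc,
    () => DriverManager.getConnection("jdbc:teradata://host/db"), // placeholder URL
    "SELECT id, name FROM some_table WHERE ? <= id AND id <= ?",  // two '?' required
    1L,       // lower bound of the partition column
    1000000L, // upper bound
    10,       // numPartitions: the bounds are split into this many ranges
    rs => (rs.getLong(1), rs.getString(2)) // map each ResultSet row
  )
```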
On Wed, Jan 7, 2015 at 7:14 AM, gen tang gen.tan...@gmail.com wrote:
Hi,
I have a
Thanks for the suggestion. Can anyone offer advice on the ClassCastException
when going from Java to Scala? Why does calling JavaRDD.rdd() and then
collect() result in this exception?
On Thu, Jan 8, 2015 at 4:13 PM, Yana Kadiyska yana.kadiy...@gmail.com
wrote:
How about
Thanks for the suggestion. Can anyone offer advice on the ClassCastException
when going from Java to Scala? Why does calling JavaRDD.rdd() and then
collect() result in this exception?