i just turned up a new jenkins slave (amp-jenkins-worker-01) to ensure it
builds properly. these machines have half the ram, the same number of
processors, and more disk, which will hopefully help us achieve more than
the ~15-20% system utilization we're getting on the current
You might also try out the recently added support for views.
On Mon, Dec 8, 2014 at 9:31 PM, Jianshi Huang jianshi.hu...@gmail.com
wrote:
Ah... I see. Thanks for pointing it out.
Then it means we cannot mount external table using customized column
names. hmm...
Then the only option left is
Patrick, I've nearly completed a basic build out for the SPARK-4501 issue
(at https://github.com/brennonyork/spark/tree/SPARK-4501) and it would be
great to get your initial read on it. Per this thread I need to add in the
-scala-home call to zinc, but it's close to ready for a PR.
On 12/5/14, 2:10
Hi all – a utility that I’ve found useful several times now when working with
RDDs is to be able to reason about segments of the RDD.
For example, if I have two large RDDs and I want to combine them in a way that
would be intractable in terms of memory or disk storage (e.g. a cartesian) but
a
`zipWithIndex` is both compute-intensive and breaks Spark's
transformations-are-lazy model, so it is probably not appropriate to add
this to the public RDD API. If `zipWithIndex` weren't already what I
consider to be broken, I'd be much friendlier to building something more on
top of it, but I
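For background on why `zipWithIndex` is eager: it has to run a job up front to count the elements in every partition before it can assign globally consistent indices. A plain-Scala sketch of that two-pass scheme (partitions modeled as a `Seq` of `Seq`s; the object and method names are illustrative, not Spark API):

```scala
object ZipWithIndexSketch {
  // Mimics RDD.zipWithIndex's strategy: first count every partition
  // (the "extra job"), then assign each partition a cumulative offset
  // and index its elements locally.
  def zipWithIndex[T](partitions: Seq[Seq[T]]): Seq[Seq[(T, Long)]] = {
    // pass 1: size of each partition
    val counts = partitions.map(_.size.toLong)
    // running offsets: partition i starts at the sum of counts 0..i-1
    val offsets = counts.scanLeft(0L)(_ + _)
    // pass 2: local index + partition offset = global index
    partitions.zip(offsets).map { case (part, start) =>
      part.zipWithIndex.map { case (x, i) => (x, start + i) }
    }
  }
}
```

The first pass is exactly the extra computation (and eagerness) being objected to above.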
https://github.com/apache/spark/blob/master/network/shuffle/src/main/java/org/apache/spark/network/shuffle/protocol/BlockTransferMessage.java#L70
public byte[] toByteArray() {
  ByteBuf buf = Unpooled.buffer(encodedLength());
  buf.writeByte(type().id);
  encode(buf);
  assert buf.writableBytes()
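A note for readers following along: a Netty `ByteBuf` from `Unpooled.buffer(n)` grows past its initial capacity rather than overflowing, which is presumably why the `writableBytes()` assert exists as a check. One plausible reading of the snippet is an off-by-one: the buffer is sized for the encoded payload, but a one-byte type tag is written first. This is a guess at the issue, not the actual Spark fix; the mismatch can be sketched with a fixed-size `java.nio.ByteBuffer`, which throws instead of growing:

```scala
import java.nio.{BufferOverflowException, ByteBuffer}

object BufferSizeSketch {
  // Hypothetical illustration: size a fixed buffer for the payload alone,
  // then write a one-byte type tag followed by the payload, as the quoted
  // toByteArray() does. With a fixed-size buffer the extra byte overflows;
  // Netty's ByteBuf would instead expand, tripping the writableBytes() assert.
  def payloadFitsAfterTag(capacity: Int, encodedLength: Int): Boolean =
    try {
      val buf = ByteBuffer.allocate(capacity)
      buf.put(1.toByte)                       // type tag written first
      buf.put(new Array[Byte](encodedLength)) // then the encoded payload
      true
    } catch {
      case _: BufferOverflowException => false
    }
}
```

Under this (assumed) reading, `payloadFitsAfterTag(8, 8)` is false while `payloadFitsAfterTag(9, 8)` is true, i.e. the allocation would need one extra byte for the tag.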
Yep, will do. The test does catch it -- it's just not being executed.
I think I have a reasonable start on re-enabling surefire + Java tests
for SPARK-4159.
On Tue, Dec 9, 2014 at 10:30 PM, Aaron Davidson aa...@databricks.com wrote:
Oops, that does look like a bug. Strange that the
forgot to install git on this node. /headdesk
i retriggered the failed spark prb jobs.
On Tue, Dec 9, 2014 at 10:49 AM, shane knapp skn...@berkeley.edu wrote:
So all this time the tests that Jenkins has been running via SBT +
ScalaTest... those haven't been running any of the Java unit tests?
SPARK-4159 https://issues.apache.org/jira/browse/SPARK-4159 only mentions
Maven as a problem, but I'm wondering how these tests got through Jenkins
I'm not so sure about SBT, but I'm looking at the output now and do
not see things like JavaAPISuite being run. I see them compiled; that
part I'm not as sure how to fix. I think I have a solution for Maven on
SPARK-4159.
On Tue, Dec 9, 2014 at 11:30 PM, Nicholas Chammas
nicholas.cham...@gmail.com
OK. That's concerning. Hopefully that's the only bug we'll dig up once we
run all the Java tests but who knows.
Patrick,
Shouldn't this be a release blocking bug for 1.2 (mostly just because it
has already been covered by a unit test)? Well, that, as well as any other
bugs that come up as we run
Hey Nick,
Thanks for bringing this up. I believe these Java tests are running in
the sbt build right now, the issue is that this particular bug was
flagged by the triggering of a runtime Java assert (not a normal
JUnit test assertion) and those are not enabled in our sbt tests. It
would be good
Oops, yes I see Java tests run with SBT now. You're right, it must be
because of the assertion. I can try to add '-ea' to the SBT build as a
closely-related change for SPARK-4159.
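For reference, enabling JVM assertions in an sbt build is a small settings change, assuming test JVMs are forked; a sketch in sbt 0.13-style syntax (the exact placement within Spark's SparkBuild.scala may differ):

```scala
// Fork test JVMs and pass -ea so runtime `assert` statements fire during
// tests, matching surefire's behavior under Maven. javaOptions only takes
// effect in a forked JVM, hence fork := true.
fork in Test := true
javaOptions in Test += "-ea"
```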
FWIW this error is the only one I saw once the Maven tests ran the Java tests.
On Wed, Dec 10, 2014 at 6:07 AM,