Got it, I opened a PR.
…change and see what
the PR builder tests say.

On Tue, Jan 23, 2018 at 4:42 AM Yacine Mazari <y.maz...@gmail.com> wrote:
Hi All,
I am currently working on SPARK-23166
<https://issues.apache.org/jira/browse/SPARK-23166>, but after running
"./dev/run-tests", the Python unit tests (supposedly unrelated to my change)
are failing for the …
Hey all,
Today I tried upgrading the Spark version we use internally by creating a
new internal release from the Spark master branch. The last time I did this
was March 7.
With this updated Spark I am seeing some serialization errors in the unit
tests for our own libraries. It looks like a Scala reflection type that is
not serializable is getting sucked into …
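The failure mode described here (a non-serializable object captured by a closure) can be reproduced outside Spark. Below is a minimal stdlib-only Python sketch, purely illustrative and not Spark's actual serializer; the `TaskContext` class and `task` function are invented for the example:

```python
import pickle
import threading

class TaskContext:
    """Hypothetical helper: holds a lock, which pickle cannot serialize."""
    def __init__(self):
        self.lock = threading.Lock()

ctx = TaskContext()

def task(x):
    # The closure drags `ctx` (and its unpicklable lock) along with it.
    with ctx.lock:
        return x * 2

try:
    pickle.dumps(ctx)  # fails: thread locks are not picklable
except TypeError as e:
    print("serialization failed:", e)
```

The fix in such cases is usually to keep the non-serializable object out of the closure, e.g. by constructing it inside the task rather than capturing it.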
I could resolve this by passing the argument below
./python/run-tests --python-executables=python2.7
Thanks,
Krishna
On Thu, Nov 3, 2016 at 4:16 PM, Krishna Kalyan <krishnakaly...@gmail.com>
wrote:
Hello,
I am trying to run unit tests on pyspark.
When I try to run the unit tests I am faced with errors.
krishna@Krishna:~/Experiment/spark$ ./python/run-tests
Running PySpark tests. Output is in
/Users/krishna/Experiment/spark/python/unit-tests.log
Will test against the following Python executables …
Dear Spark developers,
Are there any best practices or guidelines for machine learning unit tests in
Spark? After taking a brief look at the unit tests in ML and MLlib, I have
found that each algorithm is tested in a different way. There are a few kinds
of tests:
1) Partial check of internal …
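As an illustration of one common style (checking an algorithm's output against hand-computed values on a tiny dataset), here is a Spark-free Python sketch; the function and the data are invented for the example:

```python
import math

def fit_mean_variance(xs):
    """Toy 'algorithm': population mean and variance of a sample."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    return mean, var

# Unit-test style check against hand-computed expectations on tiny data:
# mean of [1, 2, 3, 4] is 2.5; squared deviations sum to 5, so var = 1.25.
mean, var = fit_mean_variance([1.0, 2.0, 3.0, 4.0])
assert math.isclose(mean, 2.5)
assert math.isclose(var, 1.25)
```

The appeal of this style is determinism: no random seeds, no tolerance tuning beyond floating-point epsilon, and a failure points directly at the arithmetic.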
Can you submit a pull request for it? Thanks.
On Tue, Jun 2, 2015 at 4:25 AM, Mick Davies michael.belldav...@gmail.com
wrote:
If I write unit tests that indirectly initialize org.apache.spark.util.Utils,
for example use SQL types, but produce no logging, I get the following
unpleasant stack trace in my test output.
This is caused by the Utils class adding a shutdown hook which logs the
message logDebug(Shutdown hook …
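The shutdown-hook-plus-logging interaction can be mimicked with Python's stdlib (an analogy only; Spark's Utils uses JVM shutdown hooks and log4j, not this). The logger name and message are invented:

```python
import atexit
import io
import logging

log = logging.getLogger("utils.analogy")

def shutdown_hook():
    # Like the Utils hook: emits a debug message at shutdown. If no handler
    # is configured by then, the record is dropped (the log4j analogue is
    # the "no appender" complaint the message above describes).
    log.debug("Shutdown hook called")

atexit.register(shutdown_hook)

# Configuring a handler before shutdown keeps the message from being lost:
buf = io.StringIO()
logging.getLogger().addHandler(logging.StreamHandler(buf))
logging.getLogger().setLevel(logging.DEBUG)
```

The general lesson carries over: tests that trigger library initialization should also configure logging, or the teardown path will complain.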
Thanks, Josh, I missed that PR.
On Mon, Feb 9, 2015 at 7:45 PM, Josh Rosen rosenvi...@gmail.com wrote:
Hi Iulian,
I think the AkakUtilsSuite failure that you observed has been fixed in
https://issues.apache.org/jira/browse/SPARK-5548 /
https://github.com/apache/spark/pull/4343
On February
Hi Iulian,
I think the AkkaUtilsSuite failure that you observed has been fixed in
https://issues.apache.org/jira/browse/SPARK-5548 /
https://github.com/apache/spark/pull/4343
On February 9, 2015 at 5:47:59 AM, Iulian Dragoș (iulian.dra...@typesafe.com)
wrote:
Hi Patrick,
Thanks for the heads up. I was trying to set up our own infrastructure for
testing Spark (essentially, running `run-tests` every night) on EC2. I
stumbled upon a number of flaky tests, but none of them look similar to
anything in Jira with the flaky-test tag. I wonder if there's …
Hey All,
The tests are in a not-amazing state right now due to a few compounding factors:
1. We've merged a large volume of patches recently.
2. The load on jenkins has been relatively high, exposing races and
other behavior not seen at lower load.
For those not familiar, the main issue is …
Ted,
I posted some updates
https://issues.apache.org/jira/browse/SPARK-3431?focusedCommentId=14236540&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-14236540
on JIRA on my progress (or lack thereof) getting SBT to parallelize test
suites properly. I'm currently stuck …
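For context, suite-level parallelism and JVM forking in sbt are controlled by settings along these lines (a sketch only; key names vary across sbt versions, and the limit shown is arbitrary):

```scala
// build.sbt (sketch, sbt 0.13-era syntax)
parallelExecution in Test := true   // run test suites concurrently
fork in Test := true                // run tests in forked JVMs
// Cap how many test tasks run at once across the build:
concurrentRestrictions in Global += Tags.limit(Tags.Test, 4)
```

Note that forking changes the semantics of parallel execution (forked suites are grouped per JVM), which is part of what makes this tricky to tune.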
bq. I may move on to trying Maven.
Maven is my favorite :-)
On Sat, Dec 6, 2014 at 10:54 AM, Nicholas Chammas
nicholas.cham...@gmail.com wrote:
@Patrick and Josh, actually we went even further than that. We simply
disable the UI for most tests, and these used to be the single largest
source of port conflicts.
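A related generic trick for avoiding port conflicts in parallel tests (not necessarily what Spark does; Spark's fix above was to disable the UI) is to bind to port 0 and let the OS assign a free port:

```python
import socket

def free_port():
    """Bind to port 0 so the OS assigns an unused ephemeral port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))
        return s.getsockname()[1]

port = free_port()
print("test server can listen on", port)
```

Each concurrently running test process then gets its own port with no coordination needed.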
FWIW, when we did this work in HBase, we categorized the tests. Then some
tests can share a single JVM, while some others need to be isolated in
their own JVM. Nevertheless, Surefire can still run them in parallel by
starting/stopping several JVMs.
I think we need to do this as well. Perhaps the …
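The HBase-style setup described above maps onto maven-surefire-plugin configuration roughly like this (a sketch; the category name is invented and exact options depend on the plugin version):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- Run only tests tagged e.g. with JUnit @Category(SmallTests.class) -->
    <groups>org.example.SmallTests</groups>
    <!-- Several JVMs in parallel; reuseForks=false isolates each class -->
    <forkCount>4</forkCount>
    <reuseForks>false</reuseForks>
  </configuration>
</plugin>
```

Tests that can share a JVM would use `reuseForks=true`; the isolated category keeps `false`.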
Have you seen this thread http://search-hadoop.com/m/JW1q5xxSAa2 ?
Test categorization in HBase is done through maven-surefire-plugin.
Cheers
On Thu, Dec 4, 2014 at 4:05 PM, Nicholas Chammas nicholas.cham...@gmail.com
wrote:
fwiw, when we did this work in HBase, we categorized the tests. Then …
…” errors when
trying to run Spark unit tests within a CentOS Docker container.
I’m building Spark and running the tests as follows:
# build
sbt/sbt -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -Pkinesis-asl
-Phive -Phive-thriftserver package assembly/assembly
# Scala unit tests
sbt/sbt -Pyarn …
Here’s that log file https://gist.github.com/nchammas/08d3a3a02486cf602ceb
from a different run of the unit tests that also failed. I’m not sure what
to look for.
If it matters any, I also changed JAVA_OPTS as follows for this run:
export JAVA_OPTS="-Xms512m -Xmx1024m -XX:PermSize=64m …"
# Scala unit tests
sbt/sbt -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -Pkinesis-asl
-Phive -Phive-thriftserver catalyst/test sql/test hive/test mllib/test
The build completes successfully. After humming along for many minutes, the
unit tests fail
Hi All,
When I try to run unit tests that make use of local-cluster mode (e.g.
accessing HttpBroadcast variables in a local cluster in
BroadcastSuite.scala), it's failing with the below exception. I'm using
Java version 1.8.0_05 and Scala version 2.10. I tried to look into
the Jenkins build …
On …, 2014 at 9:31 PM, Nicholas Chammas
nicholas.cham...@gmail.com wrote:
Howdy,
Do we think it's both feasible and worthwhile to invest in getting our unit
tests to finish in under 5 minutes (or something similarly brief) when run
by Jenkins?
Unit tests currently seem to take anywhere from 30 min to 2 hours. As
people add more tests, I imagine this time will only …
On Fri, Aug 8, 2014 at 9:14 AM, Sean Owen so...@cloudera.com wrote:
A common approach is to separate unit tests from integration tests.
Maven has support for this distinction. I'm not sure it helps a lot
though, since it only helps you to not run integration tests all the
time. But lots of Spark tests are integration-test-like and are
important to run to know a change works.
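The Maven mechanism alluded to here is conventionally the Surefire/Failsafe split: surefire runs `*Test` classes in the `test` phase, while failsafe runs `*IT` classes during `integration-test`/`verify`. A sketch (plugin version omitted; the naming defaults are summarized from memory):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <executions>
    <execution>
      <goals>
        <!-- integration-test runs the *IT classes; verify fails the
             build afterwards if any of them failed -->
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this split, `mvn test` skips the integration-test-like suites during quick iteration, and `mvn verify` runs everything.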