Re: The Dataset unit test is much slower than the RDD unit test (in Scala)

2022-11-01 Thread Cheng Pan
joins), and we have switched from RDD to Dataset recently. We've found that the unit test takes much longer. We profiled it and have found that it's the planning phase that is slow, not execution. I wonder if anyone has encountered this issue before and

Re: The Dataset unit test is much slower than the RDD unit test (in Scala)

2022-11-01 Thread Enrico Minack
is helps, Enrico. On 25.10.22 at 21:54, Tanin Na Nakorn wrote: Hi All, Our data job is very complex (e.g. 100+ joins), and we have switched from RDD to Dataset recently. We've found that the unit test takes much longer. We profiled it and have found that it's the planning phase that is s

The Dataset unit test is much slower than the RDD unit test (in Scala)

2022-10-25 Thread Tanin Na Nakorn
Hi All, Our data job is very complex (e.g. 100+ joins), and we have switched from RDD to Dataset recently. We've found that the unit test takes much longer. We profiled it and have found that it's the planning phase that is slow, not execution. I wonder if anyone has encountered this issue
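
The replies above are truncated here, so as a general sketch (an assumption, not necessarily the thread's conclusion): with 100+ joins the logical plan keeps growing, and one common mitigation is truncating it with checkpoints so Catalyst re-plans from a materialized result instead of re-analyzing the whole lineage.

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().master("local[2]").appName("plan-test").getOrCreate()
    spark.sparkContext.setCheckpointDir("/tmp/spark-checkpoints")
    import spark.implicits._

    var df = Seq((1, "a")).toDF("id", "v")
    for (i <- 1 to 100) {
      df = df.join(Seq((1, i)).toDF("id", s"c$i"), Seq("id"))
      // Checkpointing every few joins cuts the logical plan, trading some
      // test-time I/O for much cheaper planning afterwards.
      if (i % 10 == 0) df = df.checkpoint()
    }
    df.count()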

Re: ivy unit test case failing for Spark

2021-12-21 Thread Wes Peng
Are you using IvyVPN, which causes this problem? If the VPN software changes the network URL silently, you should avoid using it. Regards. On Wed, Dec 22, 2021 at 1:48 AM Pralabh Kumar wrote: Hi Spark Team, I am building Spark in a VPN, but the unit test case below i

Re: ivy unit test case failing for Spark

2021-12-21 Thread Sean Owen
You would have to make it available? This doesn't seem like a Spark issue. On Tue, Dec 21, 2021, 10:48 AM Pralabh Kumar wrote: Hi Spark Team, I am building Spark in a VPN, but the unit test case below is failing. It points to an Ivy location which cannot be reached with

ivy unit test case failing for Spark

2021-12-21 Thread Pralabh Kumar
Hi Spark Team, I am building Spark in a VPN, but the unit test case below is failing. It points to an Ivy location which cannot be reached within the VPN. Any help would be appreciated. test("SPARK-33084: Add jar support Ivy URI -- default transitive = true") { sc = new SparkC

Re: Need Unit test complete reference for Pyspark

2020-11-19 Thread Sofia’s World
= spark_session.createDataFrame([['one', 'two']]).toDF(*['first', 'second']) assert df.subtract(df2).count() == 0 On Thu, Nov 19, 2020 at 6:38 AM Sachit Murarka wrote: Hi Users, I have to write Unit Test cases for PySpark. I think pytest-spark and "spark testing base" are good test lib

Need Unit test complete reference for Pyspark

2020-11-18 Thread Sachit Murarka
Hi Users, I have to write unit test cases for PySpark. I think pytest-spark and "spark-testing-base" are good test libraries. Can anyone please provide a full reference for writing the test cases in Python using these? Kind Regards, Sachit Murarka

Re: how do i force unit test to do whole stage codegen

2017-04-05 Thread Jacek Laskowski
x of each operation that the whole-stage codegen can be applied to. So, in your test case, whole-stage codegen has already been enabled!! FYI, I think that it is a good topic for d...@spark.apache.org. Kazuaki

Re: how do i force unit test to do whole stage codegen

2017-04-05 Thread Koert Kuipers
FYI. I think that it is a good topic for d...@spark.apache.org. Kazuaki Ishizaki From: Koert Kuipers <ko...@tresata.com> To: "user@spark.apache.org" <user@spark.apache.org> Date: 2017/04

Re: how do i force unit test to do whole stage codegen

2017-04-05 Thread Jacek Laskowski
abled!! FYI. I think that it is a good topic for d...@spark.apache.org. Kazuaki Ishizaki From: Koert Kuipers <ko...@tresata.com> To: "user@spark.apache.org" <user@spark.apache.org> Date: 2017/04/05 05:12

Re: how do i force unit test to do whole stage codegen

2017-04-04 Thread Koert Kuipers
"user@spark.apache.org" <user@spark.apache.org> > Date:2017/04/05 05:12 > Subject:how do i force unit test to do whole stage codegen > -- > > > > i wrote my own expression with eval and doGenCode, but doGenCode

Re: how do i force unit test to do whole stage codegen

2017-04-04 Thread Kazuaki Ishizaki
opic for d...@spark.apache.org. Kazuaki Ishizaki From: Koert Kuipers <ko...@tresata.com> To: "user@spark.apache.org" <user@spark.apache.org> Date: 2017/04/05 05:12 Subject: how do i force unit test to do whole stage codegen i wrote my own expression with eval an

how do i force unit test to do whole stage codegen

2017-04-04 Thread Koert Kuipers
I wrote my own expression with eval and doGenCode, but doGenCode never gets called in tests. Also, as a test, I ran this in a unit test: spark.range(10).select('id as 'asId).where('id === 4).explain According to https://jaceklaskowski.gitbooks.io/mastering-apache-spark/spark-sql-whole-stage
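
For reference, a minimal way to assert in a test that whole-stage codegen actually applies (a sketch assuming Spark 2.x and a local SparkSession; it mirrors Kazuaki Ishizaki's point above that codegen is already on by default) is to look for WholeStageCodegenExec nodes in the executed plan:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.execution.WholeStageCodegenExec

    val spark = SparkSession.builder().master("local[2]").appName("wscg-test").getOrCreate()
    val df = spark.range(10).selectExpr("id AS asId").where("asId = 4")

    // Whole-stage codegen shows up as WholeStageCodegenExec wrappers in the
    // executed plan (spark.sql.codegen.wholeStage is true by default).
    val wscgNodes = df.queryExecution.executedPlan.collect {
      case w: WholeStageCodegenExec => w
    }
    assert(wscgNodes.nonEmpty, "expected whole-stage codegen in the plan")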

Re: How to unit test spark streaming?

2017-03-07 Thread kant kodali
Agreed with the statement in quotes below, whether one wants to do unit tests or not; it is good practice to write code that way. But I think the more painful and tedious task is to mock/emulate all the nodes such as Spark workers/master/HDFS/input source stream and all that. I wish there is

Re: How to unit test spark streaming?

2017-03-07 Thread Michael Armbrust
Basically you abstract your transformations to take in a dataframe and return one, then you assert on the returned df. +1 to this suggestion. This is why we wanted streaming and batch dataframes to share the same API.
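
A minimal sketch of the pattern being endorsed here (the function name and word-count logic are illustrative, not from the thread):

    import org.apache.spark.sql.{DataFrame, SparkSession}
    import org.apache.spark.sql.functions._

    // The transformation under test is just a DataFrame => DataFrame function.
    def wordCount(df: DataFrame): DataFrame =
      df.select(explode(split(col("line"), "\\s+")).as("word"))
        .groupBy("word").count()

    val spark = SparkSession.builder().master("local[2]").appName("df-test").getOrCreate()
    import spark.implicits._

    // In a test, feed it a small batch DataFrame and assert on the result;
    // the same function can later be wired to a streaming DataFrame.
    val result = wordCount(Seq("hello hello world").toDF("line"))
    assert(result.where($"word" === "hello").head.getAs[Long]("count") == 2L)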

Re: How to unit test spark streaming?

2017-03-07 Thread Jörn Franke
ali <kanth...@gmail.com> wrote: Hi All, How to unit test Spark Streaming or Spark in general? How do I test the results of my transformations? Also, more importantly, don't we need to spawn master and worker JVMs either in one or multiple

Re: How to unit test spark streaming?

2017-03-07 Thread Sam Elamin
in a dataframe and return one, then you assert on the returned df. Regards, Sam On Tue, 7 Mar 2017 at 12:05, kant kodali <kanth...@gmail.com> wrote: Hi All, How to unit test Spark Streaming or Spark in general? How do I test the results of my transformations? Also, more importa

How to unit test spark streaming?

2017-03-07 Thread kant kodali
Hi All, How to unit test Spark Streaming or Spark in general? How do I test the results of my transformations? Also, more importantly, don't we need to spawn master and worker JVMs either in one or multiple nodes? Thanks! kant

Error when running multiple unit tests that extend DataFrameSuiteBase

2016-09-23 Thread Jinyuan Zhou
After I created two test cases that extend FlatSpec with DataFrameSuiteBase, I got errors when running sbt test. I was able to run each of them separately. My test cases do use sqlContext to read files. Here is the exception stack. Judging from the exception, I may need to unregister RpcEndpoint after

RE: How this unit test passed on master trunk?

2016-04-24 Thread Yong Zhang
Subject: Re: How this unit test passed on master trunk? From: zzh...@hortonworks.com To: java8...@hotmail.com; gatorsm...@gmail.com CC: user@spark.apache.org Date: Sun, 24 Apr 2016 04:37:11 + There are multiple records for the DF scala> structDF.groupBy($"a").agg(min(st

Re: How this unit test passed on master trunk?

2016-04-23 Thread Zhan Zhang
struct(1, 2). Please check how the Ordering is implemented in InterpretedOrdering. The output itself does not have any ordering. I am not sure why the unit test and the real env behave differently. Xiao, I do see the difference between the unit test and a local cluster run. Do you know the reaso

Re: How this unit test passed on master trunk?

2016-04-22 Thread Ted Yu
"))).first() first: org.apache.spark.sql.Row = [1,[1,1]] BTW https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-hadoop-2.7/715/consoleFull shows this test passing. On Fri, Apr 22, 2016 at 11:23 AM, Yong Zhang <java8...@hotmail.com> wrote: > Hi, > > I was trying to find out why

How this unit test passed on master trunk?

2016-04-22 Thread Yong Zhang
Hi, I was trying to find out why this unit test can pass in Spark code, in https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala for this unit test: test("Star Expansion - CreateStruct and CreateArray") { val structDf = testDa

Re: Unit test with sqlContext

2016-03-19 Thread Vikas Kawadia
bootstrapping for you. https://github.com/holdenk/spark-testing-base DataFrame examples are here: https://github.com/holdenk/spark-testing-base/blob/master/src/test/1.3/scala/com/holdenkarau/spark/testing/SampleD

Re: Unit test with sqlContext

2016-02-05 Thread Steve Annessa
-testing-base/blob/master/src/test/1.3/scala/com/holdenkarau/spark/testing/SampleDataFrameTest.scala Thanks, Silvio From: Steve Annessa <steve.anne...@gmail.com> Date: Thursday, February 4, 2016 at 8:36 PM

Unit test with sqlContext

2016-02-04 Thread Steve Annessa
I'm trying to unit test a function that reads in a JSON file, manipulates the DF and then returns a Scala Map. The function has signature: def ingest(dataLocation: String, sc: SparkContext, sqlContext: SQLContext) I've created a bootstrap spec for spark jobs that instantiates the Spark Context
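
A sketch of such a bootstrap spec (trait and names are illustrative, assuming ScalaTest and the Spark 1.x API used in this thread): one SparkContext/SQLContext per suite, torn down afterwards.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext
    import org.scalatest.{BeforeAndAfterAll, Suite}

    trait SparkSpec extends BeforeAndAfterAll { this: Suite =>
      @transient var sc: SparkContext = _
      @transient var sqlContext: SQLContext = _

      override def beforeAll(): Unit = {
        super.beforeAll()
        sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("test"))
        sqlContext = new SQLContext(sc)
      }

      override def afterAll(): Unit = {
        if (sc != null) sc.stop() // stop the context so the next suite can start one
        super.afterAll()
      }
    }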

Re: Unit test with sqlContext

2016-02-04 Thread Silvio Fiorito
"user@spark.apache.org" <user@spark.apache.org> Subject: Unit test with sqlContext I'm trying to unit test a function that reads in a JSON file, manipulates the DF and then returns a Scala Map. The function has signature: def ingest(dataLocation: String, sc: SparkContex

Re: Unit test with sqlContext

2016-02-04 Thread Rishi Mishra
ting-base/blob/master/src/test/1.3/scala/com/holdenkarau/spark/testing/SampleDataFrameTest.scala Thanks, Silvio From: Steve Annessa <steve.anne...@gmail.com> Date: Thursday, February 4, 2016 at 8:36 PM To: "user@spark.apac

Re: Unit test with sqlContext

2016-02-04 Thread Holden Karau
e/blob/master/src/test/1.3/scala/com/holdenkarau/spark/testing/SampleDataFrameTest.scala Thanks, Silvio From: Steve Annessa <steve.anne...@gmail.com> Date: Thursday, February 4, 2016 at 8:36 PM To: "user@spark.apache.org" <user@spark.apache.org

Re: how to run unit test for specific component only

2015-11-13 Thread Steve Loughran
try: mvn test -pl sql -DwildcardSuites=org.apache.spark.sql -Dtest=none On 12 Nov 2015, at 03:13, weoccc <weo...@gmail.com> wrote: Hi, I am wondering how to run unit test for specific spark component only. mvn test -DwildcardSuites="org.apache.sp

Re: how to run unit test for specific component only

2015-11-11 Thread Ted Yu
Have you tried the following? build/sbt "sql/test-only *" Cheers On Wed, Nov 11, 2015 at 7:13 PM, weoccc <weo...@gmail.com> wrote: Hi, I am wondering how to run unit test for specific spark component only. mvn test -DwildcardSuites="org.apache.sp

how to run unit test for specific component only

2015-11-11 Thread weoccc
Hi, I am wondering how to run the unit tests for a specific Spark component only. mvn test -DwildcardSuites="org.apache.spark.sql.*" -Dtest=none The above command doesn't seem to work. I'm using Spark 1.5. Thanks, Weide

Re: How to unit test HiveContext without OutOfMemoryError (using sbt)

2015-08-26 Thread Mike Trienis
Thanks for your response Yana, I can increase the MaxPermSize parameter and it will allow me to run the unit test a few more times before I run out of memory. However, the primary issue is that running the same unit test in the same JVM (multiple times) results in increased memory (each run

Re: How to unit test HiveContext without OutOfMemoryError (using sbt)

2015-08-26 Thread Michael Armbrust
I'd suggest setting sbt to fork when running tests. On Wed, Aug 26, 2015 at 10:51 AM, Mike Trienis mike.trie...@orcsol.com wrote: Thanks for your response Yana, I can increase the MaxPermSize parameter and it will allow me to run the unit test a few more times before I run out of memory
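
In build.sbt that advice looks roughly like this (a sketch, sbt 0.13-era syntax; the heap/PermGen sizes are placeholders, not from the thread):

    // Fork a fresh JVM for the tests so PermGen consumed by HiveContext
    // doesn't accumulate inside sbt's own JVM across runs.
    fork in Test := true
    javaOptions in Test ++= Seq("-Xmx2g", "-XX:MaxPermSize=512m")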

How to unit test HiveContext without OutOfMemoryError (using sbt)

2015-08-25 Thread Mike Trienis
Hello, I am using sbt and created a unit test where I create a `HiveContext` and execute some query and then return. Each time I run the unit test the JVM will increase its memory usage until I get the error: Internal error when running tests: java.lang.OutOfMemoryError: PermGen space Exception

Re: How to unit test HiveContext without OutOfMemoryError (using sbt)

2015-08-25 Thread Yana Kadiyska
test where I create a `HiveContext` and execute some query and then return. Each time I run the unit test the JVM will increase its memory usage until I get the error: Internal error when running tests: java.lang.OutOfMemoryError: PermGen space Exception in thread Thread-2

Re: [Unit Test Failure] Test org.apache.spark.streaming.JavaAPISuite.testCount failed

2015-05-20 Thread Tathagata Das
Do you get this failure repeatedly? On Thu, May 14, 2015 at 12:55 AM, kf wangf...@huawei.com wrote: Hi all, I got the following error when I ran the unit tests of Spark via dev/run-tests on the latest branch-1.4 branch. The latest commit id: commit d518c0369fa412567855980c3f0f426cde5c190d

Re: [Unit Test Failure] Test org.apache.spark.streaming.JavaAPISuite.testCount failed

2015-05-14 Thread Wangfei (X)
error when i run unit test of spark by dev/run-tests on the latest branch-1.4 branch. the latest commit id: commit d518c0369fa412567855980c3f0f426cde5c190d Author: zsxwing zsxw...@gmail.com Date: Wed May 13 17:58:29 2015 -0700 error [info] Test

[Unit Test Failure] Test org.apache.spark.streaming.JavaAPISuite.testCount failed

2015-05-14 Thread kf
Hi all, I got the following error when I ran the unit tests of Spark via dev/run-tests on the latest branch-1.4 branch. The latest commit id: commit d518c0369fa412567855980c3f0f426cde5c190d Author: zsxwing zsxw...@gmail.com Date: Wed May 13 17:58:29 2015 -0700 error [info] Test

Re: [Unit Test Failure] Test org.apache.spark.streaming.JavaAPISuite.testCount failed

2015-05-14 Thread Tathagata Das
Do you get this failure repeatedly? On Thu, May 14, 2015 at 12:55 AM, kf wangf...@huawei.com wrote: Hi all, I got the following error when I ran the unit tests of Spark via dev/run-tests on the latest branch-1.4 branch. The latest commit id: commit d518c0369fa412567855980c3f0f426cde5c190d Author

Re: Spark unit test fails

2015-05-07 Thread NoWisdom
I'm also getting the same error. Any ideas? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-unit-test-fails-tp22368p22798.html Sent from the Apache Spark User List mailing list archive at Nabble.com

Re: Cannot run unit test.

2015-04-08 Thread Mike Trienis
It's because your tests are running in parallel and you can only have one context running at a time. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-run-unit-test-tp14459p22429.html Sent from the Apache Spark User List mailing list archive
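
With sbt this is typically addressed by disabling parallel suite execution (a sketch, sbt 0.13-era syntax):

    // Run test suites serially so at most one SparkContext exists at a time.
    parallelExecution in Test := false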

Re: Spark unit test fails

2015-04-06 Thread Manas Kar

Spark unit test fails

2015-04-03 Thread Manas Kar
Hi experts, I am trying to write unit tests for my Spark application, which fail with a javax.servlet.FilterRegistration error. I am using CDH5.3.2 Spark and below is my dependency list: val spark = 1.2.0-cdh5.3.2 val esriGeometryAPI = 1.2 val csvWriter = 1.0.0
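
For what it's worth, this particular error usually means two dependencies ship conflicting servlet-api classes. A common workaround (an assumption based on the error, not confirmed by this thread; the artifact names are illustrative) is to exclude one copy in build.sbt:

    // Exclude the duplicate servlet API pulled in transitively; inspect your
    // own dependency tree to find which artifact actually carries it.
    libraryDependencies += ("org.apache.hadoop" % "hadoop-client" % "2.5.0-cdh5.3.2")
      .exclude("javax.servlet", "servlet-api")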

TestSuiteBase based unit test using a sliding window join timesout

2015-01-07 Thread Enno Shioji
Hi, I extended org.apache.spark.streaming.TestSuiteBase for some testing, and I was able to run this test fine: test("Sliding window join with 3 second window duration") { val input1 = Seq( Seq("req1"), Seq("req2", "req3"), Seq(), Seq("req4", "req5", "req6"), Seq("req7"),

Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in unit test but not when submitted to Spark?

2014-12-24 Thread Emre Sevinc
().accept(MediaType.APPLICATION_JSON_TYPE).get(String.class); logger.warn("!!! DEBUG !!! Spotlight response: {}", response); When run inside a unit test as follows: mvn clean test -Dtest=SpotlightTest#testCountWords it contacts the RESTful web service and retrieves some data as expected

Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in unit test but not when submitted to Spark?

2014-12-24 Thread Sean Owen
(String.class); logger.warn("!!! DEBUG !!! Spotlight response: {}", response); When run inside a unit test as follows: mvn clean test -Dtest=SpotlightTest#testCountWords it contacts the RESTful web service and retrieves some data as expected. But when the same code is run as part

Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in unit test but not when submitted to Spark?

2014-12-24 Thread Emre Sevinc
On Wed, Dec 24, 2014 at 1:46 PM, Sean Owen so...@cloudera.com wrote: I'd take a look with 'mvn dependency:tree' on your own code first. Maybe you are including JavaEE 6 for example? For reference, my complete pom.xml looks like: <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi=

Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in unit test but not when submitted to Spark?

2014-12-24 Thread Emre Sevinc
: {}, response); It seems to work when I use spark-submit to submit the application that includes this code. Funny thing is, now my relevant unit test does not run, complaining about not having enough memory: Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0xc490

Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in unit test but not when submitted to Spark?

2014-12-24 Thread Sean Owen
thing is, now my relevant unit test does not run, complaining about not having enough memory: Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0xc490, 25165824, 0) failed; error='Cannot allocate memory' (errno=12) # # There is insufficient memory for the Java

Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in unit test but not when submitted to Spark?

2014-12-24 Thread Emre Sevinc
() .get(String.class); logger.warn("!!! DEBUG !!! Spotlight response: {}", response); It seems to work when I use spark-submit to submit the application that includes this code. Funny thing is, now my relevant unit test does not run, complaining about not having enough

How can I make Spark Streaming count the words in a file in a unit test?

2014-12-08 Thread Emre Sevinc
to my local Spark, it waits for a file to be written to a given directory, and when I create that file it successfully prints the number of words. I terminate the application by pressing Ctrl+C. Now I've tried to create a very basic unit test for this functionality, but in the test I was not able
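
One way to make such a test deterministic (a sketch under assumptions, not necessarily what Burak's truncated reply below suggests) is to feed the words through queueStream instead of watching a directory:

    import scala.collection.mutable
    import scala.collection.JavaConverters._
    import org.apache.spark.SparkConf
    import org.apache.spark.rdd.RDD
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.StreamingContext._ // pair-DStream ops on older Spark

    val conf = new SparkConf().setMaster("local[2]").setAppName("wordcount-test")
    val ssc = new StreamingContext(conf, Seconds(1))

    // queueStream provides deterministic in-memory input, so the test needs
    // neither a watched directory nor Ctrl+C.
    val input = mutable.Queue[RDD[String]](ssc.sparkContext.parallelize(Seq("a b a")))
    val counts = ssc.queueStream(input)
      .flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)

    // Collect results on the driver; a concurrent queue because foreachRDD
    // callbacks run on the streaming scheduler thread.
    val collected = new java.util.concurrent.ConcurrentLinkedQueue[(String, Int)]()
    counts.foreachRDD { rdd => rdd.collect().foreach(collected.add) }

    ssc.start()
    Thread.sleep(3000) // crude; real tests should poll with a timeout
    ssc.stop(stopSparkContext = true)
    assert(collected.asScala.toMap == Map("a" -> 2, "b" -> 1))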

Re: How can I make Spark Streaming count the words in a file in a unit test?

2014-12-08 Thread Burak Yavuz
there. Best, Burak - Original Message - From: Emre Sevinc emre.sev...@gmail.com To: user@spark.apache.org Sent: Monday, December 8, 2014 2:36:41 AM Subject: How can I make Spark Streaming count the words in a file in a unit test? Hello, I've successfully built a very simple Spark Streaming

Re: Cannot run unit test.

2014-09-17 Thread Jies

Re: Unit Test for Spark Streaming

2014-08-08 Thread JiajiaJing

Re: Unit Test for Spark Streaming

2014-08-06 Thread JiajiaJing
be used to run this test?

Re: Unit Test for Spark Streaming

2014-08-06 Thread Tathagata Das
Does it not show the name of the test suite on stdout, showing that it has passed? Can you try writing a small unit test, in the same way as your Kafka unit test, with print statements on stdout ... to see whether it works? I believe it is some configuration issue in Maven, which is hard

Re: Unit Test for Spark Streaming

2014-08-05 Thread Tathagata Das
when trying to run the KafkaStreamSuite.scala unit test. I added scalatest-maven-plugin to my pom.xml, then ran mvn test, and got the following error message: error: object Utils in package util cannot be accessed in package org.apache.spark.util [INFO

Unit Test for Spark Streaming

2014-08-04 Thread JiajiaJing
Hello Spark Users, I have a Spark Streaming program that streams data from Kafka topics and outputs it as Parquet files on HDFS. Now I want to write a unit test for this program to make sure the output data is correct (i.e., not missing any data from Kafka). However, I have no idea about how to do

Re: Unit Test for Spark Streaming

2014-08-04 Thread Tathagata Das
Appropriately timed question! Here is the PR that adds a real unit test for Kafka stream in Spark Streaming. Maybe this will help! https://github.com/apache/spark/pull/1751/files On Mon, Aug 4, 2014 at 6:30 PM, JiajiaJing jj.jing0...@gmail.com wrote: Hello Spark Users, I have a spark

Re: Unit Test for Spark Streaming

2014-08-04 Thread JiajiaJing
This helps a lot!! Thank you very much! Jiajia

Re: Run spark unit test on Windows 7

2014-07-03 Thread Konstantin Kudryavtsev
(Windows 7) under unit test, I got errors: java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries. at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318) at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:333

Re: Run spark unit test on Windows 7

2014-07-03 Thread Denny Lee
, I'm trying to run some transformation on Spark, it works fine on cluster (YARN, linux machines). However, when I'm trying to run it on local machine (Windows 7) under unit test, I got errors: java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries

Re: Run spark unit test on Windows 7

2014-07-03 Thread Kostiantyn Kudriavtsev
to run some transformation on Spark, it works fine on cluster (YARN, linux machines). However, when I'm trying to run it on local machine (Windows 7) under unit test, I got errors: java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries

Re: Run spark unit test on Windows 7

2014-07-03 Thread Denny Lee
machine (Windows 7) under unit test, I got errors: java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries. at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318) at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:333

Run spark unit test on Windows 7

2014-07-02 Thread Konstantin Kudryavtsev
Hi all, I'm trying to run some transformations on Spark; they work fine on a cluster (YARN, Linux machines). However, when I try to run them on a local machine (Windows 7) under a unit test, I get errors: java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop
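
The usual workaround (a sketch; "C:\hadoop" is a placeholder path) is to supply winutils.exe and point hadoop.home.dir at its parent directory before the SparkContext is created:

    // Expects C:\hadoop\bin\winutils.exe to exist; set this property (or the
    // HADOOP_HOME environment variable) before instantiating SparkContext.
    System.setProperty("hadoop.home.dir", "C:\\hadoop")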

Re: Run spark unit test on Windows 7

2014-07-02 Thread Andrew Or
on Spark, it works fine on cluster (YARN, linux machines). However, when I'm trying to run it on local machine (Windows 7) under unit test, I got errors: java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries

Re: Run spark unit test on Windows 7

2014-07-02 Thread Konstantin Kudryavtsev
GMT-07:00 Konstantin Kudryavtsev kudryavtsev.konstan...@gmail.com: Hi all, I'm trying to run some transformation on Spark, it works fine on cluster (YARN, linux machines). However, when I'm trying to run it on local machine (Windows 7) under unit test, I got errors: java.io.IOException

Re: Run spark unit test on Windows 7

2014-07-02 Thread Denny Lee
:00 Konstantin Kudryavtsev kudryavtsev.konstan...@gmail.com: Hi all, I'm trying to run some transformation on Spark, it works fine on cluster (YARN, linux machines). However, when I'm trying to run it on local machine (Windows 7) under unit test, I got errors: java.io.IOException

Re: Run spark unit test on Windows 7

2014-07-02 Thread Kostiantyn Kudriavtsev
on cluster (YARN, linux machines). However, when I'm trying to run it on local machine (Windows 7) under unit test, I got errors: java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries. at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318

Re: Run spark unit test on Windows 7

2014-07-02 Thread Denny Lee
(Windows 7) under unit test, I got errors: java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries. at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318) at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:333

Re: Unit test failure: Address already in use

2014-06-18 Thread Anselme Vignon

RE: Unit test failure: Address already in use

2014-06-18 Thread Lisonbee, Todd
, Todd From: Anselme Vignon [mailto:anselme.vig...@flaminem.com] Sent: Wednesday, June 18, 2014 12:33 AM To: user@spark.apache.org Subject: Re: Unit test failure: Address already in use Hi, Could your problem come from the fact that you run your tests in parallel? If you are running Spark in local mode

Re: Unit test failure: Address already in use

2014-06-18 Thread Philip Ogren
@spark.apache.org Subject: Re: Unit test failure: Address already in use Hi, Could your problem come from the fact that you run your tests in parallel? If you are running Spark in local mode, you cannot have concurrent Spark instances running. This means that your tests instantiating a SparkContext cannot be run

Unit test failure: Address already in use

2014-06-17 Thread SK
) at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:77) thanks

printing in unit test

2014-06-13 Thread SK
Hi, My unit test is failing (the output does not match the expected output). I would like to print out the value of the output, but rdd.foreach(r => println(r)) does not work from the unit test. How can I print or write out the output to a file/screen? thanks.
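
The likely cause: rdd.foreach(println) executes on the executors, so the output never reaches the test's stdout. Bringing (small) results back to the driver first is the usual fix; a sketch using the rdd from the question:

    rdd.collect().foreach(println)   // everything; fine for test-sized data
    rdd.take(20).foreach(println)    // or just a sample
    // or persist it to inspect later:
    rdd.saveAsTextFile("target/test-output")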

unit test

2014-06-06 Thread b0c1
) - Elasticsearch = Spark (map/reduce) - HBase 2. Can Spark read data from Elasticsearch? What is the preferred way to do this? b0c1