Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Ye Xianjin
@Sean Owen, Thanks for your advice.
 There are still some failing tests on my laptop. I will work on this
issue (the file move) as soon as I figure out the other test-related issues.


-- 
Ye Xianjin
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)


On Tuesday, April 15, 2014 at 2:41 PM, Sean Owen wrote:

> Good call -- indeed that same Files class has a move() method that
> will try to use renameTo() and then fall back to copy() and delete()
> if needed for this very reason.
> 
> 
> On Tue, Apr 15, 2014 at 6:34 AM, Ye Xianjin <advance...@gmail.com> wrote:
> > Hi, I think I have found the cause of the tests failing.
> > 
> > I have two disks on my laptop. The Spark project dir is on an HDD, while
> > the temp dir created by google.io.Files.createTempDir is under
> > /var/folders/5q/, which is on the system disk, an SSD.
> > The ExecutorClassLoaderSuite test uses the
> > org.apache.spark.TestUtils.createCompiledClass method.
> > The createCompiledClass method first generates the compiled class in the
> > pwd (spark/repl), then uses renameTo to move the file. The renameTo call
> > fails because the destination file is on a different filesystem than the
> > source file.
> > 
> > I modified TestUtils.scala to first copy the file to the destination and
> > then delete the original. The tests now run smoothly.
> > Should I file a JIRA issue for this problem? Then I can send a PR on GitHub.
> > 
> 
> 
> 




Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Sean Owen
Good call -- indeed that same Files class has a move() method that
will try to use renameTo() and then fall back to copy() and delete()
if needed for this very reason.
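The fallback Sean describes can be sketched with plain java.nio (an illustrative stand-in, not Guava's actual Files.move() implementation): attempt the cheap rename first, and fall back to copy-plus-delete when the rename cannot cross a filesystem boundary.

```java
import java.io.IOException;
import java.nio.file.AtomicMoveNotSupportedException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class MoveWithFallback {
    // Try an atomic rename first; it fails with
    // AtomicMoveNotSupportedException when source and destination live on
    // different filesystems, in which case we copy the bytes and delete
    // the original -- the same strategy described in the thread.
    static void move(Path source, Path dest) throws IOException {
        try {
            Files.move(source, dest, StandardCopyOption.ATOMIC_MOVE);
        } catch (AtomicMoveNotSupportedException e) {
            Files.copy(source, dest, StandardCopyOption.REPLACE_EXISTING);
            Files.delete(source);
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("move-demo");
        Path src = dir.resolve("a.txt");
        Files.write(src, "hello".getBytes());
        move(src, dir.resolve("b.txt"));
        // Same directory here, so the rename path is taken; the fallback
        // only triggers for cross-filesystem moves (e.g. HDD -> SSD temp dir).
        System.out.println(Files.exists(dir.resolve("b.txt")) && !Files.exists(src));
    }
}
```

Running this prints `true`; the catch branch is exercised only when the two paths are on different filesystems.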


On Tue, Apr 15, 2014 at 6:34 AM, Ye Xianjin <advance...@gmail.com> wrote:
> Hi, I think I have found the cause of the tests failing.
>
> I have two disks on my laptop. The Spark project dir is on an HDD, while
> the temp dir created by google.io.Files.createTempDir is under
> /var/folders/5q/, which is on the system disk, an SSD.
> The ExecutorClassLoaderSuite test uses the
> org.apache.spark.TestUtils.createCompiledClass method.
> The createCompiledClass method first generates the compiled class in the
> pwd (spark/repl), then uses renameTo to move the file. The renameTo call
> fails because the destination file is on a different filesystem than the
> source file.
>
> I modified TestUtils.scala to first copy the file to the destination and
> then delete the original. The tests now run smoothly.
> Should I file a JIRA issue for this problem? Then I can send a PR on GitHub.


Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Aaron Davidson
By all means, it would be greatly appreciated!


On Mon, Apr 14, 2014 at 10:34 PM, Ye Xianjin <advance...@gmail.com> wrote:

> Hi, I think I have found the cause of the tests failing.
>
> I have two disks on my laptop. The Spark project dir is on an HDD, while
> the temp dir created by google.io.Files.createTempDir is under
> /var/folders/5q/, which is on the system disk, an SSD.
> The ExecutorClassLoaderSuite test uses the
> org.apache.spark.TestUtils.createCompiledClass method.
> The createCompiledClass method first generates the compiled class in the
> pwd (spark/repl), then uses renameTo to move the file. The renameTo call
> fails because the destination file is on a different filesystem than the
> source file.
>
> I modified TestUtils.scala to first copy the file to the destination and
> then delete the original. The tests now run smoothly.
> Should I file a JIRA issue for this problem? Then I can send a PR on GitHub.
>
> --
> Ye Xianjin
> Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
>
>
> On Tuesday, April 15, 2014 at 3:43 AM, Ye Xianjin wrote:
>
> > Well, this is very strange.
> > I looked into ExecutorClassLoaderSuite.scala and ReplSuite.scala and made
> > small changes to ExecutorClassLoaderSuite.scala (mostly outputting some
> > internal variables). After that, when running the repl test, I noticed the
> > ReplSuite was tested first and its result was ok. But the
> > ExecutorClassLoaderSuite test was weird.
> > Here is the output:
> > [info] ExecutorClassLoaderSuite:
> > [error] Uncaught exception when running
> org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError:
> PermGen space
> > [error] Uncaught exception when running
> org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError:
> PermGen space
> > Internal error when running tests: java.lang.OutOfMemoryError: PermGen
> space
> > Exception in thread "Thread-3" java.io.EOFException
> > at
> java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2577)
> > at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1297)
> > at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1685)
> > at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1323)
> > at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
> > at sbt.React.react(ForkTests.scala:116)
> > at
> sbt.ForkTests$$anonfun$mainTestTask$1$Acceptor$2$.run(ForkTests.scala:75)
> > at java.lang.Thread.run(Thread.java:695)
> >
> >
> > I reverted my changes. The test result is the same.
> >
> > After I touched the ReplSuite.scala file (using the touch command), the
> > test order was reversed, same as at the very beginning. And the output is
> > also the same (the result in my first post).
> >
> >
> > --
> > Ye Xianjin
> > Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> >
> >
> > On Tuesday, April 15, 2014 at 3:14 AM, Aaron Davidson wrote:
> >
> > > This may have something to do with running the tests on a Mac, as
> there is
> > > a lot of File/URI/URL stuff going on in that test which may just have
> > > happened to work if run on a Linux system (like Jenkins). Note that
> this
> > > suite was added relatively recently:
> > > https://github.com/apache/spark/pull/217
> > >
> > >
> > > On Mon, Apr 14, 2014 at 12:04 PM, Ye Xianjin <advance...@gmail.com> wrote:
> > >
> > > > Thank you for your reply.
> > > >
> > > > After building the assembly jar, the repl test still failed. The error
> > > > output is the same as I posted before.
> > > >
> > > > --
> > > > Ye Xianjin
> > > > Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> > > >
> > > >
> > > > On Tuesday, April 15, 2014 at 1:39 AM, Michael Armbrust wrote:
> > > >
> > > > > I believe you may need an assembly jar to run the ReplSuite.
> "sbt/sbt
> > > > > assembly/assembly".
> > > > >
> > > > > Michael
> > > > >
> > > > >
> > > > > On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <advance...@gmail.com> wrote:
> > > > >
> > > > > > Hi, everyone:
> > > > > > I am new to Spark development. I downloaded Spark's latest code
> > > > > > from github.
> > > > > > After running sbt/sbt assembly,
> > > > > > I began running sbt/sbt test in the spark source code dir. But it
> > > > > > failed running the repl module test.
> > > > > >
> > > > > > Here are some output details.
> > > > > >
> > > > > > command:
> > > > > > sbt/sbt "test-only org.apache.spark.repl.*"
> > > > > > output:
> > > > > >
> > > > > > [info] Loading project definition from
> > > > > > /Volumes/MacintoshHD/github/spark/project/project
> > > > > > [info] Loading project definition from
> > > > > > /Volumes/MacintoshHD/github/spark/project
> > > > > > [info] Set current project to root (in build
> > > > > > file:/Volumes/MacintoshHD/github/spark/)
> > > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > > [info] No tests to run for graphx/test:testOnly
> > > > > > [info] Passed: Tota

Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Ye Xianjin
Hi, I think I have found the cause of the tests failing. 

I have two disks on my laptop. The Spark project dir is on an HDD, while
the temp dir created by google.io.Files.createTempDir is under
/var/folders/5q/, which is on the system disk, an SSD.
The ExecutorClassLoaderSuite test uses the
org.apache.spark.TestUtils.createCompiledClass method.
The createCompiledClass method first generates the compiled class in the
pwd (spark/repl), then uses renameTo to move the file. The renameTo call
fails because the destination file is on a different filesystem than the
source file.

I modified TestUtils.scala to first copy the file to the destination and
then delete the original. The tests now run smoothly.
Should I file a JIRA issue for this problem? Then I can send a PR on GitHub.
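The fix described above can be sketched like this (a hypothetical illustration of the approach, not the actual TestUtils.scala patch): check renameTo's boolean return value, and when it fails -- as it does for a move across filesystems -- copy the file to the destination and delete the original.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;

public class CopyThenDelete {
    // java.io.File.renameTo returns false (rather than throwing) when the
    // destination is on a different filesystem, so fall back to copy + delete.
    static void moveFile(File source, File dest) throws IOException {
        if (!source.renameTo(dest)) {
            Files.copy(source.toPath(), dest.toPath(),
                       StandardCopyOption.REPLACE_EXISTING);
            if (!source.delete()) {
                throw new IOException("could not delete " + source);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        File dir = Files.createTempDirectory("rename-demo").toFile();
        File src = new File(dir, "FakeClass.class");
        Files.write(src.toPath(), "bytecode".getBytes());
        File dst = new File(dir, "moved.class");
        moveFile(src, dst);
        System.out.println(dst.exists() && !src.exists());
    }
}
```

Here both files sit in the same temp directory, so renameTo succeeds and the program prints `true`; the copy branch is what rescues the HDD-to-SSD case from the report.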

-- 
Ye Xianjin
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)


On Tuesday, April 15, 2014 at 3:43 AM, Ye Xianjin wrote:

> Well, this is very strange.
> I looked into ExecutorClassLoaderSuite.scala and ReplSuite.scala and made
> small changes to ExecutorClassLoaderSuite.scala (mostly outputting some
> internal variables). After that, when running the repl test, I noticed the
> ReplSuite was tested first and its result was ok. But the
> ExecutorClassLoaderSuite test was weird.
> Here is the output:
> [info] ExecutorClassLoaderSuite:
> [error] Uncaught exception when running 
> org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError: 
> PermGen space
> [error] Uncaught exception when running 
> org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError: 
> PermGen space
> Internal error when running tests: java.lang.OutOfMemoryError: PermGen space
> Exception in thread "Thread-3" java.io.EOFException
> at 
> java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2577)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1297)
> at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1685)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1323)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
> at sbt.React.react(ForkTests.scala:116)
> at sbt.ForkTests$$anonfun$mainTestTask$1$Acceptor$2$.run(ForkTests.scala:75)
> at java.lang.Thread.run(Thread.java:695)
> 
> 
> I reverted my changes. The test result is the same.
> 
> After I touched the ReplSuite.scala file (using the touch command), the test
> order was reversed, same as at the very beginning. And the output is also
> the same (the result in my first post).
> 
> 
> -- 
> Ye Xianjin
> Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> 
> 
> On Tuesday, April 15, 2014 at 3:14 AM, Aaron Davidson wrote:
> 
> > This may have something to do with running the tests on a Mac, as there is
> > a lot of File/URI/URL stuff going on in that test which may just have
> > happened to work if run on a Linux system (like Jenkins). Note that this
> > suite was added relatively recently:
> > https://github.com/apache/spark/pull/217
> > 
> > 
> > On Mon, Apr 14, 2014 at 12:04 PM, Ye Xianjin <advance...@gmail.com> wrote:
> > 
> > > Thank you for your reply.
> > > 
> > > After building the assembly jar, the repl test still failed. The error
> > > output is the same as I posted before.
> > > 
> > > --
> > > Ye Xianjin
> > > Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> > > 
> > > 
> > > On Tuesday, April 15, 2014 at 1:39 AM, Michael Armbrust wrote:
> > > 
> > > > I believe you may need an assembly jar to run the ReplSuite. "sbt/sbt
> > > > assembly/assembly".
> > > > 
> > > > Michael
> > > > 
> > > > 
> > > > On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <advance...@gmail.com> wrote:
> > > > 
> > > > > Hi, everyone:
> > > > > I am new to Spark development. I downloaded Spark's latest code
> > > > > from github.
> > > > > After running sbt/sbt assembly,
> > > > > I began running sbt/sbt test in the spark source code dir. But it
> > > > > failed running the repl module test.
> > > > > 
> > > > > Here are some output details.
> > > > > 
> > > > > command:
> > > > > sbt/sbt "test-only org.apache.spark.repl.*"
> > > > > output:
> > > > > 
> > > > > [info] Loading project definition from
> > > > > /Volumes/MacintoshHD/github/spark/project/project
> > > > > [info] Loading project definition from
> > > > > /Volumes/MacintoshHD/github/spark/project
> > > > > [info] Set current project to root (in build
> > > > > file:/Volumes/MacintoshHD/github/spark/)
> > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > [info] No tests to run for graphx/test:testOnly
> > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > [info] No tests to run for bagel/test:testOnly
> > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > [info] No tests to run for streaming/test:testOnly
> > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 

Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Ye Xianjin
Well, this is very strange.
I looked into ExecutorClassLoaderSuite.scala and ReplSuite.scala and made small
changes to ExecutorClassLoaderSuite.scala (mostly outputting some internal
variables). After that, when running the repl test, I noticed the ReplSuite
was tested first and its result was ok. But the ExecutorClassLoaderSuite
test was weird.
Here is the output:
[info] ExecutorClassLoaderSuite:
[error] Uncaught exception when running 
org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError: 
PermGen space
[error] Uncaught exception when running 
org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError: 
PermGen space
Internal error when running tests: java.lang.OutOfMemoryError: PermGen space
Exception in thread "Thread-3" java.io.EOFException
at 
java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2577)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1297)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1685)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1323)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
at sbt.React.react(ForkTests.scala:116)
at sbt.ForkTests$$anonfun$mainTestTask$1$Acceptor$2$.run(ForkTests.scala:75)
at java.lang.Thread.run(Thread.java:695)


I reverted my changes. The test result is the same.

After I touched the ReplSuite.scala file (using the touch command), the test
order was reversed, same as at the very beginning. And the output is also
the same (the result in my first post).


-- 
Ye Xianjin
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)


On Tuesday, April 15, 2014 at 3:14 AM, Aaron Davidson wrote:

> This may have something to do with running the tests on a Mac, as there is
> a lot of File/URI/URL stuff going on in that test which may just have
> happened to work if run on a Linux system (like Jenkins). Note that this
> suite was added relatively recently:
> https://github.com/apache/spark/pull/217
> 
> 
> On Mon, Apr 14, 2014 at 12:04 PM, Ye Xianjin <advance...@gmail.com> wrote:
> 
> > Thank you for your reply.
> > 
> > After building the assembly jar, the repl test still failed. The error
> > output is the same as I posted before.
> > 
> > --
> > Ye Xianjin
> > Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> > 
> > 
> > On Tuesday, April 15, 2014 at 1:39 AM, Michael Armbrust wrote:
> > 
> > > I believe you may need an assembly jar to run the ReplSuite. "sbt/sbt
> > > assembly/assembly".
> > > 
> > > Michael
> > > 
> > > 
> > > On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <advance...@gmail.com> wrote:
> > > 
> > > > Hi, everyone:
> > > > I am new to Spark development. I downloaded Spark's latest code
> > > > from github.
> > > > After running sbt/sbt assembly,
> > > > I began running sbt/sbt test in the spark source code dir. But it
> > > > failed running the repl module test.
> > > > 
> > > > Here are some output details.
> > > > 
> > > > command:
> > > > sbt/sbt "test-only org.apache.spark.repl.*"
> > > > output:
> > > > 
> > > > [info] Loading project definition from
> > > > /Volumes/MacintoshHD/github/spark/project/project
> > > > [info] Loading project definition from
> > > > /Volumes/MacintoshHD/github/spark/project
> > > > [info] Set current project to root (in build
> > > > file:/Volumes/MacintoshHD/github/spark/)
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for graphx/test:testOnly
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for bagel/test:testOnly
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for streaming/test:testOnly
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for mllib/test:testOnly
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for catalyst/test:testOnly
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for core/test:testOnly
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for assembly/test:testOnly
> > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > [info] No tests to run for sql/test:testOnly
> > > > [info] ExecutorClassLoaderSuite:
> > > > 2014-04-14 16:59:31.247 java[8393:1003] Unable to load realm info from
> > > > SCDynamicStore
> > > > [info] - child first *** FAILED *** (440 milliseconds)
> > > > [info] java.lang.ClassNotFoundException: ReplFakeClass2
> > > > [info] at java.lang.ClassLoader.findClass(ClassLoader.java:364)
> > > > [info] at
> > > > 
> > > 
> > 
> > org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
> > > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > > [info] at java.lang.ClassLoader.loadClass(Clas

Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Aaron Davidson
This may have something to do with running the tests on a Mac, as there is
a lot of File/URI/URL stuff going on in that test which may just have
happened to work if run on a Linux system (like Jenkins). Note that this
suite was added relatively recently:
https://github.com/apache/spark/pull/217


On Mon, Apr 14, 2014 at 12:04 PM, Ye Xianjin <advance...@gmail.com> wrote:

> Thank you for your reply.
>
> After building the assembly jar, the repl test still failed. The error
> output is the same as I posted before.
>
> --
> Ye Xianjin
> Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
>
>
> On Tuesday, April 15, 2014 at 1:39 AM, Michael Armbrust wrote:
>
> > I believe you may need an assembly jar to run the ReplSuite. "sbt/sbt
> > assembly/assembly".
> >
> > Michael
> >
> >
> > On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <advance...@gmail.com> wrote:
> >
> > > Hi, everyone:
> > > I am new to Spark development. I downloaded Spark's latest code from
> github.
> > > After running sbt/sbt assembly,
> > > I began running sbt/sbt test in the spark source code dir. But it failed
> > > running the repl module test.
> > >
> > > Here are some output details.
> > >
> > > command:
> > > sbt/sbt "test-only org.apache.spark.repl.*"
> > > output:
> > >
> > > [info] Loading project definition from
> > > /Volumes/MacintoshHD/github/spark/project/project
> > > [info] Loading project definition from
> > > /Volumes/MacintoshHD/github/spark/project
> > > [info] Set current project to root (in build
> > > file:/Volumes/MacintoshHD/github/spark/)
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for graphx/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for bagel/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for streaming/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for mllib/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for catalyst/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for core/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for assembly/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for sql/test:testOnly
> > > [info] ExecutorClassLoaderSuite:
> > > 2014-04-14 16:59:31.247 java[8393:1003] Unable to load realm info from
> > > SCDynamicStore
> > > [info] - child first *** FAILED *** (440 milliseconds)
> > > [info] java.lang.ClassNotFoundException: ReplFakeClass2
> > > [info] at java.lang.ClassLoader.findClass(ClassLoader.java:364)
> > > [info] at
> > >
> org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
> > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > [info] at
> > >
> org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> > > [info] at scala.Option.getOrElse(Option.scala:120)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:57)
> > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply$mcV$sp(ExecutorClassLoaderSuite.scala:47)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> > > [info] at org.scalatest.FunSuite$$anon$1.apply(FunSuite.scala:1265)
> > > [info] at org.scalatest.Suite$class.withFixture(Suite.scala:1974)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoaderSuite.withFixture(ExecutorClassLoaderSuite.scala:30)
> > > [info] at
> > > org.scalatest.FunSuite$class.invokeWithFixture$1(FunSuite.scala:1262)
> > > [info] at
> > > org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> > > [info] at
> > > org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> > > [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:198)
> > > [info] at org.scalatest.FunSuite$class.runTest(FunSuite.scala:1271)
> > > [info] at
> > >
> org.apache.spark.repl.ExecutorClassLoaderSuite.runTest(ExecutorClassLoaderSuite.scala:30)
> > > [info] at
> > > org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> > > [info] at
> > > org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.

Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Ye Xianjin
Thank you for your reply. 

After building the assembly jar, the repl test still failed. The error output
is the same as I posted before.

-- 
Ye Xianjin
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)


On Tuesday, April 15, 2014 at 1:39 AM, Michael Armbrust wrote:

> I believe you may need an assembly jar to run the ReplSuite. "sbt/sbt
> assembly/assembly".
> 
> Michael
> 
> 
> On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <advance...@gmail.com> wrote:
> 
> > Hi, everyone:
> > I am new to Spark development. I downloaded Spark's latest code from github.
> > After running sbt/sbt assembly,
> > I began running sbt/sbt test in the spark source code dir. But it failed
> > running the repl module test.
> > 
> > Here are some output details.
> > 
> > command:
> > sbt/sbt "test-only org.apache.spark.repl.*"
> > output:
> > 
> > [info] Loading project definition from
> > /Volumes/MacintoshHD/github/spark/project/project
> > [info] Loading project definition from
> > /Volumes/MacintoshHD/github/spark/project
> > [info] Set current project to root (in build
> > file:/Volumes/MacintoshHD/github/spark/)
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for graphx/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for bagel/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for streaming/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for mllib/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for catalyst/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for core/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for assembly/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for sql/test:testOnly
> > [info] ExecutorClassLoaderSuite:
> > 2014-04-14 16:59:31.247 java[8393:1003] Unable to load realm info from
> > SCDynamicStore
> > [info] - child first *** FAILED *** (440 milliseconds)
> > [info] java.lang.ClassNotFoundException: ReplFakeClass2
> > [info] at java.lang.ClassLoader.findClass(ClassLoader.java:364)
> > [info] at
> > org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
> > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > [info] at
> > org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> > [info] at scala.Option.getOrElse(Option.scala:120)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:57)
> > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply$mcV$sp(ExecutorClassLoaderSuite.scala:47)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> > [info] at org.scalatest.FunSuite$$anon$1.apply(FunSuite.scala:1265)
> > [info] at org.scalatest.Suite$class.withFixture(Suite.scala:1974)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite.withFixture(ExecutorClassLoaderSuite.scala:30)
> > [info] at
> > org.scalatest.FunSuite$class.invokeWithFixture$1(FunSuite.scala:1262)
> > [info] at
> > org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> > [info] at
> > org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> > [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:198)
> > [info] at org.scalatest.FunSuite$class.runTest(FunSuite.scala:1271)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite.runTest(ExecutorClassLoaderSuite.scala:30)
> > [info] at
> > org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> > [info] at
> > org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> > [info] at
> > org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:260)
> > [info] at
> > org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:249)
> > [info] at scala.collection.immutable.List.foreach(List.scala:318)
> > [info] at
> > org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:249)
> > [info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:326)

Re: Tests failed after assembling the latest code from github

2014-04-14 Thread Michael Armbrust
I believe you may need an assembly jar to run the ReplSuite. "sbt/sbt
assembly/assembly".

Michael


On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <advance...@gmail.com> wrote:

> Hi, everyone:
> I am new to Spark development. I downloaded Spark's latest code from github.
> After running sbt/sbt assembly,
> I began running sbt/sbt test in the spark source code dir. But it failed
> running the repl module test.
>
> Here are some output details.
>
> command:
> sbt/sbt "test-only org.apache.spark.repl.*"
> output:
>
> [info] Loading project definition from
> /Volumes/MacintoshHD/github/spark/project/project
> [info] Loading project definition from
> /Volumes/MacintoshHD/github/spark/project
> [info] Set current project to root (in build
> file:/Volumes/MacintoshHD/github/spark/)
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for graphx/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for bagel/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for streaming/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for mllib/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for catalyst/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for core/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for assembly/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for sql/test:testOnly
> [info] ExecutorClassLoaderSuite:
> 2014-04-14 16:59:31.247 java[8393:1003] Unable to load realm info from
> SCDynamicStore
> [info] - child first *** FAILED *** (440 milliseconds)
> [info]   java.lang.ClassNotFoundException: ReplFakeClass2
> [info]   at java.lang.ClassLoader.findClass(ClassLoader.java:364)
> [info]   at
> org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> [info]   at
> org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> [info]   at scala.Option.getOrElse(Option.scala:120)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:57)
> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply$mcV$sp(ExecutorClassLoaderSuite.scala:47)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> [info]   at org.scalatest.FunSuite$$anon$1.apply(FunSuite.scala:1265)
> [info]   at org.scalatest.Suite$class.withFixture(Suite.scala:1974)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite.withFixture(ExecutorClassLoaderSuite.scala:30)
> [info]   at
> org.scalatest.FunSuite$class.invokeWithFixture$1(FunSuite.scala:1262)
> [info]   at
> org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> [info]   at
> org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> [info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:198)
> [info]   at org.scalatest.FunSuite$class.runTest(FunSuite.scala:1271)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite.runTest(ExecutorClassLoaderSuite.scala:30)
> [info]   at
> org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> [info]   at
> org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> [info]   at
> org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:260)
> [info]   at
> org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:249)
> [info]   at scala.collection.immutable.List.foreach(List.scala:318)
> [info]   at org.scalatest.SuperEngine.org
> $scalatest$SuperEngine$$runTestsInBranch(Engine.scala:249)
> [info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:326)
> [info]   at org.scalatest.FunSuite$class.runTests(FunSuite.scala:1304)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite.runTests(ExecutorClassLoaderSuite.scala:30)
> [info]   at org.scalatest.Suite$class.run(Suite.scala:2303)
> [info]   at org.apache.spark.repl.ExecutorClassLoaderSuite.org
> $scalatest$FunSuite$$super$run(ExecutorClassLoaderSuite.scala:30)
> [info]   at
> org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
> [info]   at
> org.