By all means, it would be greatly appreciated!
On Mon, Apr 14, 2014 at 10:34 PM, Ye Xianjin advance...@gmail.com wrote:
Hi, I think I have found the cause of the failing tests.
I have two disks in my laptop. The Spark project directory is on an HDD,
while the temp dir created by com.google.common.io.Files.createTempDir is
under /var/folders/5q/, which is on the system disk, an SSD.
The ExecutorClassLoaderSuite test uses the
org.apache.spark.TestUtils.createCompiledClass method.
createCompiledClass first generates the compiled class in the
pwd (spark/repl), then uses renameTo to move
the file. The renameTo call fails because the destination file is on a
different filesystem than the source file.
I modified TestUtils.scala to first copy the file to the destination and
then delete the original; with that change the tests pass.
Should I file a JIRA for this problem? Then I can send a PR on GitHub.
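To illustrate the failure mode described above (a sketch, not the actual TestUtils change; the class and file names here are hypothetical): java.io.File.renameTo is platform-dependent and may simply return false when source and destination live on different filesystems, whereas java.nio.file.Files.move falls back to a copy-and-delete itself.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.StandardCopyOption;

public class MoveAcrossFs {
    // File.renameTo is not guaranteed to work across filesystems; it just
    // returns false. Files.move handles that case by copying then deleting.
    static void move(File src, File dest) throws IOException {
        if (!src.renameTo(dest)) {
            Files.move(src.toPath(), dest.toPath(),
                StandardCopyOption.REPLACE_EXISTING);
        }
    }

    public static void main(String[] args) throws IOException {
        File src = File.createTempFile("example", ".class");
        File dest = new File(src.getParentFile(), "moved.class");
        move(src, dest);
        System.out.println(dest.exists() && !src.exists());
    }
}
```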
--
Ye Xianjin
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
On Tuesday, April 15, 2014 at 3:43 AM, Ye Xianjin wrote:
Well, this is very strange.
I looked into ExecutorClassLoaderSuite.scala and ReplSuite.scala and
made small changes to ExecutorClassLoaderSuite.scala (mostly printing some
internal variables). After that, when running the repl tests, I noticed
that ReplSuite was run first and passed, but the
ExecutorClassLoaderSuite test was weird.
Here is the output:
[info] ExecutorClassLoaderSuite:
[error] Uncaught exception when running
org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError:
PermGen space
[error] Uncaught exception when running
org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError:
PermGen space
Internal error when running tests: java.lang.OutOfMemoryError: PermGen
space
Exception in thread "Thread-3" java.io.EOFException
at
java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2577)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1297)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1685)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1323)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
at sbt.React.react(ForkTests.scala:116)
at
sbt.ForkTests$$anonfun$mainTestTask$1$Acceptor$2$.run(ForkTests.scala:75)
at java.lang.Thread.run(Thread.java:695)
I reverted my changes; the test result was the same.
I then touched the ReplSuite.scala file (with the touch command); the test
order reversed back to the very beginning, and the output was also the
same (the result in my first post).
--
Ye Xianjin
On Tuesday, April 15, 2014 at 3:14 AM, Aaron Davidson wrote:
This may have something to do with running the tests on a Mac, as there is
a lot of File/URI/URL stuff going on in that test which may just have
happened to work if run on a Linux system (like Jenkins). Note that this
suite was added relatively recently:
https://github.com/apache/spark/pull/217
On Mon, Apr 14, 2014 at 12:04 PM, Ye Xianjin advance...@gmail.com wrote:
Thank you for your reply.
After building the assembly jar, the repl test still failed. The error
output is the same as I posted before.
--
Ye Xianjin
On Tuesday, April 15, 2014 at 1:39 AM, Michael Armbrust wrote:
I believe you may need an assembly jar to run the ReplSuite:
sbt/sbt assembly/assembly.
Michael
On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin advance...@gmail.com wrote:
Hi, everyone:
I am new to Spark development. I downloaded Spark's latest code from
GitHub.
After running sbt/sbt assembly,
I ran sbt/sbt test in the Spark source directory, but it failed when
running the repl module tests.
Here are some output details.
command:
sbt/sbt test-only org.apache.spark.repl.*
output:
[info] Loading project definition from
/Volumes/MacintoshHD/github/spark/project/project
[info] Loading project definition from
/Volumes/MacintoshHD/github/spark/project
[info] Set current project to root (in build
file:/Volumes/MacintoshHD/github/spark/)
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for graphx/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for bagel/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for streaming/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for mllib/test:testOnly