Thanks for reporting this. One thing to try is a git clean to make sure you have a totally clean working tree ("git clean -fdx" will blow away any differences you have from the repo, so of course only do that if you don't have other files around). Can you reproduce this if you just run "sbt/sbt compile"? Also, if you can, can you reproduce it if you check out only the Spark master branch, not merged with your own code? Finally, if you can reproduce it on master, can you perform a bisection to find out which commit caused it?
- Patrick

On Sat, Nov 29, 2014 at 10:29 PM, Ganelin, Ilya <ilya.gane...@capitalone.com> wrote:
> Hi all - I've just merged in the latest changes from the Spark master branch
> to my local branch. I am able to build just fine with
>
>     mvn clean package
>
> However, when I attempt to run dev/run-tests, I get the following error:
>
>     Using /Library/Java/JavaVirtualMachines/jdk1.8.0_20.jdk/Contents/Home as
>     default JAVA_HOME.
>     Note, this will be overridden by -java-home if it is set.
>     Error: Invalid or corrupt jarfile sbt/sbt-launch-0.13.6.jar
>     [error] Got a return code of 1 on line 163 of the run-tests script.
>
> With an individual test I get the same error. I have tried downloading a new
> copy of SBT 0.13.6, but it has not helped. Does anyone have any suggestions
> for getting this running? Things worked fine before updating Spark.