Andy,

This has nothing to do with Spark, but I guess you don't have the proper Scala version. The version you're currently running doesn't recognize a method in Scala's ArrayOps, namely: scala.collection.mutable.ArrayOps.$colon$plus
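[Editor's note: for context, $colon$plus is the JVM-mangled name of Scala's :+ operator, which returns a new array with one element appended; the stack trace below shows Spark's SchemaUtils.appendColumn using it to add a column to a schema's field array. A NoSuchMethodError there typically means the scala-library on the test classpath is binary-incompatible with the Scala version Spark was compiled against. A minimal pure-Java sketch of what the failing call conceptually does (names and the example column are illustrative, not Spark's actual code):]

```java
import java.util.Arrays;

public class AppendExample {
    // Rough Java equivalent of Scala's `fields :+ newField`
    // (ArrayOps.$colon$plus): copy the array into one that is a slot
    // longer and place the new element at the end.
    static String[] append(String[] fields, String newField) {
        String[] out = Arrays.copyOf(fields, fields.length + 1);
        out[fields.length] = newField;
        return out;
    }

    public static void main(String[] args) {
        String[] schema = {"id", "text"};
        String[] withHash = append(schema, "hashedFeatures");
        System.out.println(Arrays.toString(withHash)); // [id, text, hashedFeatures]
    }
}
```

[The operation itself is trivial; the error is not about the logic but about the binary signature of the Scala method not matching what Spark's bytecode expects.]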
On Monday, January 18, 2016 7:53 PM, Andy Davidson <a...@santacruzintegration.com> wrote:

Many thanks. I was using a different Scala plug-in; this one seems to work better. I no longer get compile errors. However, I get the following stack trace when I try to run my unit tests with mllib open. I am still using Eclipse Luna.

Andy

java.lang.NoSuchMethodError: scala.collection.mutable.ArrayOps.$colon$plus(Ljava/lang/Object;Lscala/reflect/ClassTag;)Ljava/lang/Object;
    at org.apache.spark.ml.util.SchemaUtils$.appendColumn(SchemaUtils.scala:73)
    at org.apache.spark.ml.feature.HashingTF.transformSchema(HashingTF.scala:76)
    at org.apache.spark.ml.feature.HashingTF.transform(HashingTF.scala:64)
    at com.pws.fantasySport.ml.TDIDFTest.runPipleLineTF_IDF(TDIDFTest.java:52)
    at com.pws.fantasySport.ml.TDIDFTest.test(TDIDFTest.java:36)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
    at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
    at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50)
    at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
    at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)

From: Jakob Odersky <joder...@gmail.com>
Date: Monday, January 18, 2016 at 3:20 PM
To: Andrew Davidson <a...@santacruzintegration.com>
Cc: "user @spark" <user@spark.apache.org>
Subject: Re: trouble using eclipse to view spark source code

Have you followed the guide on how to import Spark into Eclipse, https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-Eclipse ?

On 18 January 2016 at 13:04, Andy Davidson <a...@santacruzintegration.com> wrote:

Hi,

My project is implemented using Java 8 and Python. Sometimes it's handy to look at the Spark source code. For an unknown reason, if I open a Spark project, my Java projects show tons of compiler errors. I think it may have something to do with Scala. If I close the Spark projects, my Java code is fine. I typically only want to import the machine learning and streaming projects.
I am not sure if this is an issue or not, but my Java projects are built using Gradle. In Eclipse preferences -> Scala -> Installations, I selected "Scala: 2.10.6 (built in)".

Any suggestions would be greatly appreciated.

Andy

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
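[Editor's note: on the Gradle side, the thing to check is that the Scala binary version embedded in Spark's artifact ids matches both the scala-library dependency and the Scala installation chosen in the Eclipse plug-in. A hypothetical build.gradle fragment as a sketch; the artifact coordinates and version numbers are assumptions typical of the Spark 1.x era, not taken from Andy's build:]

```groovy
// Illustrative only: the _2.10 suffix, the scala-library version, and the
// Spark version must all agree with each other and with the
// "Scala: 2.10.6 (built in)" installation selected in Eclipse.
dependencies {
    compile 'org.scala-lang:scala-library:2.10.6'
    compile 'org.apache.spark:spark-mllib_2.10:1.6.0'   // not spark-mllib_2.11
}
```

[Mixing Spark artifacts built for one Scala binary version with a scala-library from another compiles in some configurations but fails at runtime with exactly this kind of NoSuchMethodError, because method signatures such as ArrayOps.$colon$plus are not binary-compatible across Scala binary versions.]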