I made the change so that I could implement top() using treeReduce(). A member 
of this list suggested I make the change in RDD.scala to accomplish that. Also, 
this is for a research project, not for commercial use. 

So, any advice on how I can get spark-submit to use my custom-built jars 
would be very useful.

Thanks,
Raghav

> On Jun 16, 2015, at 6:57 PM, Will Briggs <wrbri...@gmail.com> wrote:
> 
> In general, you should avoid making direct changes to the Spark source code. 
> If you are using Scala, you can seamlessly blend your own methods on top of 
> the base RDDs using implicit conversions.
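> 
> The implicit-conversion approach above can be sketched as follows. This is a
> self-contained illustration on Seq (the TopOps and topBy names are made up
> for the example); with Spark you would wrap org.apache.spark.rdd.RDD[T] in
> the same kind of implicit class instead of editing RDD.scala:

```scala
// Sketch of the "enrich my library" pattern: define your new method in an
// implicit wrapper class in your own code, rather than modifying the
// library source. Shown on Seq so it runs standalone; an RDD[T] wrapper
// would look the same, delegating to RDD operations internally.
object RddExtensions {
  implicit class TopOps[T](val xs: Seq[T]) {
    // Hypothetical helper: the n largest elements under the given ordering.
    def topBy(n: Int)(implicit ord: Ordering[T]): Seq[T] =
      xs.sorted(ord.reverse).take(n)
  }
}

object Demo extends App {
  import RddExtensions._
  println(Seq(3, 1, 4, 1, 5).topBy(2)) // List(5, 4)
}
```

> Because the wrapper lives in your own jar, spark-submit needs no modified
> Spark build at all, which sidesteps the classpath problem below entirely.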
> 
> Regards,
> Will
> 
> On June 16, 2015, at 7:53 PM, raggy <raghav0110...@gmail.com> wrote:
> 
> I am trying to submit a Spark application from the command line, using the
> spark-submit command. I initially set up my Spark application in Eclipse
> and have been making changes there. I recently obtained my own copy of the
> Spark source code and added a new method to RDD.scala. I built a new
> spark-core jar using mvn, added it to my Eclipse build path, and my
> application ran perfectly fine. 
> 
> Now, I would like to submit it through the command line. I submitted my
> application like this:
> 
> bin/spark-submit --master local[2] --class "SimpleApp"
> /Users/XXX/Desktop/spark2.jar
> 
> The spark-submit command is from the Spark project that I modified by
> adding the new methods.
> When I run it, I get this error:
> 
> java.lang.NoSuchMethodError:
> org.apache.spark.rdd.RDD.treeTop(ILscala/math/Ordering;)Ljava/lang/Object;
>       at SimpleApp$.main(SimpleApp.scala:12)
>       at SimpleApp.main(SimpleApp.scala)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>       at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>       at java.lang.reflect.Method.invoke(Method.java:597)
>       at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
>       at 
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
>       at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
>       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
>       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 
> When I use spark-submit, where does the jar come from? How do I make sure
> it uses the jars that I have built? 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Submitting-Spark-Applications-using-Spark-Submit-tp23352.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
> 

