You should compile and package PiJob into a jar before running this code snippet. It does not need to be a separate app/project; you can put the PiJob code right next to the snippet that runs it. Maven/sbt/Gradle can create the jar for you, and I assume there is a way to call them programmatically, but that is not needed. You can then use the path to the jar file as piJar.
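If you did want to kick off the build from code, a minimal sketch could shell out to Maven with ProcessBuilder. This is only an illustration (the class name BuildJar is mine, and it assumes "mvn" is on the PATH and that projectDir contains a pom.xml); running "mvn package" from a terminal works just as well.

```java
import java.io.File;

public class BuildJar {
    // Sketch only: builds a process that runs Maven's package phase in the
    // given project directory. Running it produces a jar under target/,
    // whose path can then be used as piJar.
    static ProcessBuilder mavenPackage(String projectDir) {
        ProcessBuilder pb = new ProcessBuilder("mvn", "package");
        pb.directory(new File(projectDir));
        pb.inheritIO(); // stream Maven's output to this process's console
        return pb;
    }

    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = mavenPackage(".");
        System.out.println("Would run: " + pb.command());
        // Uncomment to actually run the build:
        // int exit = pb.start().waitFor();
    }
}
```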
I hope this answers your question.

Thanks,
Meisam

import org.apache.spark.api.java.function.*;
import org.apache.livy.*;

import java.util.ArrayList;
import java.util.List;

public class PiJob implements Job<Double>, Function<Integer, Integer>,
        Function2<Integer, Integer, Integer> {

    private final int samples;

    public PiJob(int samples) {
        this.samples = samples;
    }

    // Driver-side entry point: distributes the samples, counts hits
    // inside the unit circle, and scales to estimate pi.
    @Override
    public Double call(JobContext ctx) throws Exception {
        List<Integer> sampleList = new ArrayList<Integer>();
        for (int i = 0; i < samples; i++) {
            sampleList.add(i + 1);
        }
        return 4.0d * ctx.sc().parallelize(sampleList).map(this).reduce(this) / samples;
    }

    // Map: one random point per sample; 1 if it falls inside the unit circle.
    @Override
    public Integer call(Integer v1) {
        double x = Math.random();
        double y = Math.random();
        return (x * x + y * y < 1) ? 1 : 0;
    }

    // Reduce: sum the hits.
    @Override
    public Integer call(Integer v1, Integer v2) {
        return v1 + v2;
    }
}

On Fri, Dec 1, 2017 at 1:09 AM kant kodali <kanth...@gmail.com> wrote:

> Hi All,
>
> I am looking at the following snippet of code and I wonder where and how
> do I create piJar? Can I create it programmatically? If so, how? Is there
> a complete hello world example somewhere where I can follow the steps and
> see how this works?
>
> Concerning this line:
>
> client.uploadJar(new File(piJar)).get();
>
> Code snippet:
>
> LivyClient client = new LivyClientBuilder()
>     .setURI(new URI(livyUrl))
>     .build();
> try {
>     System.err.printf("Uploading %s to the Spark context...\n", piJar);
>     client.uploadJar(new File(piJar)).get();
>
>     System.err.printf("Running PiJob with %d samples...\n", samples);
>     double pi = client.submit(new PiJob(samples)).get();
>
>     System.out.println("Pi is roughly: " + pi);
> } finally {
>     client.stop(true);
> }