Livy's programmatic API (the Java/Scala APIs) uses Interactive Sessions, not
Batch Sessions. When you submit a batch job to Livy, it spins up a Spark
application, runs your job (jar), and then shuts down. In an interactive
session (like the one the client uses), a Spark application is spun up and
you then submit jobs to it until you shut it down.
I agree it's not that intuitive; most users originally used Livy bundled
with Hadoop (like HDP) and wouldn't have seen this issue. You should open a
JIRA for this, since more and more users like you aren't using Livy with HDFS.
You have to call "client.stop(true)" if you want to shut down the
Spark application.
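As a rough illustration, here is a minimal sketch of that lifecycle with the
Livy Java client (org.apache.livy). It assumes a Livy server at
http://localhost:8998 and an illustrative jar path; the job lambda is just a
placeholder. The point is the final stop(true), which tears down the Spark
application rather than only the local client.

    import java.io.File;
    import java.net.URI;
    import java.util.Arrays;

    import org.apache.livy.Job;
    import org.apache.livy.LivyClient;
    import org.apache.livy.LivyClientBuilder;

    public class LivyStopExample {

        public static void main(String[] args) throws Exception {
            // Connecting the client creates an interactive session backed by a
            // long-lived Spark application on the cluster.
            LivyClient client = new LivyClientBuilder()
                    .setURI(new URI("http://localhost:8998"))
                    .build();
            try {
                // Upload the jar containing your Job classes (path is illustrative).
                client.uploadJar(new File("/path/to/my-job.jar")).get();

                // Submit a job to the running Spark application and wait for the result.
                Job<Long> countJob = jc ->
                        jc.sc().parallelize(Arrays.asList(1, 2, 3, 4, 5)).count();
                long count = client.submit(countJob).get();
                System.out.println("count = " + count);
            } finally {
                // stop(true) shuts down the Spark application along with the client;
                // stop(false) would leave the application running on the cluster.
                client.stop(true);
            }
        }
    }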
On Sun, Oct 29, 2017 at 3:19 AM, Stefan Miklosovic wrote:
> Title says it all, I upload a JAR, I run a job via client.run(Job
> job).get(); and I do get a result - all is computed ok, however, that
> application