To: user@livy.incubator.apache.org
Date: 10/29/2017 03:06 AM
Subject: Re: ClassNotFoundException on job submit
I have set it to /root/.livy-sessions and it works.
I would make this the default. It is very counterintuitive to have to set
it myself - that should work out of the box.
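If the property in question is the directory Livy uses to stage uploaded
session files (an assumption - the key name below is a guess, not confirmed
by the thread), the change is a single line in livy.conf:

# livy.conf -- key name is an assumption
livy.session.staging-dir = /root/.livy-sessions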
From: Stefan Miklosovic
To: user@livy.incubator.apache.org
Date: 10/26/2017 09:20 AM
Subject: Re: ClassNotFoundException on job submit
Ok I am getting somewhere:
@RunWith(JUnit4.class)
public class LivyTestCase {

    private static final int SAMPLES = 1;
    private static final String LIVY_URI = "http://spark-master:8998";

    @Rule
    public TemporaryFolder jarFolder = new TemporaryFolder();

    @Test
    public void ...
I think I have to add a jar with PiJob to the classpath of Livy so that it
knows how to deserialize it, hm.
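This matches the PiJob example from Livy's programmatic-API documentation,
which uploads the jar containing the job class before submitting it. A
sketch of how the rest of the test presumably continues, assuming that
standard example; the test-method name and the jar path are made up, and the
TemporaryFolder rule above presumably produced the jar at runtime, so a
fixed path stands in here:

import java.io.File;
import java.net.URI;
import org.apache.livy.LivyClient;
import org.apache.livy.LivyClientBuilder;

    @Test
    public void computesPi() throws Exception {  // method name is hypothetical
        LivyClient client = new LivyClientBuilder()
                .setURI(new URI(LIVY_URI))
                .build();
        try {
            // Ship the jar that packages PiJob to the remote driver first;
            // without it the driver cannot deserialize the submitted job and
            // fails with ClassNotFoundException. The path is hypothetical.
            client.uploadJar(new File("target/test-jobs.jar")).get();

            double pi = client.submit(new PiJob(SAMPLES)).get();
            System.out.println("Pi is roughly " + pi);
        } finally {
            client.stop(true);
        }
    }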
On Thu, Oct 26, 2017 at 5:24 PM, Stefan Miklosovic wrote:
> I did it as you suggested and it seems to start the jobs OK and I
> see the sessions in the UI, but while it is being computed ...

From: Stefan Miklosovic
To: user@livy.incubator.apache.org
Date: 10/26/2017 05:24 PM
Subject: Re: ClassNotFoundException on job submit

I did it as you suggested and it seems to start the jobs OK and I see the
sessions in the UI, but while it is being computed (I see the job is
distributed over two Spark slaves with spark-master in front of them), I am
computing this from my localhost:

@RunWith(JUnit4.class)
public class LivyTestCase { ...
You can choose to set "livy.spark.master" to "local" and
"livy.spark.deploy-mode" to "client" to start Spark in local mode; in that
case YARN is not required.

Otherwise, if you plan to run on YARN, you have to install Hadoop and
configure HADOOP_CONF_DIR in livy-env.sh.
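Concretely, the two setups would look something like this (the paths are
examples only):

# livy.conf -- local mode, no YARN or Hadoop required
livy.spark.master = local
livy.spark.deploy-mode = client

# livy-env.sh -- only needed when running on YARN
export SPARK_HOME=/opt/spark
export HADOOP_CONF_DIR=/etc/hadoop/conf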
On Thu, Oct 26, 2017 at 9..., Stefan Miklosovic wrote:
Hi,

I am running the Livy server with Spark, without Hadoop. I am setting only
SPARK_HOME, and after job submission I get the ClassNotFoundException in
the Livy UI logs.

I am using a pretty much standard configuration, except for:

livy.spark.deploy-mode = cluster

Do I need to run with a Hadoop installation as well, and ...