I have set it to /root/.livy-sessions and it works.

I would make this the default. It is very counter-intuitive to have to set it
myself - it should work out of the box.
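
A minimal livy.conf sketch of what worked here (property name as in the default
config; the directory is just the one Livy already uses for uploaded session
files on my box, so adjust the path to your setup):

livy.file.local-dir-whitelist = /root/.livy-sessions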

On Sun, Oct 29, 2017 at 10:26 AM, Stefan Miklosovic <mikloso...@gmail.com>
wrote:

> It does not work. I have set that property to a directory where that jar
> is placed and which all containers can see - it still says:
>
> Caused by: java.io.IOException: Bad Request: "requirement failed: Local
> path /root/.livy-sessions/1ba0a04e-9c72-4ba7-82dc-90334ffec877/pijob.jar
> cannot be added to user sessions."
>
> Should I set that property to /root/.livy? That seems very awkward to me.
>
> On Thu, Oct 26, 2017 at 10:09 PM, Alex Bozarth <ajboz...@us.ibm.com>
> wrote:
>
>> If you are going to access jars outside of HDFS, you must have their path
>> listed in livy.file.local-dir-whitelist, as you found.
>>
>>
>> Alex Bozarth
>> Software Engineer
>> Spark Technology Center
>> ------------------------------
>> E-mail: ajboz...@us.ibm.com
>> GitHub: https://github.com/ajbozarth
>>
>>
>> 505 Howard Street
>> San Francisco, CA 94105
>> United States
>>
>>
>>
>>
>> From: Stefan Miklosovic <mikloso...@gmail.com>
>> To: user@livy.incubator.apache.org
>> Date: 10/26/2017 09:20 AM
>> Subject: Re: ClassNotFoundException on job submit
>> ------------------------------
>>
>>
>>
>> Ok I am getting somewhere:
>>
>> @RunWith(JUnit4.class)
>> public class LivyTestCase {
>>
>>    private static final int SAMPLES = 10000;
>>
>>    private static final String LIVY_URI = "http://spark-master:8998";
>>
>>    @Rule
>>    public TemporaryFolder jarFolder = new TemporaryFolder();
>>
>>    @Test
>>    public void testPiJob() throws Exception {
>>
>>        File jarFile = jarFolder.newFile("testpijob2.jar");
>>
>>        ShrinkWrap.create(JavaArchive.class)
>>            .addClass(PiJob.class)
>>            .as(ZipExporter.class)
>>            .exportTo(jarFile, true);
>>
>>        LivyClient client = new LivyClientBuilder()
>>            .setURI(new URI(LIVY_URI))
>>            .build();
>>
>>        System.out.println("Uploading PiJob jar");
>>
>>        client.uploadJar(jarFile).get();
>>
>>        System.out.println("PiJob jar uploaded");
>>
>>        final Double result = client.submit(new PiJob(1000)).get();
>>
>>        System.out.println(result);
>>    }
>> }
>>
>> But while doing so, it gives me:
>>
>> java.util.concurrent.ExecutionException: java.io.IOException:
>> Bad Request: "requirement failed: Local path
>> /root/.livy-sessions/c7dbb697-13ed-443f-a630-bc9d9a544f6b/testpijob2.jar
>> cannot be added to user sessions."
>>
>> at java.util.concurrent.FutureTask.report(FutureTask.java:122)
>> at java.util.concurrent.FutureTask.get(FutureTask.java:192)
>> at LivyTestCase.testPiJob(LivyTestCase.java:46)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
>> at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
>> at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
>> at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
>> at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
>> at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
>> at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
>> Caused by: java.io.IOException: Bad Request: "requirement failed: Local path /root/.livy-sessions/c7dbb697-13ed-443f-a630-bc9d9a544f6b/testpijob2.jar cannot be added to user sessions."
>> at org.apache.livy.client.http.LivyConnection.sendRequest(LivyConnection.java:229)
>> at org.apache.livy.client.http.LivyConnection.post(LivyConnection.java:192)
>> at org.apache.livy.client.http.HttpClient$2.call(HttpClient.java:152)
>> at org.apache.livy.client.http.HttpClient$2.call(HttpClient.java:149)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>> at java.lang.Thread.run(Thread.java:748)
>>
>> I have checked that there is nothing but PiJob.class in that JAR.
>> Nothing else at all.
>>
>> I have a feeling that the path for that jar needs to be visible to all
>> slaves. I do not have HDFS; I have Spark slaves in Docker containers, so
>> when I upload the jar, I see it on the Spark master under
>> /root/.livy-sessions, but it is not in the containers ...
>>
>> Could this be helpful?
>>
>> # List of local directories from where files are allowed to be added to
>> # user sessions. By default it's empty, meaning users can only reference
>> # remote URIs when starting their sessions.
>> # livy.file.local-dir-whitelist =
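>>
>> As a rough sketch of the remote-URI alternative that comment describes
>> (inside the same test as above, so client and PiJob are the ones already
>> shown; the http:// host and path are made up, any URI reachable by every
>> node should do), I could presumably skip the whitelist and do something like:
>>
>> // hypothetical remote location visible to the whole cluster
>> client.addJar(new URI("http://some-host/jobs/pijob.jar")).get();
>> // then submit the job contained in that jar as before
>> final Double result = client.submit(new PiJob(1000)).get();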
>>
>> On Thu, Oct 26, 2017 at 5:30 PM, Stefan Miklosovic <mikloso...@gmail.com>
>> wrote:
>> > I think I have to add a jar with PiJob on the classpath of Livy so it
>> > knows how to deserialize it .... hmmmmm
>> >
>> > On Thu, Oct 26, 2017 at 5:24 PM, Stefan Miklosovic <
>> mikloso...@gmail.com> wrote:
>> >> I have done it as you suggested and it seems to start the jobs OK; I
>> >> see the sessions in the UI while the job is being computed (I see the job
>> >> is distributed to two Spark slaves, with spark-master in front of them).
>> >> I am running this from my localhost:
>> >>
>> >> @RunWith(JUnit4.class)
>> >> public class LivyTestCase {
>> >>
>> >>     private static final int SAMPLES = 10000;
>> >>
>> >>     private static final String LIVY_URI = "http://spark-master:8998";
>> >>
>> >>     @Test
>> >>     public void testPiJob() throws Exception {
>> >>
>> >>         LivyClient client = new LivyClientBuilder()
>> >>             .setURI(new URI(LIVY_URI))
>> >>             .build();
>> >>
>> >>         final Double result = client.submit(new PiJob(1000)).get();
>> >>
>> >>         System.out.println(result);
>> >>     }
>> >> }
>> >>
>> >> It is the PiJob from the site's examples.
>> >>
>> >> Now, what I see in the Livy logs is this:
>> >>
>> >> org.apache.livy.shaded.kryo.kryo.KryoException: Unable to find class: PiJob
>> >> at org.apache.livy.shaded.kryo.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:138)
>> >> at org.apache.livy.shaded.kryo.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:115)
>> >> at org.apache.livy.shaded.kryo.kryo.Kryo.readClass(Kryo.java:656)
>> >> at org.apache.livy.shaded.kryo.kryo.Kryo.readClassAndObject(Kryo.java:767)
>> >> at org.apache.livy.client.common.Serializer.deserialize(Serializer.java:63)
>> >> at org.apache.livy.rsc.driver.BypassJob.call(BypassJob.java:39)
>> >> at org.apache.livy.rsc.driver.BypassJob.call(BypassJob.java:27)
>> >> at org.apache.livy.rsc.driver.JobWrapper.call(JobWrapper.java:57)
>> >> at org.apache.livy.rsc.driver.BypassJobWrapper.call(BypassJobWrapper.java:42)
>> >> at org.apache.livy.rsc.driver.BypassJobWrapper.call(BypassJobWrapper.java:27)
>> >> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>> >> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>> >> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>> >> at java.lang.Thread.run(Thread.java:748)
>> >> Caused by: java.lang.ClassNotFoundException: PiJob
>> >> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>> >> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>> >> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>> >> at java.lang.Class.forName0(Native Method)
>> >> at java.lang.Class.forName(Class.java:348)
>> >> at org.apache.livy.shaded.kryo.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:136)
>> >> ... 13 more
>> >>
>> >> I know how to read it - I understand that PiJob is not on the classpath,
>> >> but why? I have just sent that class to Livy ...
>> >>
>> >> On Thu, Oct 26, 2017 at 4:17 PM, Saisai Shao <sai.sai.s...@gmail.com>
>> wrote:
>> >>> You can choose to set "livy.spark.master" to "local" and
>> >>> "livy.spark.deploy-mode" to "client" to start Spark in local mode; in such
>> >>> a case YARN is not required.
>> >>>
>> >>> Otherwise, if you plan to run on YARN, you have to install Hadoop and
>> >>> configure HADOOP_CONF_DIR in livy-env.sh.
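>> >>>
>> >>> As a minimal sketch, the corresponding livy.conf lines for the local-mode
>> >>> option would be (values exactly as above; adjust to taste):
>> >>>
>> >>> livy.spark.master = local
>> >>> livy.spark.deploy-mode = client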
>> >>>
>> >>> On Thu, Oct 26, 2017 at 9:40 PM, Stefan Miklosovic <
>> mikloso...@gmail.com>
>> >>> wrote:
>> >>>>
>> >>>> Hi,
>> >>>>
>> >>>> I am running the Livy server with Spark, without Hadoop. I am setting
>> >>>> only SPARK_HOME, and I am getting this in the Livy UI logs after job
>> >>>> submission.
>> >>>>
>> >>>> I am using pretty much the standard configuration, except
>> >>>> livy.spark.deploy-mode = cluster
>> >>>>
>> >>>> Do I need a Hadoop installation as well, and to specify
>> >>>> HADOOP_CONF_DIR?
>> >>>>
>> >>>> Is it not possible to run Livy with "plain" Spark, without YARN?
>> >>>>
>> >>>> stderr:
>> >>>> java.lang.ClassNotFoundException:
>> >>>> at java.lang.Class.forName0(Native Method)
>> >>>> at java.lang.Class.forName(Class.java:348)
>> >>>> at org.apache.spark.util.Utils$.classForName(Utils.scala:230)
>> >>>> at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:712)
>> >>>> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>> >>>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>> >>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
>> >>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>> >>>>
>> >>>> Thanks!
>> >>>>
>> >>>> --
>> >>>> Stefan Miklosovic
>> >>>
>> >>>
>> >>
>> >>
>> >>
>> >> --
>> >> Stefan Miklosovic
>> >
>> >
>> >
>> > --
>> > Stefan Miklosovic
>>
>>
>>
>> --
>> Stefan Miklosovic
>>
>>
>>
>>
>>
>
>
> --
> Stefan Miklosovic
>



-- 
Stefan Miklosovic
