Hey Jordi,

"No FileSystem for scheme: http" means that your YARN NMs aren't configured
with the http filesystem in your core-site.xml:

<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>fs.http.impl</name>
    <value>org.apache.samza.util.hadoop.HttpFileSystem</value>
  </property>
</configuration>


As documented here:

  http://samza.apache.org/learn/tutorials/0.8/run-in-multi-node-yarn.html
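
A quick way to check that the config is picked up is to try reading the job
package over HTTP from one of the NodeManager hosts. This is only a sketch: the
jar path below is a guess (point it at wherever the samza-yarn jar, which
contains HttpFileSystem, lives on that box), the URL is the yarn.package.path
from your config, and you may need more of the Samza/Scala jars on the
classpath:

  # hypothetical jar location; adjust to your install
  export HADOOP_CLASSPATH=/opt/hadoop/lib/samza-yarn_2.10-0.8.0.jar
  # if core-site.xml and the classpath are right, this downloads the package
  # instead of failing with "No FileSystem for scheme: http"
  hadoop fs -cat http://192.168.15.92/jobs/samzajob1.tar.gz > /dev/null && echo OK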

Cheers,
Chris

On Fri, Mar 27, 2015 at 2:24 AM, Jordi Blasi Uribarri <jbl...@nextel.es>
wrote:

> I thought it was working, but it is not, really. The job runs and I can see it
> on the web admin UI, but when it has to process a message it fails: the job
> goes down and I get this exception:
>
> Application application_1427403490569_0002 failed 2 times due to AM
> Container for appattempt_1427403490569_0002_000002 exited with exitCode:
> -1000
> For more detailed output, check application tracking page:
> http://samza01:8088/proxy/application_1427403490569_0002/Then, click on
> links to logs of each attempt.
> Diagnostics: No FileSystem for scheme: http
> java.io.IOException: No FileSystem for scheme: http
> at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2584)
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
> at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
> at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
> at org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:249)
> at org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:61)
> at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:359)
> at org.apache.hadoop.yarn.util.FSDownload$2.run(FSDownload.java:357)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
> at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:356)
> at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:60)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
> Failing this attempt. Failing the application.
>
> What am I doing wrong?
>
> Thanks,
>
>         Jordi
>
> -----Original Message-----
> From: Jordi Blasi Uribarri [mailto:jbl...@nextel.es]
> Sent: Friday, March 27, 2015 9:45
> To: dev@samza.apache.org
> Subject: RE: java.lang.NoClassDefFoundError on Yarn job
>
> Solved. My application was using version 0.9.0 of yarn. When I downgraded to
> 0.8.0, it worked.
>
> Thanks,
>
>  Jordi
>
> -----Original Message-----
> From: Jordi Blasi Uribarri [mailto:jbl...@nextel.es]
> Sent: Friday, March 27, 2015 9:05
> To: dev@samza.apache.org
> Subject: RE: java.lang.NoClassDefFoundError on Yarn job
>
> I followed the steps described in that JIRA issue and I am still getting the
> same error.
>
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:191)
>         at org.apache.samza.job.JobRunner.run(JobRunner.scala:56)
>         at org.apache.samza.job.JobRunner$.main(JobRunner.scala:37)
>         at org.apache.samza.job.JobRunner.main(JobRunner.scala)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>         ... 5 more
>
> The only difference I see is that I downloaded Hadoop 2.6 instead of 2.4.
> Is that exact version mandatory? Should I downgrade?
>
> Thanks,
>
>         Jordi
>
> -----Original Message-----
> From: Roger Hoover [mailto:roger.hoo...@gmail.com]
> Sent: Thursday, March 26, 2015 17:25
> To: dev@samza.apache.org
> Subject: Re: java.lang.NoClassDefFoundError on Yarn job
>
> Hi Jordi,
>
> You might be running into this issue
> (https://issues.apache.org/jira/browse/SAMZA-456), which I just hit as well.
> You probably need to add a couple more jars to your YARN lib dir.
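>
> In your case the missing class, org.apache.hadoop.conf.Configuration, ships in
> hadoop-common, so the fix looks roughly like the commands below. This is only
> a sketch: the paths come from your log (/opt/hadoop, /opt/jobs/lib) and assume
> a standard Hadoop 2.x tarball layout, and the exact set of jars may vary (see
> the JIRA issue for details).
>
>   # copy the Hadoop client jars into the lib dir that run-job.sh puts on the
>   # classpath; both source and target paths are assumptions, adjust to your install
>   cp /opt/hadoop/share/hadoop/common/hadoop-common-*.jar /opt/jobs/lib/
>   cp /opt/hadoop/share/hadoop/hdfs/hadoop-hdfs-*.jar /opt/jobs/lib/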
>
> Cheers,
>
> Roger
>
> On Thu, Mar 26, 2015 at 9:21 AM, Jordi Blasi Uribarri <jbl...@nextel.es>
> wrote:
>
> > Hi:
> >
> > I got samza running a job in local mode with the property:
> > job.factory.class=org.apache.samza.job.local.ThreadJobFactory
> >
> > Now I am trying to get it running on multiple machines. I have
> > followed the steps in this guide:
> >
> >
> > https://github.com/apache/samza/blob/master/docs/learn/tutorials/versioned/run-in-multi-node-yarn.md
> >
> > I see the node up and running.
> >
> > I have created a tar.gz file with the contents of the bin and lib folders
> > from the deployment that was running the job locally, and published it on a
> > local Apache2 web server (the rough packaging steps are sketched after the
> > properties below). The properties file looks like this:
> >
> > task.class=samzafroga.job1
> > job.name=samzafroga.job1
> > job.factory.class=org.apache.samza.job.yarn.YarnJobFactory
> > yarn.package.path=http://192.168.15.92/jobs/samzajob1.tar.gz
> >
> > systems.kafka.samza.factory=org.apache.samza.system.kafka.KafkaSystemFactory
> > systems.kafka.consumer.zookeeper.connect=broker01:2181
> > systems.kafka.producer.bootstrap.servers=broker01:9092
> >
> > task.inputs=kafka.syslog
> >
> > serializers.registry.json.class=org.apache.samza.serializers.JsonSerdeFactory
> > serializers.registry.string.class=org.apache.samza.serializers.StringSerdeFactory
> > systems.kafka.streams.syslog.samza.msg.serde=string
> > systems.kafka.streams.samzaout.samza.msg.serde=string
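> >
> > For reference, the package above was built and published roughly like this;
> > the Apache2 docroot path is just an example:
> >
> >   # run from the directory that contains the bin/ and lib/ folders
> >   tar -czf samzajob1.tar.gz bin lib
> >   # hypothetical docroot; the tarball is then served at the URL set in
> >   # yarn.package.path above
> >   cp samzajob1.tar.gz /var/www/html/jobs/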
> >
> > When I run the same command that was working in local mode:
> >
> > bin/run-job.sh --config-factory=org.apache.samza.config.factories.PropertiesConfigFactory --config-path=file://$PWD/job1.properties
> >
> > I see the following exception:
> > java version "1.7.0_75"
> > OpenJDK Runtime Environment (IcedTea 2.5.4) (7u75-2.5.4-2)
> > OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)
> > /usr/lib/jvm/java-7-openjdk-amd64/bin/java
> > -Dlog4j.configuration=file:bin/log4j-console.xml
> > -Dsamza.log.dir=/opt/jobs -Djava.io.tmpdir=/opt/jobs/tmp -Xmx768M
> > -XX:+PrintGCDateStamps -Xloggc:/opt/jobs/gc.log
> > -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=10
> > -XX:GCLogFileSize=10241024 -d64
> > -cp /opt/hadoop/conf:/opt/jobs/lib/samzafroga-0.0.1-jar-with-dependencies.jar
> > org.apache.samza.job.JobRunner
> > --config-factory=org.apache.samza.config.factories.PropertiesConfigFactory
> > --config-path=file:///opt/jobs/job1.properties
> > log4j: reset attribute= "false".
> > log4j: Threshold ="null".
> > log4j: Level value for root is  [INFO].
> > log4j: root level set to INFO
> > log4j: Class name: [org.apache.log4j.ConsoleAppender]
> > log4j: Parsing layout of class: "org.apache.log4j.PatternLayout"
> > log4j: Setting property [conversionPattern] to [%d{dd MMM yyyy
> > HH:mm:ss} %5p %c{1} - %m%n].
> > log4j: Adding appender named [consoleAppender] to category [root].
> > log4j: Class name: [org.apache.log4j.RollingFileAppender]
> > log4j: Setting property [append] to [false].
> > log4j: Setting property [file] to [out/learning.log].
> > log4j: Parsing layout of class: "org.apache.log4j.PatternLayout"
> > log4j: Setting property [conversionPattern] to [%d{ABSOLUTE} %-5p
> > [%c{1}] %m%n].
> > log4j: setFile called: out/learning.log, false
> > log4j: setFile ended
> > log4j: Adding appender named [fileAppender] to category [root].
> > Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
> >         at java.lang.Class.forName0(Native Method)
> >         at java.lang.Class.forName(Class.java:191)
> >         at org.apache.samza.job.JobRunner.run(JobRunner.scala:56)
> >         at org.apache.samza.job.JobRunner$.main(JobRunner.scala:37)
> >         at org.apache.samza.job.JobRunner.main(JobRunner.scala)
> > Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> >         ... 5 more
> >
> > I guess there is a problem with the job package, but I am not sure how
> > to solve it.
> >
> > Thanks,
> >
> >                 Jordi
> > ________________________________
> > Jordi Blasi Uribarri
> > R&D&i Department
> >
> > jbl...@nextel.es
> > Bilbao Office
> >
>
