Hi Sean,

We are using CM, and Hue, Hive, etc. all work, but for some reason I can't
get the classpath correct for this job, which I submit using:

java -cp \
  :$HADOOP_HOME/hdfs/hadoop-hdfs-2.5.0-cdh5.2.0.jar:./:target/classes:target/cube-0.17-SNAPSHOT-jar-with-dependencies.jar \
  org.gbif.metrics.cube.occurrence.backfill.Backfill
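
For reference, one explanation I've seen for this error is that when a
jar-with-dependencies is assembled, the META-INF/services/org.apache.hadoop.fs.FileSystem
service files from hadoop-common and hadoop-hdfs overwrite each other, so the
"file" scheme loses its registration. A possible fix (a sketch, assuming the fat
jar is, or could be, built with the maven-shade-plugin rather than the assembly
plugin) is to concatenate the service entries at build time:

```xml
<!-- Sketch, not our actual pom: the ServicesResourceTransformer merges
     META-INF/services/* files from all dependencies instead of letting
     the last jar on the classpath win, which keeps both the
     LocalFileSystem ("file") and DistributedFileSystem ("hdfs")
     registrations in the shaded jar. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```

I haven't confirmed this is what is happening in our build, but it would match the
stack trace below, since TableMapReduceUtil.addDependencyJars resolves jars through
the FileSystem service loader.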

Thanks,
Tim


On Wed, Nov 5, 2014 at 4:30 PM, Sean Busbey <bus...@cloudera.com> wrote:

> How are you submitting the job?
>
> How are your cluster configuration files deployed (i.e. are you using CM)?
>
> On Wed, Nov 5, 2014 at 8:50 AM, Tim Robertson <timrobertson...@gmail.com>
> wrote:
>
> > Hi all,
> >
> > I'm seeing the following:
> >
> >   java.io.IOException: No FileSystem for scheme: file
> >     at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2584)
> >     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
> >     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
> >     ...
> >     at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addDependencyJars(TableMapReduceUtil.java:778)
> >     at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addHBaseDependencyJars(TableMapReduceUtil.java:707)
> >     at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addDependencyJars(TableMapReduceUtil.java:752)
> >     at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.initTableMapperJob(TableMapReduceUtil.java:192)
> >     ...
> >
> > I have the Hadoop HDFS jar on the classpath, and I am submitting using the
> > MapReduce2 (i.e. YARN) jars. I build a jar with dependencies (and also
> > provide the HDFS jar on the classpath directly); the dependency tree is:
> >   https://gist.github.com/timrobertson100/027f97d038df53cc836f
> >
> > This all worked before, but I'm migrating to YARN and HBase 0.98 (CDH
> > 5.2.0) from 0.94 and MR1, so I have obviously messed up the classpath
> > somehow.
> >
> > Has anyone come across this please?
> >
> > Thanks,
> > Tim
> >
>
>
>
> --
> Sean
>