Yeah... that was exactly my mistake: the spark.eventLog.dir setting in
spark/conf/spark-defaults.conf used the two-slash form.
It works now.

Thank you
Karthik
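
For anyone hitting the same error, the fix amounts to one line in conf/spark-defaults.conf. A minimal sketch of the before/after (the directory path below is illustrative, not the actual path from this thread):

```properties
# Wrong: with two slashes, "home" is parsed as the NameNode hostname
# spark.eventLog.dir   hdfs://home/karthik/spark-events

# Right: three slashes leave the host empty, so /home/... is a plain HDFS path
spark.eventLog.dir     hdfs:///home/karthik/spark-events
```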


On Mon, Jan 19, 2015 at 3:29 PM, Sean Owen <so...@cloudera.com> wrote:

> Sorry, to be clear, you need to write "hdfs:///home/...". Note three
> slashes; there is an empty host between the 2nd and 3rd. This is true
> of most URI schemes with a host.
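
To make the two-slash vs. three-slash difference concrete, here is a small standalone Java check of how `java.net.URI` parses each form (the class name and example paths are mine, not from the thread):

```java
import java.net.URI;

public class UriHostDemo {
    public static void main(String[] args) {
        // Two slashes: "home" is taken as the authority (hostname)
        URI twoSlash = URI.create("hdfs://home/karthik/eventlogs");
        System.out.println(twoSlash.getHost()); // home
        System.out.println(twoSlash.getPath()); // /karthik/eventlogs

        // Three slashes: empty authority, so the whole thing is a path
        URI threeSlash = URI.create("hdfs:///home/karthik/eventlogs");
        System.out.println(threeSlash.getHost()); // null
        System.out.println(threeSlash.getPath()); // /home/karthik/eventlogs
    }
}
```

This is exactly why Hadoop tries (and fails) to resolve a host named "home" when the URI is written with only two slashes.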
>
> On Mon, Jan 19, 2015 at 9:56 AM, Rapelly Kartheek
> <kartheek.m...@gmail.com> wrote:
> > Yes, yes: the hadoop/etc/hadoop/hdfs-site.xml file has the path written as
> > "hdfs://home/..."
> >
> > On Mon, Jan 19, 2015 at 3:21 PM, Sean Owen <so...@cloudera.com> wrote:
> >>
> >> I bet somewhere you have a path like "hdfs://home/..." which would
> >> suggest that 'home' is a hostname, when I imagine you mean it as a
> >> root directory.
> >>
> >> On Mon, Jan 19, 2015 at 9:33 AM, Rapelly Kartheek
> >> <kartheek.m...@gmail.com> wrote:
> >> > Hi,
> >> >
> >> > I get the following exception when I run my application:
> >> >
> >> > karthik@karthik:~/spark-1.2.0$ ./bin/spark-submit --class
> >> > org.apache.spark.examples.SimpleApp001 --deploy-mode client --master
> >> > spark://karthik:7077 $SPARK_HOME/examples/*/scala-*/spark-examples-*.jar
> >> > >out1.txt
> >> > log4j:WARN No such property [target] in org.apache.log4j.FileAppender.
> >> > Exception in thread "main" java.lang.IllegalArgumentException:
> >> > java.net.UnknownHostException: home
> >> >     at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:377)
> >> >     at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:237)
> >> >     at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:141)
> >> >     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:569)
> >> >     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:512)
> >> >     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:142)
> >> >     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2316)
> >> >     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:366)
> >> >     at org.apache.spark.util.FileLogger.<init>(FileLogger.scala:90)
> >> >     at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:63)
> >> >     at org.apache.spark.SparkContext.<init>(SparkContext.scala:352)
> >> >     at org.apache.spark.examples.SimpleApp001$.main(SimpleApp001.scala:13)
> >> >     at org.apache.spark.examples.SimpleApp001.main(SimpleApp001.scala)
> >> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> >     at java.lang.reflect.Method.invoke(Method.java:606)
> >> >     at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
> >> >     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
> >> >     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> >> > Caused by: java.net.UnknownHostException: home
> >> >     ... 20 more
> >> >
> >> >
> >> > I couldn't trace the cause of this exception. Any help would be
> >> > appreciated.
> >> >
> >> > Thanks
> >
> >
>
