If you are binding to 127.0.0.1, also check /etc/hosts: uncomment or add a
127.0.1.1 (or 127.0.0.1) entry that maps to your machine's hostname, so the
hostname resolves locally.
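A quick sketch of how to confirm the problem and the two usual fixes. The hostname dhcp-10-35-14-100 is taken from the log below; substitute your own, and note that the exact /etc/hosts line shown is an example, not a requirement:

```shell
# Check whether the local hostname resolves to an address.
hostname
getent hosts "$(hostname)" || echo "local hostname does not resolve"

# Fix 1: map the hostname to a loopback address in /etc/hosts (as root).
# Replace dhcp-10-35-14-100 with the output of `hostname` above.
#   echo "127.0.1.1   dhcp-10-35-14-100" >> /etc/hosts

# Fix 2: bypass the hostname lookup by telling Spark which IP to bind to.
#   export SPARK_LOCAL_IP=127.0.0.1
#   ./spark-shell
```

With either change in place, spark-shell should start without the UnknownHostException and define sc normally.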

On Sat, Mar 21, 2015 at 9:57 AM, Ted Yu <yuzhih...@gmail.com> wrote:

> bq. Caused by: java.net.UnknownHostException: dhcp-10-35-14-100: Name or
> service not known
>
> Can you check your DNS?
>
> Cheers
>
> On Fri, Mar 20, 2015 at 8:54 PM, tangzilu <zilu.t...@hotmail.com> wrote:
>
>> Hi All:
>> I recently started to deploy Spark 1.2 in my VirtualBox Linux VM.
>> But when I run the command "./spark-shell" in the path of
>> "/opt/spark-1.2.1/bin", I got the result like this:
>>
>> [root@dhcp-10-35-14-100 bin]# ./spark-shell
>> Using Spark's default log4j profile:
>> org/apache/spark/log4j-defaults.properties
>> 15/03/20 13:56:06 INFO SecurityManager: Changing view acls to: root
>> 15/03/20 13:56:06 INFO SecurityManager: Changing modify acls to: root
>> 15/03/20 13:56:06 INFO SecurityManager: SecurityManager: authentication
>> disabled; ui acls disabled; users with view permissions: Set(root); users
>> with modify permissions: Set(root)
>> 15/03/20 13:56:06 INFO HttpServer: Starting HTTP Server
>> 15/03/20 13:56:06 INFO Utils: Successfully started service 'HTTP class
>> server' on port 47691.
>> Welcome to
>>       ____              __
>>      / __/__  ___ _____/ /__
>>     _\ \/ _ \/ _ `/ __/  '_/
>>    /___/ .__/\_,_/_/ /_/\_\   version 1.2.1
>>       /_/
>>
>> Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_75)
>> Type in expressions to have them evaluated.
>> Type :help for more information.
>> java.net.UnknownHostException: dhcp-10-35-14-100: dhcp-10-35-14-100: Name
>> or service not known
>>         at java.net.InetAddress.getLocalHost(InetAddress.java:1473)
>>         at
>> org.apache.spark.util.Utils$.findLocalIpAddress(Utils.scala:710)
>>         at
>> org.apache.spark.util.Utils$.localIpAddress$lzycompute(Utils.scala:702)
>>         at org.apache.spark.util.Utils$.localIpAddress(Utils.scala:702)
>>         at org.apache.spark.HttpServer.uri(HttpServer.scala:158)
>>         at
>> org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:982)
>>         at $iwC$$iwC.<init>(<console>:9)
>>         at $iwC.<init>(<console>:18)
>>         at <init>(<console>:20)
>>         at .<init>(<console>:24)
>>         at .<clinit>(<console>)
>>         at .<init>(<console>:7)
>>         at .<clinit>(<console>)
>>         at $print(<console>)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at
>> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
>>         at
>> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
>>         at
>> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
>>         at
>> org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
>>         at
>> org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
>>         at
>> org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
>>         at
>> org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
>>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
>>         at
>> org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)
>>         at
>> org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
>>         at
>> org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:270)
>>         at
>> org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
>>         at
>> org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:60)
>>         at
>> org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:945)
>>         at
>> org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:147)
>>         at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:60)
>>         at
>> org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
>>         at
>> org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:60)
>>         at
>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:962)
>>         at
>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>         at
>> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
>>         at
>> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
>>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
>>         at org.apache.spark.repl.Main$.main(Main.scala:31)
>>         at org.apache.spark.repl.Main.main(Main.scala)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>         at
>> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>> Caused by: java.net.UnknownHostException: dhcp-10-35-14-100: Name or
>> service not known
>>         at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
>>         at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:901)
>>         at
>> java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1293)
>>         at java.net.InetAddress.getLocalHost(InetAddress.java:1469)
>>         ... 50 more
>> I don't know what the problem is, since I just started it as shown in the
>> docs. Also, "sc" could not be found:
>>
>> scala> sc
>> <console>:11: error: not found: value sc
>>               sc
>>               ^
>> Is there anything I could do to fix it?
>>
>> Thanks and regards!
>> Zilu.Tang
>>
>
>
