Actually, I just ran into the same problem. Could you share some code around the error? Then I can figure out whether I can help. Also, is "s001.bigdata" the name of your name node?
> On 2 Aug 2016, at 17:22, pseudo oduesp <pseudo20...@gmail.com> wrote:
> 
> Can someone help me please?
> 
> 2016-08-01 11:51 GMT+02:00 pseudo oduesp <pseudo20...@gmail.com>:
> Hi,
> I get the following errors when I try to use PySpark 2.0 with IPython on 
> YARN.
> Can someone help me please?
> java.lang.IllegalArgumentException: java.net.UnknownHostException: s001.bigdata.;s003.bigdata;s008bigdata.
>         at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:374)
>         at org.apache.hadoop.crypto.key.kms.KMSClientProvider.getDelegationTokenService(KMSClientProvider.java:823)
>         at org.apache.hadoop.crypto.key.kms.KMSClientProvider.addDelegationTokens(KMSClientProvider.java:779)
>         at org.apache.hadoop.crypto.key.KeyProviderDelegationTokenExtension.addDelegationTokens(KeyProviderDelegationTokenExtension.java:86)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.addDelegationTokens(DistributedFileSystem.java:2046)
>         at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$$anonfun$obtainTokensForNamenodes$1.apply(YarnSparkHadoopUtil.scala:133)
>         at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$$anonfun$obtainTokensForNamenodes$1.apply(YarnSparkHadoopUtil.scala:130)
>         at scala.collection.immutable.Set$Set1.foreach(Set.scala:94)
>         at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil.obtainTokensForNamenodes(YarnSparkHadoopUtil.scala:130)
>         at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:367)
>         at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:834)
>         at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:167)
>         at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
>         at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
>         at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>         at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:240)
>         at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>         at py4j.Gateway.invoke(Gateway.java:236)
>         at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
>         at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
>         at py4j.GatewayConnection.run(GatewayConnection.java:211)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: java.net.UnknownHostException: s001.bigdata.;s003.bigdata;s008bigdata.
> 
> 
> Thanks
> 
