[ https://issues.apache.org/jira/browse/SPARK-10067?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Daniel Pinyol closed SPARK-10067.
---------------------------------
    Resolution: Not A Problem

Fixed after upgrading to JDK 1.8.0_60. Probably due to http://bugs.java.com/view_bug.do?bug_id=8077102

> Long delay (16 seconds) when running local session on offline machine
> ---------------------------------------------------------------------
>
>                 Key: SPARK-10067
>                 URL: https://issues.apache.org/jira/browse/SPARK-10067
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.1
>        Environment: Mac 10.10.5, java 1.8.0_51 from IntelliJ 14.1
>            Reporter: Daniel Pinyol
>            Priority: Minor
>
> If I run this
> {code:java}
> SparkContext sc = new SparkContext("local", "test");
> {code}
> on a machine with no network, it hangs for 15 or 16 seconds at this
> point, and then successfully resumes. The problem seems to occur when
> checking the Kerberos realm (see call stack below).
> Is there any way to avoid this annoying delay? I reviewed
> https://spark.apache.org/docs/latest/configuration.html, but couldn't find
> any solution.
> thanks
> {noformat}
> "main@1" prio=5 tid=0x1 nid=NA runnable
>   java.lang.Thread.State: RUNNABLE
>       at java.net.PlainDatagramSocketImpl.peekData(PlainDatagramSocketImpl.java:-1)
>       - locked <0x758> (a java.net.PlainDatagramSocketImpl)
>       at java.net.DatagramSocket.receive(DatagramSocket.java:787)
>       - locked <0x732> (a java.net.DatagramSocket)
>       - locked <0x759> (a java.net.DatagramPacket)
>       at com.sun.jndi.dns.DnsClient.doUdpQuery(DnsClient.java:413)
>       at com.sun.jndi.dns.DnsClient.query(DnsClient.java:207)
>       at com.sun.jndi.dns.Resolver.query(Resolver.java:81)
>       at com.sun.jndi.dns.DnsContext.c_getAttributes(DnsContext.java:434)
>       at com.sun.jndi.toolkit.ctx.ComponentDirContext.p_getAttributes(ComponentDirContext.java:235)
>       at com.sun.jndi.toolkit.ctx.PartialCompositeDirContext.getAttributes(PartialCompositeDirContext.java:141)
>       at com.sun.jndi.toolkit.url.GenericURLDirContext.getAttributes(GenericURLDirContext.java:103)
>       at sun.security.krb5.KrbServiceLocator.getKerberosService(KrbServiceLocator.java:85)
>       at sun.security.krb5.Config.checkRealm(Config.java:1120)
>       at sun.security.krb5.Config.getRealmFromDNS(Config.java:1093)
>       at sun.security.krb5.Config.getDefaultRealm(Config.java:987)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(NativeMethodAccessorImpl.java:-1)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:483)
>       at org.apache.hadoop.security.authentication.util.KerberosUtil.getDefaultRealm(KerberosUtil.java:75)
>       at org.apache.hadoop.security.authentication.util.KerberosName.<clinit>(KerberosName.java:85)
>       at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:225)
>       - locked <0x57d> (a java.lang.Class)
>       at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:214)
>       at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:669)
>       at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:571)
>       at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2162)
>       at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2162)
>       at scala.Option.getOrElse(Option.scala:121)
>       at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2162)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:301)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:155)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:170)
>       at DataFrameSandbox.<init>(DataFrameSandbox.java:31)
>       at DataFrameSandbox.main(DataFrameSandbox.java:45)
> {noformat}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
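The trace shows the hang inside sun.security.krb5.Config.getRealmFromDNS, which issues DNS SRV queries when no default Kerberos realm is configured. Besides the JDK upgrade that resolved this issue, a commonly suggested workaround (a sketch under that assumption, not a fix confirmed by the reporter; the class name here is hypothetical) is to pre-set the standard JDK system properties java.security.krb5.realm and java.security.krb5.kdc so the realm lookup never reaches DNS:

```java
// Hypothetical launcher sketch, not part of the original report.
// Pre-setting an explicit (empty) realm and KDC makes the JDK skip the
// DNS SRV lookup that sun.security.krb5.Config.getRealmFromDNS performs
// when no krb5.conf is found, avoiding the ~16 second offline timeout.
public class OfflineSparkLauncher {
    public static void main(String[] args) {
        // These must be set before the first SparkContext is constructed,
        // because Hadoop's KerberosName caches the default realm in a
        // static initializer (see <clinit> in the trace above).
        System.setProperty("java.security.krb5.realm", "");
        System.setProperty("java.security.krb5.kdc", "");

        // SparkContext sc = new SparkContext("local", "test");
        // would now start without triggering the DNS realm query.
    }
}
```

Both properties must be set together; the JDK rejects a configuration where only one of the pair is present.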