[ https://issues.apache.org/jira/browse/SPARK-18085?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16158247#comment-16158247 ]

jincheng commented on SPARK-18085:
----------------------------------


{code:java}
com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "metadata" (class org.apache.spark.sql.execution.SparkPlanInfo), not marked as ignorable (4 known properties: "simpleString", "nodeName", "children", "metrics"])
 at [Source: {"Event":"org.apache.spark.sql.execution.ui.SparkListenerSQLExecutionStart","executionId":0,"description":"json at NativeMethodAccessorImpl.java:0","details":"org.apache.spark.sql.DataFrameWriter.json(DataFrameWriter.scala:487)\nsun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\nsun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)\nsun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(Method.java:498)\npy4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)\npy4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)\npy4j.Gateway.invoke(Gateway.java:280)\npy4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)\npy4j.commands.CallCommand.execute(CallCommand.java:79)\npy4j.GatewayConnection.run(GatewayConnection.java:214)\njava.lang.Thread.run(Thread.java:748)","physicalPlanDescription":"== Parsed Logical Plan ==\nRepartition 200, true\n+- LogicalRDD [uid#327L, gids#328]\n\n== Analyzed Logical Plan ==\nuid: bigint, gids: array<bigint>\nRepartition 200, true\n+- LogicalRDD [uid#327L, gids#328]\n\n== Optimized Logical Plan ==\nRepartition 200, true\n+- LogicalRDD [uid#327L, gids#328]\n\n== Physical Plan ==\nExchange RoundRobinPartitioning(200)\n+- Scan ExistingRDD[uid#327L,gids#328]","sparkPlanInfo":{"nodeName":"Exchange","simpleString":"Exchange RoundRobinPartitioning(200)","children":[{"nodeName":"ExistingRDD","simpleString":"Scan ExistingRDD[uid#327L,gids#328]","children":[],"metadata":{},"metrics":[{"name":"number of output rows","accumulatorId":140,"metricType":"sum"}]}],"metadata":{},"metrics":[{"name":"data size total (min, med, max)","accumulatorId":139,"metricType":"size"}]},"time":1504837052948}; line: 1, column: 1622] (through reference chain: org.apache.spark.sql.execution.ui.SparkListenerSQLExecutionStart["sparkPlanInfo"]->org.apache.spark.sql.execution.SparkPlanInfo["children"]->com.fasterxml.jackson.module.scala.deser.BuilderWrapper[0]->org.apache.spark.sql.execution.SparkPlanInfo["metadata"])
        at com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:51)
        at com.fasterxml.jackson.databind.DeserializationContext.reportUnknownProperty(DeserializationContext.java:839)
        at com.fasterxml.jackson.databind.deser.std.StdDeserializer.handleUnknownProperty(StdDeserializer.java:1045)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownProperty(BeanDeserializerBase.java:1352)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownProperties(BeanDeserializerBase.java:1306)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:399)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1099)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:296)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:133)
        at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:245)
        at com.fasterxml.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:217)
        at com.fasterxml.jackson.module.scala.deser.SeqDeserializer.deserialize(SeqDeserializerModule.scala:76)
        at com.fasterxml.jackson.module.scala.deser.SeqDeserializer.deserialize(SeqDeserializerModule.scala:59)
        at com.fasterxml.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:520)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeWithErrorWrapping(BeanDeserializer.java:463)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:378)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1099)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:296)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:133)
        at com.fasterxml.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:520)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeWithErrorWrapping(BeanDeserializer.java:463)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:378)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1099)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:296)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeOther(BeanDeserializer.java:166)
        at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:136)
        at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer._deserializeTypedForId(AsPropertyTypeDeserializer.java:122)
        at com.fasterxml.jackson.databind.jsontype.impl.AsPropertyTypeDeserializer.deserializeTypedFromObject(AsPropertyTypeDeserializer.java:93)
        at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeWithType(BeanDeserializerBase.java:992)
        at com.fasterxml.jackson.databind.deser.impl.TypeWrappedDeserializer.deserialize(TypeWrappedDeserializer.java:42)
        at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3736)
        at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2726)
        at org.apache.spark.util.JsonProtocol$.sparkEventFromJson(JsonProtocol.scala:541)
        at org.apache.spark.scheduler.ReplayListenerBus.replay(ReplayListenerBus.scala:85)
        at org.apache.spark.scheduler.ReplayListenerBus.replay(ReplayListenerBus.scala:58)
        at org.apache.spark.deploy.history.FsHistoryProvider.replay(FsHistoryProvider.scala:706)
        at org.apache.spark.deploy.history.FsHistoryProvider.rebuildAppStore(FsHistoryProvider.scala:729)
        at org.apache.spark.deploy.history.FsHistoryProvider.createInMemoryStore(FsHistoryProvider.scala:873)
        at org.apache.spark.deploy.history.FsHistoryProvider.getAppUI(FsHistoryProvider.scala:318)
        at org.apache.spark.deploy.history.HistoryServer.getAppUI(HistoryServer.scala:170)
        at org.apache.spark.deploy.history.ApplicationCache$$anonfun$1.apply(ApplicationCache.scala:192)
        at org.apache.spark.deploy.history.ApplicationCache$$anonfun$1.apply(ApplicationCache.scala:190)
        at org.apache.spark.deploy.history.ApplicationCache.time(ApplicationCache.scala:164)
        at org.apache.spark.deploy.history.ApplicationCache.org$apache$spark$deploy$history$ApplicationCache$$loadApplicationEntry(ApplicationCache.scala:190)
        at org.apache.spark.deploy.history.ApplicationCache$$anon$1.load(ApplicationCache.scala:61)
        at org.apache.spark.deploy.history.ApplicationCache$$anon$1.load(ApplicationCache.scala:57)
        at org.spark_project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
        at org.spark_project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
        at org.spark_project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
        at org.spark_project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
        at org.spark_project.guava.cache.LocalCache.get(LocalCache.java:4000)
        at org.spark_project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
        at org.spark_project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
        at org.apache.spark.deploy.history.ApplicationCache.get(ApplicationCache.scala:102)
        at org.apache.spark.deploy.history.ApplicationCache.withSparkUI(ApplicationCache.scala:110)
        at org.apache.spark.deploy.history.HistoryServer.org$apache$spark$deploy$history$HistoryServer$$loadAppUi(HistoryServer.scala:227)
        at org.apache.spark.deploy.history.HistoryServer$$anon$1.doGet(HistoryServer.scala:86)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
        at org.spark_project.jetty.servlet.ServletHolder.handle(ServletHolder.java:848)
        at org.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:584)
        at org.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
        at org.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
        at org.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
        at org.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
        at org.spark_project.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:493)
        at org.spark_project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
        at org.spark_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
        at org.spark_project.jetty.server.Server.handle(Server.java:534)
        at org.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:320)
        at org.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
        at org.spark_project.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)
        at org.spark_project.jetty.io.FillInterest.fillable(FillInterest.java:108)
        at org.spark_project.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
        at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
        at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
        at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
        at org.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
        at org.spark_project.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
        at java.lang.Thread.run(Thread.java:748)
{code}

We hit this error as well; I think it was fixed in shs-ng-m9.
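
For anyone who hits this before picking up that fix: the failure is Jackson's default strict binding rejecting the extra "metadata" field that newer writers emit in SparkPlanInfo events, while the replaying build's SparkPlanInfo class only declares the 4 properties listed in the exception. Below is a minimal sketch of the usual Jackson remedy, disabling FAIL_ON_UNKNOWN_PROPERTIES; this is illustrative only, not the actual Spark patch, and the PlanNode case class is a hypothetical stand-in for SparkPlanInfo:

{code:scala}
import com.fasterxml.jackson.databind.{DeserializationFeature, ObjectMapper}
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Hypothetical model that, like the older SparkPlanInfo, lacks "metadata".
case class PlanNode(nodeName: String, simpleString: String)

val mapper = new ObjectMapper()
  .registerModule(DefaultScalaModule)
  // Skip unknown fields instead of throwing UnrecognizedPropertyException.
  .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)

// The extra "metadata" entry is now ignored rather than failing the replay.
val json = """{"nodeName":"Exchange","simpleString":"Exchange","metadata":{}}"""
val node = mapper.readValue(json, classOf[PlanNode])
{code}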


> SPIP: Better History Server scalability for many / large applications
> ---------------------------------------------------------------------
>
>                 Key: SPARK-18085
>                 URL: https://issues.apache.org/jira/browse/SPARK-18085
>             Project: Spark
>          Issue Type: Umbrella
>          Components: Spark Core, Web UI
>    Affects Versions: 2.0.0
>            Reporter: Marcelo Vanzin
>              Labels: SPIP
>         Attachments: screenshot-1.png, screenshot-2.png, spark_hs_next_gen.pdf
>
>
> The History Server has known, long-standing issues when serving many 
> applications and when serving large applications.
> I'm filing this umbrella to track work on addressing those issues. 
> I'll attach a document shortly that describes the issues and suggests a 
> path toward solving them.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
