[ https://issues.apache.org/jira/browse/SPARK-16634?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-16634:
------------------------------------

    Assignee:     (was: Apache Spark)

> GenericArrayData can't be loaded in certain JVMs
> ------------------------------------------------
>
>                 Key: SPARK-16634
>                 URL: https://issues.apache.org/jira/browse/SPARK-16634
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.1.0
>            Reporter: Marcelo Vanzin
>            Priority: Minor
>
> There's an annoying bug in some JVMs that causes certain Scala-generated
> bytecode to not load. The current code in GenericArrayData.scala triggers
> that bug (at least with 1.7.0_67, maybe others).
> Since it's easy to work around the bug, I'd rather do that instead of asking
> people who might be running that version to have to upgrade.
> Error:
> {noformat}
> 16/07/19 16:02:35 INFO scheduler.TaskSetManager: Lost task 0.2 in stage 0.0
> (TID 2) on executor vanzin-st1-3.vpc.cloudera.com: java.lang.VerifyError (Bad
> <init> method call from inside of a branch
> Exception Details:
>   Location:
>     org/apache/spark/sql/catalyst/util/GenericArrayData.<init>(Ljava/lang/Object;)V @52: invokespecial
>   Reason:
>     Error exists in the bytecode
>   Bytecode:
>     0000000: 2a2b 4d2c c100 dc99 000e 2cc0 00dc 4e2d
>     0000010: 3a04 a700 20b2 0129 2c04 b601 2d99 001b
>     0000020: 2c3a 05b2 007a 1905 b600 7eb9 00fe 0100
>     0000030: 3a04 1904 b700 f3b1 bb01 2f59 2cb7 0131
>     0000040: bf
>   Stackmap Table:
>     full_frame(@21,{UninitializedThis,Object[#177],Object[#177]},{UninitializedThis})
>     full_frame(@50,{UninitializedThis,Object[#177],Object[#177],Top,Object[#220]},{UninitializedThis})
>     full_frame(@56,{UninitializedThis,Object[#177],Object[#177]},{UninitializedThis})
> )
> {noformat}
> I didn't run into this with 2.0; not sure whether the issue exists there.
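For context, the VerifyError above is characteristic of a Scala auxiliary constructor whose call to the primary constructor is the result of a `match` expression: the compiler emits the `invokespecial <init>` reachable from inside a branch, which some JVM verifiers (reportedly 1.7.0_67) reject. A minimal sketch of the pattern and the kind of workaround the reporter alludes to, hoisting the branching into a companion-object helper so the constructor call is unconditional (class and helper names here are hypothetical, not the actual Spark code):

```scala
// The problematic shape, roughly, is an auxiliary constructor like:
//
//   def this(seqOrArray: Any) = this(seqOrArray match {
//     case seq: Seq[Any]   => seq.toArray
//     case arr: Array[Any] => arr
//   })
//
// Workaround sketch: move the match out of <init> into a helper, so the
// bytecode contains a single, unbranched call to the primary constructor.

class GenericArrayDataSketch(val array: Array[Any]) {
  // Auxiliary constructor delegates through the helper; no branch
  // surrounds the primary-constructor call anymore.
  def this(seqOrArray: Any) = this(GenericArrayDataSketch.anyToArray(seqOrArray))
}

object GenericArrayDataSketch {
  // Hypothetical helper: all the branching happens here, in an
  // ordinary static method, where the verifier has no objection.
  def anyToArray(seqOrArray: Any): Array[Any] = seqOrArray match {
    case seq: Seq[Any]   => seq.toArray
    case arr: Array[Any] => arr
    case other           => Array(other)
  }
}
```

The observable behavior is unchanged; only the bytecode shape of `<init>` differs, which is why this is a workaround rather than a functional change.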
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org