[ https://issues.apache.org/jira/browse/SPARK-6045?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen updated SPARK-6045:
-----------------------------
    Priority: Trivial  (was: Major)

Yeah, this replaces an NPE-hiding-an-NPE with a clearer IllegalArgumentException, which is good, but that's all it does. This still dies with an exception, which is, I suppose, the right-est thing Spark can do since the impl isn't working.

> RecordWriter should be checked against null in PairRDDFunctions#saveAsNewAPIHadoopDataset
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-6045
>                 URL: https://issues.apache.org/jira/browse/SPARK-6045
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Ted Yu
>            Priority: Trivial
>
> gtinside reported in the thread 'NullPointerException in TaskSetManager' with the following stack trace:
> {code}
> WARN 2015-02-26 14:21:43,217 [task-result-getter-0] TaskSetManager - Lost task 14.2 in stage 0.0 (TID 29, devntom003.dev.blackrock.com): java.lang.NullPointerException
>         org.apache.spark.rdd.PairRDDFunctions.saveAsNewAPIHadoopDataset(PairRDDFunctions.scala:1007)
>         com.bfm.spark.test.CassandraHadoopMigrator$.main(CassandraHadoopMigrator.scala:77)
>         com.bfm.spark.test.CassandraHadoopMigrator.main(CassandraHadoopMigrator.scala)
>         sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         java.lang.reflect.Method.invoke(Method.java:606)
>         org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>         org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>         org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {code}
> Looks like the following call in the finally block was the cause:
> {code}
> writer.close(hadoopContext)
> {code}
> We should check writer against null before calling close().

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
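For illustration only, here is a minimal, self-contained Scala sketch of the pattern discussed above. The object, trait, and method names (NullSafeCloseSketch, Writer, openWriter, save) are hypothetical stand-ins for format.getRecordWriter(hadoopContext) and the write loop in PairRDDFunctions#saveAsNewAPIHadoopDataset; this is not the actual Spark code or the actual SPARK-6045 patch.

{code}
object NullSafeCloseSketch {
  trait Writer {
    def write(key: String, value: String): Unit
    def close(): Unit
  }

  // Stand-in for format.getRecordWriter(hadoopContext); returns null to
  // simulate the misbehaving OutputFormat implementation seen in the report.
  def openWriter(): Writer = null

  def save(records: Iterator[(String, String)]): Unit = {
    val writer = openWriter()
    // Fail fast with a clear IllegalArgumentException instead of letting the
    // finally block below throw an NPE that masks the real problem.
    require(writer != null, "Unable to obtain RecordWriter")
    try {
      records.foreach { case (k, v) => writer.write(k, v) }
    } finally {
      // Defensive check suggested in the issue description: only close the
      // writer if it was actually obtained.
      if (writer != null) writer.close()
    }
  }

  def main(args: Array[String]): Unit = {
    try {
      save(Iterator("a" -> "1", "b" -> "2"))
    } catch {
      case e: IllegalArgumentException =>
        // With the guard in place we see the clear message rather than an NPE.
        println(s"Failed as expected: ${e.getMessage}")
    }
  }
}
{code}

The require call corresponds to the clearer IllegalArgumentException Sean Owen mentions, while the if (writer != null) guard in the finally block is the null check the issue description asks for; either way the job still fails, but with a message that points at the broken RecordWriter rather than a bare NullPointerException.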