[ https://issues.apache.org/jira/browse/SPARK-13815?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15198261#comment-15198261 ]
Apache Spark commented on SPARK-13815:
--------------------------------------

User 'joan38' has created a pull request for this issue:
https://github.com/apache/spark/pull/11772

> Provide better Exception messages in Pipeline load methods
> ----------------------------------------------------------
>
>                 Key: SPARK-13815
>                 URL: https://issues.apache.org/jira/browse/SPARK-13815
>             Project: Spark
>          Issue Type: Improvement
>          Components: MLlib
>    Affects Versions: 2.0.0
>         Environment: today's build of 2.0.0-SNAPSHOT
>            Reporter: Jacek Laskowski
>            Priority: Minor
>
> The following code, which loads a {{Pipeline}} from an empty {{metadata}} file,
> throws an exception (expected) whose message says nothing about the real cause.
> {code}
> $ ls -l hello-pipeline/metadata
> -rw-r--r--  1 jacek  staff  0 11 mar 09:00 hello-pipeline/metadata
>
> scala> Pipeline.read.load("hello-pipeline")
> ...
> java.lang.UnsupportedOperationException: empty collection
>   at org.apache.spark.rdd.RDD$$anonfun$first$1.apply(RDD.scala:1344)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
>   at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
>   at org.apache.spark.rdd.RDD.first(RDD.scala:1341)
>   at org.apache.spark.ml.util.DefaultParamsReader$.loadMetadata(ReadWrite.scala:285)
>   at org.apache.spark.ml.Pipeline$SharedReadWrite$.load(Pipeline.scala:253)
>   at org.apache.spark.ml.Pipeline$PipelineReader.load(Pipeline.scala:203)
>   at org.apache.spark.ml.Pipeline$PipelineReader.load(Pipeline.scala:197)
> {code}
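The improvement requested here is to fail with a descriptive message when the {{metadata}} file is empty or unreadable, instead of surfacing the generic "empty collection" error from {{RDD.first()}}. Below is a minimal, hypothetical sketch of that kind of up-front check; it is not the change made in PR 11772, and the names {{PipelineLoadSketch}} and {{loadMetadataOrFail}} are illustrative only, not part of Spark's API.

{code}
import org.apache.spark.sql.SparkSession

// Illustrative sketch only: check the metadata file before calling first(),
// so an empty file produces a clear error instead of "empty collection".
object PipelineLoadSketch {

  def loadMetadataOrFail(spark: SparkSession, path: String): String = {
    val metadataPath = s"$path/metadata"
    val lines = spark.sparkContext.textFile(metadataPath, 1)
    if (lines.isEmpty()) {
      throw new IllegalArgumentException(
        s"Cannot load pipeline from $path: metadata file at $metadataPath is empty. " +
          "Was the pipeline saved with Pipeline.write.save(...)?")
    }
    // The JSON metadata line that a real reader would go on to parse.
    lines.first()
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("SPARK-13815 sketch")
      .getOrCreate()
    try {
      val json = loadMetadataOrFail(spark, "hello-pipeline")
      println(s"Metadata: $json")
    } finally {
      spark.stop()
    }
  }
}
{code}

With a check like this, loading the empty {{hello-pipeline/metadata}} shown above would report "Cannot load pipeline from hello-pipeline: metadata file ... is empty" rather than the unhelpful stack trace in the description.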