[ https://issues.apache.org/jira/browse/SPARK-21285?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-21285:
------------------------------------

    Assignee: Apache Spark

> VectorAssembler should report the column name when the data type used is not supported
> ---------------------------------------------------------------------------------------
>
>                 Key: SPARK-21285
>                 URL: https://issues.apache.org/jira/browse/SPARK-21285
>             Project: Spark
>          Issue Type: Improvement
>          Components: ML, MLlib
>    Affects Versions: 2.1.1
>            Reporter: Jacek Laskowski
>            Assignee: Apache Spark
>            Priority: Minor
>
> Found while answering [Why does LogisticRegression fail with "IllegalArgumentException: org.apache.spark.ml.linalg.VectorUDT@3bfc3ba7"?|https://stackoverflow.com/q/44844793/1305344] on StackOverflow.
>
> When {{VectorAssembler}} is configured to use columns of an unsupported type, only the type is printed out, without the column name(s). The column name(s) should be included too.
>
> {code}
> // label is of StringType type
> val va = new VectorAssembler().setInputCols(Array("bc", "pmi", "label"))
>
> scala> va.transform(training)
> java.lang.IllegalArgumentException: Data type StringType is not supported.
>   at org.apache.spark.ml.feature.VectorAssembler$$anonfun$transformSchema$1.apply(VectorAssembler.scala:121)
>   at org.apache.spark.ml.feature.VectorAssembler$$anonfun$transformSchema$1.apply(VectorAssembler.scala:117)
>   at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>   at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
>   at org.apache.spark.ml.feature.VectorAssembler.transformSchema(VectorAssembler.scala:117)
>   at org.apache.spark.ml.PipelineStage.transformSchema(Pipeline.scala:74)
>   at org.apache.spark.ml.feature.VectorAssembler.transform(VectorAssembler.scala:54)
>   ... 48 elided
> {code}
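A minimal sketch of how the schema check could surface the offending column name rather than only the data type. This is not the actual Spark source; the helper name {{validateInputColumns}} and its signature are hypothetical, and only the improved error message mirrors what the issue asks for.

{code}
import org.apache.spark.ml.linalg.SQLDataTypes
import org.apache.spark.sql.types._

// Sketch: check each configured input column and name the offending
// column in the exception instead of reporting the data type alone.
def validateInputColumns(schema: StructType, inputColNames: Seq[String]): Unit = {
  inputColNames.foreach { name =>
    schema(name).dataType match {
      case _: NumericType | BooleanType        => // supported
      case dt if dt == SQLDataTypes.VectorType => // supported
      case other =>
        throw new IllegalArgumentException(
          s"Data type $other of column $name is not supported.")
    }
  }
}
{code}

With a check like this, the example above would fail with "Data type StringType of column label is not supported.", which points directly at the column to fix (e.g. by indexing or casting it) instead of leaving the user to guess which input column has the wrong type.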