[ https://issues.apache.org/jira/browse/SPARK-20862?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-20862:
------------------------------------

    Assignee: (was: Apache Spark)

> LogisticRegressionModel throws TypeError
> ----------------------------------------
>
>                 Key: SPARK-20862
>                 URL: https://issues.apache.org/jira/browse/SPARK-20862
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib, PySpark
>    Affects Versions: 2.1.1
>            Reporter: Bago Amirbekian
>            Priority: Minor
>
> LogisticRegressionModel throws a TypeError under Python 3 with numpy 1.12.1:
> **********************************************************************
> File "/Users/bago/repos/spark/python/pyspark/mllib/classification.py", line 155, in __main__.LogisticRegressionModel
> Failed example:
>     mcm = LogisticRegressionWithLBFGS.train(data, iterations=10, numClasses=3)
> Exception raised:
>     Traceback (most recent call last):
>       File "/usr/local/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/doctest.py", line 1330, in __run
>         compileflags, 1), test.globs)
>       File "<doctest __main__.LogisticRegressionModel[23]>", line 1, in <module>
>         mcm = LogisticRegressionWithLBFGS.train(data, iterations=10, numClasses=3)
>       File "/Users/bago/repos/spark/python/pyspark/mllib/classification.py", line 398, in train
>         return _regression_train_wrapper(train, LogisticRegressionModel, data, initialWeights)
>       File "/Users/bago/repos/spark/python/pyspark/mllib/regression.py", line 216, in _regression_train_wrapper
>         return modelClass(weights, intercept, numFeatures, numClasses)
>       File "/Users/bago/repos/spark/python/pyspark/mllib/classification.py", line 176, in __init__
>         self._dataWithBiasSize)
>     TypeError: 'float' object cannot be interpreted as an integer

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
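The traceback ends with numpy rejecting a `float` where an integer dimension is expected. A plausible minimal reproduction, assuming `_dataWithBiasSize` is computed with the `/` operator (which under Python 3 is true division and yields a float even for evenly divisible integers) and then passed to `numpy.reshape`; the variable names below mirror the traceback but the snippet itself is illustrative, not the actual PySpark source:

```python
import numpy as np

coeff = np.zeros(8)  # hypothetical flattened multinomial coefficient vector
num_classes = 3

# Python 3 true division produces a float (4.0), and numpy >= 1.12
# raises TypeError when a reshape dimension is not an integer:
data_with_bias_size = coeff.size / (num_classes - 1)
try:
    coeff.reshape(num_classes - 1, data_with_bias_size)
except TypeError as err:
    print(err)  # raises TypeError on numpy 1.12+

# Floor division (or an explicit int() cast) keeps the dimension integral:
data_with_bias_size = coeff.size // (num_classes - 1)
weights_matrix = coeff.reshape(num_classes - 1, data_with_bias_size)
print(weights_matrix.shape)
```

Under this reading, the same doctest passes on Python 2 (where `/` on two ints is integer division) and on older numpy versions that still coerced float dimensions, which would explain why the failure surfaces specifically with Python 3 and numpy 1.12.1.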