[jira] [Commented] (SPARK-25397) SparkSession.conf fails when given default value with Python 3
[ https://issues.apache.org/jira/browse/SPARK-25397?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16612994#comment-16612994 ]

Hyukjin Kwon commented on SPARK-25397:
--

[~josephkb], do you want to backport this bit or just resolve this? Either way sounds okay to me.

> SparkSession.conf fails when given default value with Python 3
> --
>
>                 Key: SPARK-25397
>                 URL: https://issues.apache.org/jira/browse/SPARK-25397
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.1
>            Reporter: Joseph K. Bradley
>            Priority: Minor
>
> Spark 2.3.1 has a Python 3 incompatibility when requesting a Conf value from
> SparkSession when you give non-string default values. Reproduce via a
> SparkSession call:
> {{spark.conf.get("myConf", False)}}
> This gives the error:
> {code}
> >>> spark.conf.get("myConf", False)
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "/Users/josephkb/work/spark-bin/spark-2.3.1-bin-hadoop2.7/python/pyspark/sql/conf.py", line 51, in get
>     self._checkType(default, "default")
>   File "/Users/josephkb/work/spark-bin/spark-2.3.1-bin-hadoop2.7/python/pyspark/sql/conf.py", line 62, in _checkType
>     if not isinstance(obj, str) and not isinstance(obj, unicode):
> *NameError: name 'unicode' is not defined*
> {code}
> The offending line in Spark in branch-2.3 is in:
> https://github.com/apache/spark/blob/branch-2.3/python/pyspark/sql/conf.py
> which uses the name {{unicode}}, which is not available in Python 3.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-25397) SparkSession.conf fails when given default value with Python 3
[ https://issues.apache.org/jira/browse/SPARK-25397?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16609928#comment-16609928 ]

Hyukjin Kwon commented on SPARK-25397:
--

That's fixed in https://github.com/apache/spark/commit/71f38ac242157cbede684546159f2a27892ee09f [~josephkb]:

{code}
>>> spark.conf.get("myConf", False)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/.../spark/python/pyspark/sql/conf.py", line 54, in get
    self._checkType(default, "default")
  File "/.../spark/python/pyspark/sql/conf.py", line 67, in _checkType
    (identifier, obj, type(obj).__name__))
TypeError: expected default 'False' to be a string (was 'bool')
{code}

Just for clarification: this looks like an error-message fix rather than a fix for the underlying bug, though.
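Since {{RuntimeConfig.get}} accepts only string defaults even after the commit above, a common workaround is to pass a string default and convert in user code. A sketch, where the {{FakeConf}} class is a hypothetical stand-in for {{spark.conf}} so the snippet runs without a SparkSession:

```python
class FakeConf:
    """Dict-backed stand-in for spark.conf, for illustration only."""
    def __init__(self, data):
        self._data = data

    def get(self, key, default):
        return self._data.get(key, default)

conf = FakeConf({})                      # no "myConf" set
value = conf.get("myConf", "false")      # string default avoids the type check
flag = value.lower() == "true"           # convert to bool in user code
```

Here {{flag}} ends up {{False}} when the key is unset, and {{True}} only when the stored value is the string "true".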
[jira] [Commented] (SPARK-25397) SparkSession.conf fails when given default value with Python 3
[ https://issues.apache.org/jira/browse/SPARK-25397?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16609593#comment-16609593 ]

Joseph K. Bradley commented on SPARK-25397:
---

CC [~smilegator], [~cloud_fan] for visibility