[ https://issues.apache.org/jira/browse/SPARK-27087?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16787732#comment-16787732 ]
Hyukjin Kwon commented on SPARK-27087:
--------------------------------------

The Scala side shows the same output as well:

{code}
scala> import org.apache.spark.sql.functions._
import org.apache.spark.sql.functions._

scala> lit(1).alias("A").toString
res2: String = 1 AS `A`
{code}

It shows the expression as-is, so rendering the expression string unchanged is the correct behaviour. This is not a bug or a missing feature.

> Inability to access to column alias in pyspark
> ----------------------------------------------
>
>                 Key: SPARK-27087
>                 URL: https://issues.apache.org/jira/browse/SPARK-27087
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.4.0
>            Reporter: Vincent
>            Priority: Minor
>
> In PySpark I have the following:
> {code:java}
> import pyspark.sql.functions as F
> cc = F.lit(1).alias("A")
> print(cc)
> print(cc._jc.toString())
> {code}
> I get:
> {noformat}
> Column<b'1 AS `A`'>
> 1 AS `A`
> {noformat}
> Is there any way for me to just print "A" from cc? It seems I'm unable to
> extract the alias programmatically from the column object.
> Also, I think that in Spark SQL in Scala, printing "cc" would just print
> "A" instead, so this seems like a bug or a missing feature to me.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
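For anyone who still needs the alias programmatically, one workaround is to parse it out of the Column's string form (e.g. {{1 AS `A`}}). This is a fragile sketch, not a supported API: `alias_of` is a hypothetical helper that relies on the unstable string representation shown in this ticket, and it only handles the simple backtick-quoted alias pattern.

{code:java}
import re

def alias_of(col_str):
    """Extract the alias from a Column string form like "1 AS `A`".

    Hypothetical helper: parses the (unstable) string representation
    seen in this ticket, e.g. str(cc) or cc._jc.toString().
    Returns None when no trailing "AS `...`" alias is present.
    """
    m = re.search(r"AS `([^`]+)`\s*$", col_str)
    return m.group(1) if m else None

print(alias_of("1 AS `A`"))  # the aliased case from the report
print(alias_of("1"))         # no alias -> None
{code}

On versions where the internal {{_jc}} handle is exposed, the Catalyst expression behind the column can also be reached via {{cc._jc.expr()}}, but that is private API and may change between releases.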