[ https://issues.apache.org/jira/browse/SPARK-18823?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15744519#comment-15744519 ]

Shivaram Venkataraman commented on SPARK-18823:
-----------------------------------------------

Ah, I see your point - `withColumn` does work for this use case. I agree that 
adding this to `[<-` or `[[<-` would be a better user experience.

{code}
> df2 <- withColumn(df, "tmp1", df$eruptions)
> head(df2)
  eruptions waiting  tmp1
1     3.600      79 3.600
...
{code}
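
Just to make it concrete that `withColumn` also covers the variable-held-name case the issue is about, here is a minimal self-contained sketch (assuming a local SparkR session and R's built-in `faithful` data set; `myname` and `tmp2` are just illustrative names):

{code}
library(SparkR)
sparkR.session()                              # local Spark session

df <- createDataFrame(faithful)               # SparkDataFrame from R's faithful data set
myname <- "tmp2"                              # column name held in a variable
df3 <- withColumn(df, myname, df$eruptions)   # colName is an ordinary character string
head(df3)

# the syntax this issue asks for, not supported in SparkR 2.0.2:
# df[[myname]] <- df$eruptions
{code}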

> Assignation by column name variable not available or bug?
> ---------------------------------------------------------
>
>                 Key: SPARK-18823
>                 URL: https://issues.apache.org/jira/browse/SPARK-18823
>             Project: Spark
>          Issue Type: Question
>          Components: SparkR
>    Affects Versions: 2.0.2
>         Environment: RStudio Server on EC2 instances (AWS EMR service, EMR release 4),
> or Databricks (community.cloud.databricks.com).
>            Reporter: Vicente Masip
>             Fix For: 2.0.2
>
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> I really don't know if this is a bug or whether it can be done with some function:
> Sometimes it is very important to assign something to a column whose name has to
> be accessed through a variable. Normally, outside of SparkR, I have always done
> this with double brackets, like this:
> # df could be faithful normal data frame or data table.
> # accessing by variable name:
> myname = "waiting"
> df[[myname]] <- c(1:nrow(df))
> # or even column number
> df[[2]] <- df$eruptions
> The error is not caused by the right-hand side of the "<-" assignment operator.
> The problem is that I can't assign to a column by name through a variable, or by
> column number, as I do in these examples outside of Spark. It doesn't matter
> whether I am modifying or creating a column; the problem is the same.
> I have also tried the following, with no results:
> val df2 = withColumn(df,"tmp", df$eruptions)
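
A side note on the last line above: `val` is Scala, not R, so that statement is a syntax error in an R session. Written as R (and assuming `df` is a SparkDataFrame built from `faithful`, as in the comment above), the same call does work:

{code}
df2 <- withColumn(df, "tmp", df$eruptions)   # the reporter's attempt, written as R
head(df2)
{code}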


