[ 
https://issues.apache.org/jira/browse/SPARK-38208?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Joyce Arruda Recacho resolved SPARK-38208.
------------------------------------------
    Fix Version/s: 3.1.2
       Resolution: Fixed

> 'Column' object is not callable
> -------------------------------
>
>                 Key: SPARK-38208
>                 URL: https://issues.apache.org/jira/browse/SPARK-38208
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 3.1.2, 3.2.1
>            Reporter: Joyce Arruda Recacho
>            Priority: Major
>             Fix For: 3.1.2
>
>
> Hi guys, I have a simple dataframe and am trying to create one new column.
> This is its schema:
>  
> >>>> df_operation_event_sellers.schema 
> Out[69]: 
> StructType(List(StructField(id,StringType,true),StructField(account_id,StringType,true),StructField(p_tenant_id,StringType,true),StructField(vendor_id,StringType,true),StructField(amount,DecimalType(38,18),true),StructField(operation_type,StringType,true),StructField(reference_id,StringType,true),StructField(date,TimestampType,true),StructField(carrier_id,StringType,true),StructField(account_number,StringType,true),StructField(data_source,StringType,true),StructField(entity,StringType,true),StructField(ingestion_date,DateType,true),StructField(event_type,StringType,false),StructField(amount_new,DecimalType(38,18),true),StructField(date_new,IntegerType,true),StructField(row_num,IntegerType,true)))
>  
> >>> command to create the new column
> df_operation_event_sellers = df_operation_event_sellers.withColumn('flag_first_selling', when(col('row_num') == 1, 'YES').instead('NO'))
> ISSUE >>>>>>>>>>>> TypeError: 'Column' object is not callable
>  
> What is happening?
> P.S. I created other columns the same way successfully.



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
