Try df.select($"col".cast(DoubleType)). For example:

import org.apache.spark.sql.types._
import spark.implicits._

val df = spark.sparkContext.parallelize(Seq("1.04")).toDF("c")

df.select($"c".cast(DoubleType))
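For the error itself: getAs[T] does not convert the value, it just casts what is
already stored. A minimal sketch (plain Scala, no Spark required) of what is
going on under the hood, assuming the column is physically a String:

```scala
// Row.getAs[T] is essentially an asInstanceOf[T] on the stored value,
// so a java.lang.String can never be cast to java.lang.Double.
val stored: Any = "1.04" // the latitude value as Spark stores it

// stored.asInstanceOf[Double] // would throw java.lang.ClassCastException

// Parsing the string explicitly works:
val parsed = stored.asInstanceOf[String].toDouble
println(parsed)
```

That is why casting the column to DoubleType in the DataFrame first (as above)
is the cleaner fix: the value is then stored as a Double, and
row.getAs[Double] works as expected.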

On Thu, Sep 14, 2017 at 9:20 PM, KhajaAsmath Mohammed <
mdkhajaasm...@gmail.com> wrote:

> Hi,
>
> I am getting the below error when trying to cast a column value from a Spark
> dataframe to double. Any ideas? I tried many solutions but none of them
> worked.
>
>  java.lang.ClassCastException: java.lang.String cannot be cast to
> java.lang.Double
>
> 1. row.getAs[Double](Constants.Datapoint.Latitude)
>
> 2. row.getAs[String](Constants.Datapoint.Latitude).toDouble
>
> I don't want to use row.getDouble(0) as the position of the column in the
> file keeps changing.
>
> Thanks,
> Asmath
>



-- 
Ram Sriharsha
Product Manager, Apache Spark
PPMC Member and Committer, Apache Spark
Databricks
San Francisco, CA
Ph: 408-510-8635
email: har...@apache.org

