[ 
https://issues.apache.org/jira/browse/SPARK-12969?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15146728#comment-15146728
 ] 

Ankit Jindal commented on SPARK-12969:
--------------------------------------

Hi,
I have tried your code with Java 1.8.0_66 and Spark 1.6 in local mode, and it
works as expected.

Can you provide the command you are using to run this?

Regards,
Ankit
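
Since the report says the failure only appears outside of spark-submit, the exact launch command matters. A hedged example of a spark-submit invocation for comparison (the jar path and main class below are placeholders, not taken from the reporter's project):

```shell
# Submit the application with an explicit local master.
# Jar path and main class are illustrative placeholders.
spark-submit \
  --master local[*] \
  --class com.example.CastRepro \
  target/spark-test-1.0.jar
```

Running the same main() directly from an IDE instead picks up whatever master and classpath the IDE configures, which is one common source of behavior differences between the two launch modes.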

> Exception while casting a Spark-supported date-formatted "string" to "date"
> data type.
> ---------------------------------------------------------------------------------------
>
>                 Key: SPARK-12969
>                 URL: https://issues.apache.org/jira/browse/SPARK-12969
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API
>    Affects Versions: 1.6.0
>         Environment: Spark Java 
>            Reporter: Jais Sebastian
>
> Getting an exception while converting a string column (the column has the
> Spark-supported date format yyyy-MM-dd) to the date data type. Below is the
> code snippet:
>         List<String> jsonData = Arrays.asList("{\"d\":\"2015-02-01\",\"n\":1}");
>         JavaRDD<String> dataRDD = this.getSparkContext().parallelize(jsonData);
>         DataFrame data = this.getSqlContext().read().json(dataRDD);
>         DataFrame newData = data.select(data.col("d").cast("date"));
>         newData.show();
> The above code gives the error:
> failed to compile: org.codehaus.commons.compiler.CompileException: File
> generated.java, Line 95, Column 28: Expression "scala.Option < Long >
> longOpt16" is not an lvalue
> This happens only when we execute the program in client mode; it works when
> we execute it through spark-submit. Here is the sample project:
> https://github.com/uhonnavarkar/spark_test
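
For reference, a minimal self-contained version of the quoted snippet, runnable in local mode against the Spark 1.6 Java API. The class name and master setting are assumptions for illustration; the reporter's project wraps the contexts in helper getters instead:

```java
import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

// Minimal reproduction of the reported cast, run in local mode.
public class CastRepro {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("CastRepro")
                .setMaster("local[*]");  // assumed master for a local test
        JavaSparkContext sc = new JavaSparkContext(conf);
        SQLContext sqlContext = new SQLContext(sc);

        // One JSON record with a string date column "d".
        List<String> jsonData = Arrays.asList("{\"d\":\"2015-02-01\",\"n\":1}");
        JavaRDD<String> dataRDD = sc.parallelize(jsonData);
        DataFrame data = sqlContext.read().json(dataRDD);

        // Cast the string column "d" (format yyyy-MM-dd) to the date type.
        DataFrame newData = data.select(data.col("d").cast("date"));
        newData.show();

        sc.stop();
    }
}
```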



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
