Re: Converting String to Datetime using map

2016-03-24 Thread Mich Talebzadeh
Thanks.

Tried this

 scala> val a = df.filter(col("Total") > "").map(p =>
Invoices(p(0).toString,
p(1).toString.TO_DATE(FROM_UNIXTIME(UNIX_TIMESTAMP(p(1),"dd/MM/yyyy"),"yyyy-MM-dd")),
p(2).toString.substring(1).replace(",", "").toDouble,
p(3).toString.substring(1).replace(",", "").toDouble,
p(4).toString.substring(1).replace(",", "").toDouble))
<console>:23: error: value TO_DATE is not a member of String
   val a = df.filter(col("Total") > "").map(p =>
Invoices(p(0).toString,
p(1).toString.TO_DATE(FROM_UNIXTIME(UNIX_TIMESTAMP(p(1),"dd/MM/yyyy"),"yyyy-MM-dd")),
p(2).toString.substring(1).replace(",", "").toDouble,
p(3).toString.substring(1).replace(",", "").toDouble,
p(4).toString.substring(1).replace(",", "").toDouble))

^
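For what it's worth, the compiler is right to reject this: TO_DATE, FROM_UNIXTIME and UNIX_TIMESTAMP are SQL/DataFrame functions, not methods on String. Inside a plain Scala map the same dd/MM/yyyy → yyyy-MM-dd conversion can be done with java.text.SimpleDateFormat (a sketch; the helper name toIsoDate is mine, and the input format is assumed from the SQL expression quoted below):

```scala
import java.text.SimpleDateFormat

// Reformat a date string from dd/MM/yyyy to yyyy-MM-dd.
// setLenient(false) makes malformed input throw a ParseException
// instead of silently rolling invalid dates over.
def toIsoDate(s: String): String = {
  val in = new SimpleDateFormat("dd/MM/yyyy")
  in.setLenient(false)
  val out = new SimpleDateFormat("yyyy-MM-dd")
  out.format(in.parse(s))
}

// Usage inside the map would then be:
//   Invoices(p(0).toString, toIsoDate(p(1).toString), ...)
println(toIsoDate("24/03/2016")) // 2016-03-24
```

Note SimpleDateFormat is not thread-safe, so it is created inside the helper rather than shared; for Spark executors a java.time.format.DateTimeFormatter (Java 8+) would be the thread-safe alternative.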

Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com



On 24 March 2016 at 22:00, Alexander Krasnukhin wrote:

> You can invoke exactly the same functions on scala side as well i.e.
> http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.functions$
>
> Have you tried them?
>
> On Thu, Mar 24, 2016 at 10:29 PM, Mich Talebzadeh <
> mich.talebza...@gmail.com> wrote:
>
>>
>> Hi,
>>
>> Read a CSV in with the following schema
>>
>> scala> df.printSchema
>> root
>>  |-- Invoice Number: string (nullable = true)
>>  |-- Payment date: string (nullable = true)
>>  |-- Net: string (nullable = true)
>>  |-- VAT: string (nullable = true)
>>  |-- Total: string (nullable = true)
>>
>> I use mapping as below
>>
>> case class Invoices(Invoicenumber: String, Paymentdate: String, Net:
>> Double, VAT: Double, Total: Double)
>>
>> val a = df.filter(col("Total") > "").map(p => Invoices(p(0).toString,
>> p(1).toString, p(2).toString.substring(1).replace(",", "").toDouble,
>> p(3).toString.substring(1).replace(",", "").toDouble,
>> p(4).toString.substring(1).replace(",", "").toDouble))
>>
>>
>> I want to convert p(1).toString to datetime like below when I used in sql
>>
>> TO_DATE(FROM_UNIXTIME(UNIX_TIMESTAMP(paymentdate,'dd/MM/yyyy'),'yyyy-MM-dd'))
>> AS paymentdate
>>
>>
>> Thanks
>>
>>
>
>
>
> --
> Regards,
> Alexander
>


Re: Converting String to Datetime using map

2016-03-24 Thread Alexander Krasnukhin
You can invoke exactly the same functions on scala side as well i.e.
http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.functions$

Have you tried them?
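Concretely, the SQL expression quoted below maps one-for-one onto those functions (a sketch only, not tested against the poster's data; `df` and the "Payment date" column name are taken from the quoted schema):

```scala
import org.apache.spark.sql.functions.{col, from_unixtime, to_date, unix_timestamp}

// DataFrame-API equivalent of
// TO_DATE(FROM_UNIXTIME(UNIX_TIMESTAMP(paymentdate,'dd/MM/yyyy'),'yyyy-MM-dd'))
val withDate = df.withColumn(
  "Paymentdate",
  to_date(from_unixtime(unix_timestamp(col("Payment date"), "dd/MM/yyyy"), "yyyy-MM-dd"))
)
```

The unix_timestamp round-trip is needed because the two-argument to_date(col, format) overload only arrived later (Spark 2.2); at the time of this thread to_date took a single column.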

On Thu, Mar 24, 2016 at 10:29 PM, Mich Talebzadeh wrote:

>
> Hi,
>
> Read a CSV in with the following schema
>
> scala> df.printSchema
> root
>  |-- Invoice Number: string (nullable = true)
>  |-- Payment date: string (nullable = true)
>  |-- Net: string (nullable = true)
>  |-- VAT: string (nullable = true)
>  |-- Total: string (nullable = true)
>
> I use mapping as below
>
> case class Invoices(Invoicenumber: String, Paymentdate: String, Net:
> Double, VAT: Double, Total: Double)
>
> val a = df.filter(col("Total") > "").map(p => Invoices(p(0).toString,
> p(1).toString, p(2).toString.substring(1).replace(",", "").toDouble,
> p(3).toString.substring(1).replace(",", "").toDouble,
> p(4).toString.substring(1).replace(",", "").toDouble))
>
>
> I want to convert p(1).toString to datetime like below when I used in sql
>
> TO_DATE(FROM_UNIXTIME(UNIX_TIMESTAMP(paymentdate,'dd/MM/yyyy'),'yyyy-MM-dd'))
> AS paymentdate
>
>
> Thanks
>
>



-- 
Regards,
Alexander