Hi Ted,
I understand it works fine when executed in the Spark shell.
Sorry, I missed mentioning that I am getting a compile-time error (using
Maven for the build).
I am executing my Spark job on a remote client by submitting the executable jar file.

Do I need to import any specific packages to make DataFrameNaFunctions
work?
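
For reference, here is a minimal sketch of the kind of code I am compiling
(the object name, case class, and sample data are just illustrative
placeholders; it assumes the spark-core and spark-sql 1.4.0 artifacts are on
the Maven compile classpath):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Case class at the top level so Spark can derive the schema by reflection
case class Record(col: Option[Int])

object NaFillExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("NaFillExample"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Small DataFrame with a nullable integer column named "col"
    val df = sc.parallelize(Seq(Record(Some(1)), Record(None), Record(Some(3)))).toDF()

    // Replace nulls in column "col" with 0 via DataFrameNaFunctions
    val filled = df.na.fill(Map("col" -> 0))
    filled.show()

    sc.stop()
  }
}

The Map overload of fill() replaces nulls only in the named column, which is
what I am trying to do with "col". This gets packaged into a jar and
submitted on the remote client as mentioned above.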

Please let me know if you have any inputs on how to fix this issue.

Regards,
Satish Chandra





On Mon, Feb 15, 2016 at 7:41 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> fill() was introduced in 1.3.1
>
> Can you show a code snippet which reproduces the error?
>
> I tried the following using spark-shell on master branch:
>
> scala> df.na.fill(0)
> res0: org.apache.spark.sql.DataFrame = [col: int]
>
> Cheers
>
> On Mon, Feb 15, 2016 at 3:36 AM, satish chandra j <
> jsatishchan...@gmail.com> wrote:
>
>> Hi All,
>> I am currently using Spark version 1.4.0 and getting an error when trying
>> to use the "fill" function, which is one of the DataFrameNaFunctions.
>>
>> Snippet:
>> df.na.fill(col: 0000)
>>
>> Error:
>> value na is not a member of org.apache.spark.sql.DataFrame
>>
>> I need the null values in column "col" of DataFrame "df" to be replaced
>> with the value "0000", as shown in the above snippet.
>>
>> I understand the code does not require any additional packages to support
>> DataFrameNaFunctions.
>>
>> Please let me know if I am missing anything, so that I can get these
>> DataFrameNaFunctions working.
>>
>> Regards,
>> Satish Chandra J
>>
>
>
