Already tried it, but I'm getting the following error:

overloaded method value filter with alternatives:
  (conditionExpr: String)org.apache.spark.sql.DataFrame <and>
  (condition: org.apache.spark.sql.Column)org.apache.spark.sql.DataFrame
 cannot be applied to (Boolean)

Also tried:

val req_logs_with_dpid =
  req_logs.filter(req_logs("req_info.dpid").toString.length != 0)

But I'm getting the same error.
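For reference, here is a rough sketch of what I plan to try next, assuming Spark 1.5.x and that req_info.dpid is a string field (names taken from the snippets above). The idea is to keep the comparison inside the Column API or a SQL expression string, so that filter receives a Column or a String rather than a plain Scala Boolean:

import org.apache.spark.sql.functions.{col, length}

// Column API: the whole condition stays a Column, so filter accepts it
val req_logs_with_dpid =
  req_logs.filter(col("req_info.dpid").isNotNull && length(col("req_info.dpid")) > 0)

// Equivalent SQL expression string, which filter also accepts
val req_logs_with_dpid2 =
  req_logs.filter("req_info.dpid is not null and length(req_info.dpid) > 0")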


On Wed, Dec 9, 2015 at 6:45 PM, Fengdong Yu <fengdo...@everstring.com>
wrote:

> val req_logs_with_dpid = req_logs.filter(req_logs("req_info.pid") != "" )
>
> Azuryy Yu
> Sr. Infrastructure Engineer
>
> cel: 158-0164-9103
> wechat: azuryy
>
>
> On Wed, Dec 9, 2015 at 7:43 PM, Prashant Bhardwaj <
> prashant2006s...@gmail.com> wrote:
>
>> Hi
>>
>> I have two columns in my JSON which can have null, empty, or non-empty
>> strings as values.
>> I know how to filter records that have a non-null value using the following:
>>
>> val req_logs = sqlContext.read.json(filePath)
>>
>> val req_logs_with_dpid = req_logs.filter("req_info.dpid is not null or
>> req_info.dpid_sha1 is not null")
>>
>> But how do I filter when the value of a column is an empty string?
>> --
>> Regards
>> Prashant
>>
>
>


-- 
Regards
Prashant
