I can do it in the Scala API, but I'm not sure of the syntax in PySpark.
(I didn't find it in the Python API docs.)

Here's what I tried; both failed:

>>> df.filter(df.age>3 & df.name=="Andy").collect()
>>> df.filter(df.age>3 and df.name=="Andy").collect()
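My guess is the problem is Python operator precedence: `&` binds tighter than `>` and `==`, and `and` can't be overloaded for Column objects, so each comparison probably needs its own parentheses. Something like this might be the intended syntax:

>>> df.filter((df.age > 3) & (df.name == "Andy")).collect()

If it's easier, filter also seems to accept a SQL expression string:

>>> df.filter("age > 3 AND name = 'Andy'").collect()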

-- 
Best Regards

Jeff Zhang
