Are you trying to build a DataFrame boolean expression? Please use '&' for 'and', '|' for 'or', and '~' for 'not' when building DataFrame boolean expressions.
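The reason Python's own 'and'/'or'/'not' cannot work is that they call '__bool__', which must return a plain True/False and so cannot build a new column expression. A minimal sketch of that mechanism (an illustrative toy class, not Spark's actual Column implementation):

```python
class Expr:
    """Toy column-expression class (assumption: illustrative only, not Spark's Column)."""
    def __init__(self, text):
        self.text = text

    def __and__(self, other):   # invoked by '&'
        return Expr(f"({self.text} AND {other.text})")

    def __or__(self, other):    # invoked by '|'
        return Expr(f"({self.text} OR {other.text})")

    def __invert__(self):       # invoked by '~'
        return Expr(f"(NOT {self.text})")

    def __bool__(self):
        # Python's 'and'/'or'/'not' call __bool__, which must return a real
        # True/False -- it cannot return a new expression, so the only sane
        # option is to raise, as Spark (and pandas/NumPy) do.
        raise ValueError("Cannot convert column expression to bool; "
                         "use '&', '|', '~' instead of 'and', 'or', 'not'.")


a, b = Expr("id == 1"), Expr("id == 2")
print((a | ~b).text)  # prints (id == 1 OR (NOT id == 2))
```

Because '&', '|', and '~' dispatch to '__and__', '__or__', and '__invert__', they can return a new expression object, which is exactly what column-based APIs need.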
Example:

    >>> df = sqlContext.range(10)
    >>> df.where((df.id == 1) | ~(df.id == 1))
    DataFrame[id: bigint]

On Wed, Dec 16, 2015 at 4:32 PM, Allen Zhang <allenzhang...@126.com> wrote:
> Hi All,
>
> Does Spark's label expression really support "&&", "||", or even "!" for
> label-based scheduling?
> I tried that, but it does NOT work.
>
> Best Regards,
> Allen

--
張雅軒