pyspark.sql.Column.between

Column.between(lowerBound, upperBound)

True if the current column is between the lower bound and upper bound, inclusive.
New in version 1.3.0.
Examples
>>> df.select(df.name, df.age.between(2, 4)).show()
+-----+---------------------------+
| name|((age >= 2) AND (age <= 4))|
+-----+---------------------------+
|Alice|                       true|
|  Bob|                      false|
+-----+---------------------------+
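The example above assumes a DataFrame df is already defined. A minimal, self-contained sketch, assuming an active SparkSession named spark and illustrative row values (ages 2 and 5) chosen to match the output shown:

from pyspark.sql import SparkSession

# Assumed setup: a SparkSession and a small two-row DataFrame.
spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([("Alice", 2), ("Bob", 5)], ["name", "age"])

# between(2, 4) is inclusive on both bounds, equivalent to
# (df.age >= 2) & (df.age <= 4).
df.select(df.name, df.age.between(2, 4)).show()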