[apache-spark] Filtering a Spark DataFrame based on date

I find the most readable way to express this is with a SQL expression:

df.filter("my_date < date'2015-01-01'")

We can verify that this works correctly by looking at the physical plan from .explain():

+- *(1) Filter (isnotnull(my_date#22) && (my_date#22 < 16436))

Note that Spark has folded the date literal into the integer 16436: dates are stored internally as days since the Unix epoch (1970-01-01), and 2015-01-01 is day 16436. The literal is resolved once at plan time rather than parsed per row.
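
For completeness, here is a minimal, self-contained PySpark sketch. The sample data, column names, and the Column-API variant at the end are my own illustrative additions, not part of the original answer:

    from pyspark.sql import SparkSession
    import pyspark.sql.functions as F

    spark = SparkSession.builder.getOrCreate()

    # Toy DataFrame; the rows and column names here are assumptions for illustration.
    df = spark.createDataFrame(
        [("a", "2014-06-30"), ("b", "2015-03-15")],
        ["id", "my_date"],
    ).withColumn("my_date", F.to_date("my_date"))

    # The SQL-expression filter from above.
    df.filter("my_date < date'2015-01-01'").show()

    # An equivalent filter written with the Column API.
    df.filter(F.col("my_date") < F.lit("2015-01-01").cast("date")).show()

    # Inspect the physical plan to confirm the literal was folded to days-since-epoch.
    df.filter("my_date < date'2015-01-01'").explain()

Both forms compile to the same physical plan, so the choice between the SQL string and the Column API is purely a readability preference.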