[apache-spark] How to check if spark dataframe is empty?

dataframe.limit(1).count > 0

This also triggers a job, but because we select only a single record, the time consumption can be much lower even on billion-scale datasets: `limit(1)` lets Spark stop scanning as soon as one row is found, instead of counting every record.
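A minimal Scala sketch of the idea, assuming a `SparkSession` named `spark` is in scope (the DataFrame here is a hypothetical example built with `spark.range`):

```scala
// Hypothetical large DataFrame for illustration.
val df = spark.range(1000000000L).toDF("id")

// limit(1) caps the scan at one row, so the count job can
// short-circuit instead of touching every partition fully.
val nonEmpty = df.limit(1).count > 0

// Since Spark 2.4, Dataset also provides a built-in check
// that does essentially the same thing:
val empty = df.isEmpty
```

If you are on Spark 2.4 or later, `df.isEmpty` is the more readable choice; on older versions, `df.head(1).isEmpty` is another common variant of the same trick.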

From: https://medium.com/checking-emptiness-in-distributed-objects/count-vs-isempty-surprised-to-see-the-impact-fa70c0246ee0