From the excellent resources available on RStudio's sparklyr package page:
SPARK DEFINITIONS:
It may be useful to provide some simple definitions for the Spark nomenclature:
Node: A server
Worker Node: A server that is part of the cluster and is available to run Spark jobs.
Master Node: The server that coordinates the Worker nodes.
Executor: A sort of virtual machine inside a Node. One Node can have multiple Executors.
Driver Node: The Node that initiates the Spark session. Typically, this will be the server where sparklyr is located.
Driver (Executor): The Driver Node will also show up in the Executor list.
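To make the terms concrete, here is a minimal sparklyr sketch showing where each piece lives (the master URL spark://master-host:7077 and the resource settings are placeholders; adjust them for your cluster). The machine running this R code becomes the Driver Node, the Master Node schedules the job, and the requested Executors run on the Worker Nodes:

    library(sparklyr)

    # Build a configuration requesting Executors on the Worker Nodes.
    # (Values below are illustrative placeholders.)
    config <- spark_config()
    config$spark.executor.instances <- 2    # number of Executors
    config$spark.executor.cores     <- 2    # cores per Executor
    config$spark.executor.memory    <- "2g" # memory per Executor

    # Connecting makes this machine the Driver Node; the Master Node at the
    # given URL coordinates the Worker Nodes that host the Executors.
    sc <- spark_connect(master = "spark://master-host:7077", config = config)

    # The Executors (including the driver) can be inspected via the
    # underlying SparkContext, or in the Spark web UI's Executors tab.
    invoke(spark_context(sc), "getExecutorMemoryStatus")

    spark_disconnect(sc)

Note that if you instead connect with master = "local", everything (Driver, Master, and Executor) runs inside a single process on your machine, which is why local mode is convenient for development but does not exercise the cluster concepts above.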