Executor: A worker process that runs Spark's tasks on a cluster node and returns results to the driver.
Driver: The process that runs the application's main() method, creates RDDs, and performs transformations and actions on them.
Cluster Manager: Spark's pluggable component responsible for launching the driver and the executors. Spark can run on external cluster managers such as YARN or Apache Mesos, or use its own standalone manager.