What are the core components of a distributed Spark application?

1 Answer

Executor: the worker processes that run Spark's tasks and hold cached data for the application.

Driver: the process that runs the application's main() method. It creates RDDs and performs transformations and actions on them.

Cluster Manager: a pluggable component responsible for launching executors (and, in some deployment modes, the driver). Spark can run on top of external cluster managers such as YARN or Apache Mesos, or use its own standalone cluster manager.
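
To illustrate how these pieces fit together, here is a minimal sketch using Spark's Scala API. The object name, app name, and sample data are illustrative; the master URL shown ("local[*]") is for local testing and would typically be replaced by a real cluster manager URL such as "yarn".

```scala
import org.apache.spark.sql.SparkSession

object CoreComponentsDemo {
  def main(args: Array[String]): Unit = {
    // The driver runs this main() method. The master URL tells Spark which
    // cluster manager to use (e.g. "yarn", or "local[*]" for local testing).
    val spark = SparkSession.builder()
      .appName("CoreComponentsDemo")
      .master("local[*]")            // swap for "yarn" on a real cluster
      .getOrCreate()

    val sc = spark.sparkContext

    // The driver creates an RDD and defines transformations on it.
    val words = sc.parallelize(Seq("spark driver", "spark executor", "cluster manager"))
      .flatMap(_.split(" "))         // transformation: evaluated lazily on executors
      .map(word => (word, 1))        // transformation
      .reduceByKey(_ + _)            // transformation

    // The action triggers the cluster manager to schedule tasks on executors;
    // results are collected back to the driver.
    words.collect().foreach(println)

    spark.stop()
  }
}
```

When the action (collect) runs, the driver builds a job, the cluster manager allocates executors, and the executors execute the tasks and return results to the driver.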
