What are the core components in a distributed Spark application in Spark?

1 Answer

Executor: the worker processes that actually run Spark's tasks and hold cached data for the application.

Driver: the process that runs the application's main() method, creates the RDDs, and defines the transformations and actions performed on them.

Cluster Manager: Spark's pluggable component responsible for launching the executors (and, in cluster deploy mode, the driver). Because it is pluggable, Spark can run on top of external managers such as YARN or Apache Mesos, in addition to its own standalone manager. A minimal driver sketch showing how these pieces fit together is given below.
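The sketch below is only illustrative: it uses a local[*] master and a hypothetical input.txt path. The driver runs main(), builds the SparkSession, and records the transformations; the executors do the actual work when an action is invoked; the cluster manager is whatever the master URL (or spark-submit's --master flag) points at.

```scala
import org.apache.spark.sql.SparkSession

object WordCountApp {
  def main(args: Array[String]): Unit = {
    // Driver: runs main(), creates the SparkSession/SparkContext and defines the job.
    // "local[*]" is just for illustration; on a real cluster the master would be
    // supplied via spark-submit (e.g. --master yarn) and handled by the cluster manager.
    val spark = SparkSession.builder()
      .appName("WordCountApp")
      .master("local[*]")
      .getOrCreate()

    val sc = spark.sparkContext

    // Transformations are lazily recorded by the driver...
    val counts = sc.textFile("input.txt")      // hypothetical input path
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    // ...and only executed on the executors when an action (take) is called.
    counts.take(10).foreach(println)

    spark.stop()
  }
}
```

When the same application is packaged and submitted with spark-submit, the --master option selects the cluster manager (YARN, Mesos, or standalone), which then allocates the executors on the worker nodes.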
