Define Worker node in Spark?

1 Answer


A worker node is any node in the cluster that can run application code. Worker nodes are also referred to as slave nodes, because the master node assigns their work: the workers carry out the actual data processing and report back to the master, while the master schedules the tasks. Because executors run on the worker nodes, the driver program must be network-addressable so it can listen for and accept incoming connections from its executors.
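As a minimal sketch of this relationship (the master URL and hostnames below are placeholders, not values from the question), a driver can connect to a standalone cluster and make itself addressable to its executors through the standard `spark.driver.host` and `spark.driver.port` settings:

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch: connect a driver to a standalone cluster master and make
// the driver reachable from the worker nodes. "spark-master" and
// "driver-host" are placeholder hostnames for illustration only.
val spark = SparkSession.builder()
  .appName("WorkerNodeDemo")
  .master("spark://spark-master:7077")          // cluster master URL
  .config("spark.driver.host", "driver-host")   // address executors connect back to
  .config("spark.driver.port", "7078")          // fixed port the driver listens on
  .getOrCreate()

// The tasks of this job are scheduled by the master and executed by
// executors running on the worker nodes; results flow back to the driver.
val counts = spark.range(0, 1000000)
  .selectExpr("id % 10 as bucket")
  .groupBy("bucket")
  .count()
counts.show()

spark.stop()
```

Pinning `spark.driver.host` and `spark.driver.port` like this is mainly useful when the driver runs behind a firewall or NAT, since the executors on the worker nodes must be able to open connections back to it.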
