Mar 14, 2020 in Spark Sql
Q: Define Worker node in Spark?

1 Answer

0 votes
Mar 14, 2020

A worker node is any node in the cluster that can run application code. Worker nodes are also referred to as slave nodes, because the master node assigns them their work: the workers perform the actual data processing, and each one reports its status back to the master, which always handles task scheduling. The driver program, in turn, must be network-addressable from the worker nodes and accept incoming connections from its executors.
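The master/worker split described above can be seen in how a Spark standalone cluster is launched. The following is a minimal sketch, assuming `SPARK_HOME` points at a local Spark installation; the host name `master-host`, the default master port `7077`, and the application file `my_app.py` are illustrative assumptions, not details from the answer.

```shell
# 1. Start the master node, which schedules tasks for the cluster.
$SPARK_HOME/sbin/start-master.sh

# 2. Start a worker node and register it with the master.
#    The worker launches executors that do the actual data processing.
#    (On Spark versions before 3.1, this script is named start-slave.sh.)
$SPARK_HOME/sbin/start-worker.sh spark://master-host:7077

# 3. Submit an application. In client deploy mode the driver runs here,
#    so it must stay reachable for executors on the workers to connect
#    back to it.
$SPARK_HOME/bin/spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode client \
  my_app.py
```

Once the worker registers, it appears in the master's web UI, and any tasks the master schedules are executed by that worker's executors.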

