What are the main functions of Spark Core?

1 Answer

Spark Core is responsible for several vital functions: memory management, fault tolerance, job scheduling and monitoring, and communication with storage systems. It also serves as the foundation for the additional libraries built on top of it, which handle diverse workloads such as streaming, machine learning, and SQL.

Spark Core is mainly used for the following tasks (a short PySpark sketch follows the list):

  1. Fault tolerance and recovery.
  2. Interacting with storage systems.
  3. Memory management.
  4. Scheduling and monitoring jobs on a cluster.
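
To make these responsibilities concrete, here is a minimal PySpark sketch that touches each of them. It is an illustrative example, not canonical usage: the application name, the `local[*]` master, and the input path ("data.txt") are placeholder assumptions.

```python
from operator import add

from pyspark import SparkConf, SparkContext

# Placeholder app name and local master; adjust for a real cluster.
conf = SparkConf().setAppName("spark-core-demo").setMaster("local[*]")
sc = SparkContext(conf=conf)

# Interacting with storage systems: read a text file into an RDD.
# "data.txt" is a hypothetical input path.
lines = sc.textFile("data.txt")

# Memory management: cache() asks Spark Core to keep the RDD in memory
# so later actions can reuse it without recomputation.
words = lines.flatMap(lambda line: line.split()).cache()

# Scheduling and monitoring: the take() action below is broken into tasks
# that Spark Core's scheduler distributes across the cluster's executors.
counts = words.map(lambda w: (w, 1)).reduceByKey(add)
print(counts.take(10))

# Fault tolerance and recovery: if an executor is lost, Spark Core rebuilds
# the lost partitions from the RDD lineage instead of restarting the job.

sc.stop()
```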
