Mar 14, 2020 in Spark Sql
Q: What are the functions of SparkCore in Spark?

1 Answer

0 votes
Mar 14, 2020
Spark Core is the base engine for large-scale parallel and distributed data processing. It is also known as the distributed execution engine, and its Java, Scala, and Python APIs offer a platform for ETL application development. Spark Core performs functions such as job monitoring and scheduling, interaction with storage systems, memory management, and fault recovery. On top of it, additional workloads such as streaming, machine learning, and SQL are supported (see the Scala sketch after the list below).

In particular, Spark Core is responsible for:

Monitoring, scheduling, and distributing jobs across a cluster

Fault recovery and memory management

Interacting with storage systems and the wider ecosystem
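Here is a minimal Scala sketch of Spark Core in action through the RDD API (the application name, local master URL, and sample data are placeholder assumptions, not part of the original answer): creating a SparkContext, building an RDD, and running an action that exercises job scheduling, memory management, and lineage-based fault recovery.

import org.apache.spark.{SparkConf, SparkContext}

object SparkCoreExample {
  def main(args: Array[String]): Unit = {
    // SparkContext is the entry point to Spark Core: it schedules jobs,
    // manages executor memory, and talks to the storage layer.
    // "local[*]" is an assumed master URL for running on local threads.
    val conf = new SparkConf().setAppName("SparkCoreExample").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // An RDD is Spark Core's fault-tolerant, partitioned collection;
    // lost partitions are recomputed from their lineage on failure.
    val numbers = sc.parallelize(1 to 100, numSlices = 4)

    // Transformations are lazy; the action below triggers job scheduling
    // and distributes tasks across the partitions.
    val sumOfSquares = numbers.map(n => n * n).reduce(_ + _)

    println(s"Sum of squares: $sumOfSquares")
    sc.stop()
  }
}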
