What are the functions of SparkCore in Spark?

Spark Core is the base engine for large-scale parallel and distributed data processing. It is also known as the distributed execution engine, while the Java, Scala, and Python APIs provide a platform for building ETL applications on top of it. Spark Core handles functions such as job monitoring, job scheduling, memory management, fault recovery, and interaction with storage systems. It also underpins the higher-level libraries for streaming workloads, machine learning, and SQL.

In summary, Spark Core is responsible for:

Scheduling, distributing, and monitoring jobs on a cluster

Memory management and fault recovery

Interacting with storage systems and the rest of the ecosystem
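To make two of these responsibilities concrete, here is a minimal conceptual sketch in plain Python (deliberately not the Spark API, and with no cluster involved): a job is split into per-partition tasks, and a failed task is recovered by recomputing its partition from the source data plus the transformation, which is essentially how Spark Core uses lineage for fault recovery.

```python
# Conceptual sketch of two Spark Core ideas: splitting a job into
# per-partition tasks, and lineage-based fault recovery (recompute a
# lost partition instead of replicating results). Plain Python only.

def partition(data, n):
    """Split data into n roughly equal partitions."""
    size = (len(data) + n - 1) // n
    return [data[i:i + size] for i in range(0, len(data), size)]

def run_task(part, fn, fail=False):
    """Run one task on one partition; optionally simulate a lost worker."""
    if fail:
        raise RuntimeError("worker lost")
    return [fn(x) for x in part]

def run_job(data, fn, n_partitions=4, failing=(1,)):
    """Schedule one task per partition; recompute any failed partition."""
    parts = partition(data, n_partitions)
    results = []
    for i, part in enumerate(parts):
        try:
            results.append(run_task(part, fn, fail=(i in failing)))
        except RuntimeError:
            # Fault recovery: re-run the task from its lineage
            # (the source partition plus the transformation fn).
            results.append(run_task(part, fn))
    return [x for chunk in results for x in chunk]

print(run_job(list(range(8)), lambda x: x * x))
# -> [0, 1, 4, 9, 16, 25, 36, 49]
```

In real Spark, the driver's scheduler performs this task distribution and retry logic across executors automatically; the sketch only illustrates the idea in a single process.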