How can we use Spark alongside Hadoop?

1 Answer


Apache Spark is compatible with Hadoop, and combining the two technologies makes for a very powerful stack.

Hadoop's components can be used alongside Spark in several ways.

They are:

MapReduce: Spark can be used alongside MapReduce as a processing framework in the same Hadoop cluster.

Real-Time & Batch Processing: Both MapReduce & Spark can be used together. Of those Spark is used for real-time processing and MapReduce is used for batch processing.

HDFS: Spark can run on top of HDFS to take advantage of its replicated storage (see the sketch after this list).

YARN: YARN is Hadoop's next-generation resource manager, and Spark applications can be run on it.
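
Below is a minimal sketch of how the HDFS and YARN points fit together: a Spark application that reads its input from HDFS and performs a word count (the classic MapReduce workload) using Spark instead. The namenode host, port, and file path are placeholder assumptions for illustration only.

```scala
import org.apache.spark.sql.SparkSession

object HdfsOnYarnExample {
  def main(args: Array[String]): Unit = {
    // The master (e.g. YARN) is normally supplied at launch time via
    // spark-submit --master yarn, rather than hard-coded here.
    val spark = SparkSession.builder()
      .appName("HdfsOnYarnExample")
      .getOrCreate()
    import spark.implicits._

    // Read a text file from HDFS; this URI is a hypothetical example.
    val lines = spark.read.textFile("hdfs://namenode:8020/data/input.txt")

    // Word count expressed in Spark rather than MapReduce.
    val counts = lines
      .flatMap(_.split("\\s+"))
      .groupByKey(identity)
      .count()

    counts.show(10)
    spark.stop()
  }
}
```

Packaged as a jar and launched with spark-submit --master yarn, an application like this runs on the same cluster, the same HDFS storage, and the same resource manager as existing Hadoop jobs.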
