Mar 14, 2020 in Spark Sql
Q: How can we use Spark alongside Hadoop?

1 Answer

0 votes
Mar 14, 2020

Apache Spark is compatible with Hadoop, and using the two technologies together makes for a very powerful combination.

Hadoop's components can be used alongside Spark in the following ways:

MapReduce: Spark can be used alongside MapReduce as a processing framework within the same Hadoop cluster.

Real-Time & Batch Processing: Both MapReduce & Spark can be used together. Of those Spark is used for real-time processing and MapReduce is used for batch processing.

HDFS: Spark can run on top of HDFS to take advantage of its replicated, distributed storage (see the first sketch after this list).

YARN: Spark applications can also run on YARN, Hadoop's next-generation resource manager, so Spark shares cluster resources with other Hadoop workloads.
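
For illustration, here is a minimal sketch of a Spark application that reads a file from HDFS and queries it with Spark SQL. The HDFS path, the CSV header option, and the application name are placeholders chosen for this example, not details from the question.

    import org.apache.spark.sql.SparkSession

    object HdfsOnYarnExample {
      def main(args: Array[String]): Unit = {
        // The master is normally supplied at launch time
        // (e.g. spark-submit --master yarn), so it is not hard-coded here.
        val spark = SparkSession.builder()
          .appName("HdfsOnYarnExample")
          .getOrCreate()

        // Read a CSV file stored on HDFS; the path is a placeholder.
        val df = spark.read
          .option("header", "true")
          .csv("hdfs:///user/example/input/data.csv")

        // Run a simple Spark SQL query over the replicated HDFS data.
        df.createOrReplaceTempView("data")
        spark.sql("SELECT COUNT(*) AS row_count FROM data").show()

        spark.stop()
      }
    }

To run this on YARN, the application is typically packaged as a JAR and launched with spark-submit, for example: spark-submit --master yarn --deploy-mode cluster --class HdfsOnYarnExample app.jar (the class and JAR names here are just the ones used in this sketch).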
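
And here is a minimal sketch of the real-time side using Spark Structured Streaming, assuming a socket source on localhost:9999 purely for illustration. In the combination described above, a job like this would handle streaming data while MapReduce handles batch workloads.

    import org.apache.spark.sql.SparkSession

    object StreamingWordCount {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("StreamingWordCount")
          .getOrCreate()
        import spark.implicits._

        // Read a stream of text lines from a socket; host and port are placeholders.
        val lines = spark.readStream
          .format("socket")
          .option("host", "localhost")
          .option("port", 9999)
          .load()

        // Split lines into words and keep a running count as data arrives.
        val words = lines.as[String].flatMap(_.split("\\s+"))
        val counts = words.groupBy("value").count()

        // Print the running counts to the console on each update.
        val query = counts.writeStream
          .outputMode("complete")
          .format("console")
          .start()

        query.awaitTermination()
      }
    }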
