Jan 11, 2020 in Big Data | Hadoop
Q: What is the core concept behind Apache Hadoop framework?

1 Answer


Apache Hadoop is built around the MapReduce programming model. In MapReduce, a Map operation and a Reduce operation are used to process very large data sets in parallel.

In this model, the Map step filters and sorts the data, and the Reduce step summarizes (aggregates) it.

 

Map and Reduce are concepts borrowed from functional programming.
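As a rough illustration of that functional style (a local sketch, not part of the original answer), the same pattern can be written with plain Java streams: the stream pipeline plays the "map" role of tokenizing the input, and the grouping collector plays the "reduce" role of summarizing it.

import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

public class LocalWordCount {
    public static void main(String[] args) {
        String text = "to be or not to be";
        // "Map" side: split the text into individual words.
        // "Reduce" side: group identical words and count how often each occurs.
        Map<String, Long> counts = Arrays.stream(text.split("\\s+"))
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
        System.out.println(counts); // prints each word with its count (order may vary)
    }
}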

The key properties of this model are scalability and fault tolerance. Apache Hadoop achieves them by distributing both storage and computation across a cluster: HDFS replicates data blocks on multiple nodes, and the MapReduce engine runs Map and Reduce tasks in parallel and re-executes any task that fails on another node.
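For a distributed version of the same idea, below is a minimal word-count job sketch using the standard Hadoop MapReduce Java API; the class names (WordCount, TokenizerMapper, IntSumReducer) are illustrative, not something stated in the original answer.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map phase: emit (word, 1) for every word in the input split.
    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {

        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reduce phase: sum the counts for each word. The framework has already
    // grouped and sorted the keys between the two phases.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {

        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // optional local pre-aggregation
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Such a job would typically be packaged into a jar and submitted with something like hadoop jar wordcount.jar WordCount <input dir> <output dir> (the jar name and paths here are examples); the framework then splits the input, schedules tasks across the cluster, and retries any failed tasks.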
