Jan 11 in Big Data | Hadoop
Q: What is the core concept behind Apache Hadoop framework?

1 Answer

Jan 11

Apache Hadoop is based on the concept of the MapReduce algorithm. In the MapReduce algorithm, Map and Reduce operations are used to process very large data sets in parallel.

In this concept, the Map method does the filtering and sorting of data, and the Reduce method performs the summarizing of data.
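The two phases can be sketched in plain Python, outside of Hadoop itself. This is a minimal standalone illustration of the idea (not Hadoop's Java API): the map step emits (key, value) pairs, a shuffle step groups them by key (Hadoop's framework does this automatically), and the reduce step summarizes each group. The word-count task and all function names here are illustrative.

```python
from collections import defaultdict

def map_phase(line):
    # Map: normalize the input and emit (word, 1) pairs.
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle/sort: group all values by key (handled by the framework in Hadoop).
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Reduce: summarize the values for one key.
    return key, sum(values)

lines = ["Hello Hadoop", "hello MapReduce"]
pairs = [p for line in lines for p in map_phase(line)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(counts)  # {'hello': 2, 'hadoop': 1, 'mapreduce': 1}
```

In a real Hadoop job, each map and reduce task runs on a different node of the cluster, so the same logic scales to data sets far larger than one machine's memory.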


This is a concept from functional programming.
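To show those functional-programming roots, here is the same idea with Python's built-in `map` and `functools.reduce`, which MapReduce is named after (a sketch of the concept only, with made-up sample data):

```python
from functools import reduce

# map transforms each element independently; reduce folds them into a summary.
squares = list(map(lambda x: x * x, [1, 2, 3, 4]))
total = reduce(lambda a, b: a + b, squares)
print(squares, total)  # [1, 4, 9, 16] 30
```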

The key points in this concept are scalability and fault tolerance. In Apache Hadoop, these are achieved by distributing both storage and computation across a cluster of commodity machines: HDFS replicates each data block on multiple nodes, and the framework re-executes failed Map or Reduce tasks on other nodes.

