
On what concept does the Hadoop framework work?

1 Answer


The Hadoop framework works on the following two core components:

 

1) HDFS – The Hadoop Distributed File System is a Java-based file system for scalable and reliable storage of large datasets. Data in HDFS is stored in the form of blocks, and it operates on a master-slave architecture: a NameNode manages the file system metadata while DataNodes store and replicate the blocks.
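For illustration only (not part of the original answer), here is a minimal sketch of writing and reading a file through the HDFS Java API. The path and class name are hypothetical, and it assumes the Hadoop client libraries are on the classpath and a cluster is reachable through the default configuration.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteExample {
    public static void main(String[] args) throws Exception {
        // Picks up fs.defaultFS (the NameNode address) from core-site.xml on the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Hypothetical path; HDFS splits the file into blocks and replicates them across DataNodes.
        Path file = new Path("/user/demo/sample.txt");
        try (FSDataOutputStream out = fs.create(file, true)) {
            out.writeUTF("Hello HDFS");
        }

        // The NameNode supplies block locations; the data itself is read from DataNodes.
        System.out.println("File size: " + fs.getFileStatus(file).getLen() + " bytes");
        fs.close();
    }
}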

 

2) Hadoop MapReduce – This is a Java-based programming paradigm of the Hadoop framework that provides scalability across Hadoop clusters. MapReduce distributes the workload into tasks that can run in parallel. A Hadoop job performs two separate tasks: a map job and a reduce job. The map job breaks the data set down into key-value pairs (tuples). The reduce job then takes the output of the map job and combines those tuples into a smaller set of tuples. The reduce job always runs after the map job has completed, as sketched below.
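As a hedged illustration (not from the original answer), the classic WordCount example shows the two phases using the standard org.apache.hadoop.mapreduce API; the class names and input/output paths are hypothetical.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map job: breaks each input line into (word, 1) key-value pairs.
    public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    // Reduce job: combines the tuples for each word into a smaller set of (word, count) pairs.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // Hypothetical HDFS paths supplied on the command line.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Such a job is typically packaged as a jar and submitted with the hadoop jar command, for example: hadoop jar wordcount.jar WordCount /input /output (paths here are hypothetical).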

 
