Q:

Explain what happens if, during a PUT operation, an HDFS block is assigned a replication factor of 1 instead of the default value of 3.

1 Answer

The replication factor is an HDFS property that can be set per file or for the entire cluster to control how many copies of each block are stored, ensuring high data availability. With a replication factor of n, the cluster stores n copies of every block: the original plus n-1 replicas. So, if the replication factor during the PUT operation is set to 1 instead of the default value of 3, HDFS keeps only a single copy of the data. Under these circumstances, if the DataNode holding that copy crashes, the data is lost permanently, because no replica exists from which HDFS can recover it.
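As a sketch of how this scenario arises in practice (file paths and names here are hypothetical), the replication factor can be overridden at PUT time with the generic `-D` option, inspected with `hdfs dfs -stat`, and raised again with `hdfs dfs -setrep`:

```shell
# Upload a local file with replication factor 1 instead of the cluster default (3)
hdfs dfs -D dfs.replication=1 -put /local/data/events.log /user/demo/events.log

# Check the replication factor recorded for the file (%r prints it)
hdfs dfs -stat %r /user/demo/events.log

# Raise the replication factor back to 3; -w waits until replication completes
hdfs dfs -setrep -w 3 /user/demo/events.log
```

Note that `-setrep` after the fact only helps while the single copy is still alive; once the lone DataNode fails, there is nothing left to re-replicate.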
