
Jan 26 in Big Data | Hadoop
Q:

Explain what happens if, during a PUT operation, an HDFS block is assigned a replication factor of 1 instead of the default value of 3.

1 Answer

Jan 26

The replication factor is an HDFS property that controls how many copies of each block the cluster stores; it can be set for the entire cluster (or per file) to ensure high data availability. With a replication factor of n, each block is stored n times: the original plus n-1 duplicates on other DataNodes. So, if the replication factor during the PUT operation is set to 1 instead of the default value of 3, the cluster holds only a single copy of each block. Under these circumstances, if the DataNode holding that copy crashes, the data is permanently lost, because no other replica exists from which HDFS can recover it.
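As a sketch of how this situation arises in practice, the replication factor can be overridden per command with the generic `-D` option. The paths and file names below are illustrative, and the commands assume a running HDFS cluster:

```shell
# Override the cluster default and store the file with a single replica
hdfs dfs -D dfs.replication=1 -put localfile.txt /user/data/

# Check the replication factor actually recorded for the file
hdfs dfs -stat %r /user/data/localfile.txt

# If the data later needs protection, raise the factor back to 3;
# -w waits until the extra replicas have been created
hdfs dfs -setrep -w 3 /user/data/localfile.txt
```

Note that `dfs.replication` is a client-side setting applied at write time, which is why a single PUT can deviate from the cluster-wide default.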

