What do you know about Block and Block scanner in HDFS?

1 Answer


A large file in HDFS is split into fixed-size parts called Blocks, and each block is stored (and replicated) independently across DataNodes. The default block size is 128 MB in Hadoop 2.x and later; older Hadoop 1.x releases used 64 MB. The size is controlled by the dfs.blocksize property and can be set per file.
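The arithmetic behind the split can be sketched as follows. This is purely illustrative (not the HDFS API), assuming the Hadoop 2.x+ default block size of 128 MB:

```python
# Illustrative sketch: how a file's size maps onto HDFS blocks.
# Assumes the Hadoop 2.x+ default block size (dfs.blocksize = 128 MB).
BLOCK_SIZE = 128 * 1024 * 1024

def split_into_blocks(file_size: int, block_size: int = BLOCK_SIZE):
    """Return the sizes of the blocks a file of file_size bytes occupies."""
    full, rest = divmod(file_size, block_size)
    # Every block is full-sized except possibly the last one.
    return [block_size] * full + ([rest] if rest else [])

# A 300 MB file occupies three blocks: two full 128 MB blocks
# plus one final 44 MB block.
sizes = split_into_blocks(300 * 1024 * 1024)
```

Note that the last block only occupies as much space as it actually contains; a 300 MB file does not waste a full 384 MB.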

Block Scanner is a background service that every DataNode in HDFS runs periodically to verify the checksum of every block stored on that DataNode.

The purpose of the Block Scanner is to detect silent data corruption (for example, from disk bit rot) on a DataNode. When a corrupt block is found, it is reported to the NameNode, which arranges for a healthy replica to be copied in its place.
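The core idea of the scan can be sketched as: recompute a stored block's checksum and compare it with the checksum recorded when the block was written. HDFS actually uses CRC32C over fixed-size chunks; plain CRC32 from Python's zlib is used here purely as an illustration:

```python
# Illustrative sketch of a block scan, not the real HDFS implementation.
# HDFS uses CRC32C per chunk; zlib.crc32 stands in for it here.
import zlib

def checksum(data: bytes) -> int:
    return zlib.crc32(data)

def scan_block(data: bytes, recorded: int) -> bool:
    """Return True if the block is intact, False if corruption is detected."""
    return checksum(data) == recorded

block = b"some block bytes"
recorded = checksum(block)                    # stored at write time
ok = scan_block(block, recorded)              # intact block passes
corrupt = scan_block(block + b"!", recorded)  # any bit change fails the scan
```

In real HDFS the scan period is configurable (the dfs.datanode.scan.period.hours property, three weeks by default), so every block is eventually re-verified even if it is never read.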

