
What are the problems with small files and HDFS?

1 Answer


HDFS does not handle a large number of small files well. Every file, directory and block in HDFS is represented as an object in the namenode's memory, and each object occupies roughly 150 bytes. So 10 million files, each using one block, amount to about 20 million objects (one for the file and one for its block) and would use about 3 gigabytes of namenode memory. Scaling up to a billion files is not feasible, because the namenode cannot hold that much metadata in memory.
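As a rough back-of-envelope sketch, the arithmetic behind the 3 GB figure can be written out as below. The 150 bytes per object is the usual rule of thumb, and the two-objects-per-file assumption (one file inode plus one block) is an approximation, not an exact namenode accounting.

public class NamenodeMemoryEstimate {
    public static void main(String[] args) {
        long files = 10_000_000L;      // 10 million small files
        long objectsPerFile = 2;       // assumed: one file object + one block object per file
        long bytesPerObject = 150;     // rule-of-thumb size of a namenode metadata object

        long totalBytes = files * objectsPerFile * bytesPerObject;
        System.out.printf("Estimated namenode heap: %.1f GB%n",
                totalBytes / (1024.0 * 1024.0 * 1024.0));
        // prints roughly 2.8 GB, i.e. the "about 3 gigabytes" quoted above
    }
}

Running the same estimate with a billion files gives on the order of 300 GB of metadata, which is why that scale cannot be met by a single namenode's memory.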
