What are the important configuration files in Hadoop?


A Hadoop cluster reads its configuration from two sets of files:

1. Default configuration: core-default.xml, hdfs-default.xml, and mapred-default.xml hold the default settings for the Hadoop cluster. These files are read-only and ship with the Hadoop distribution.


2. Custom configuration: the site-specific files core-site.xml, hdfs-site.xml, and mapred-site.xml hold settings for a particular cluster and override the corresponding defaults.
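As an illustration, a minimal core-site.xml might override the default filesystem URI (fs.defaultFS is a real Hadoop property; the hostname and port here are placeholders):

```xml
<!-- core-site.xml: site-specific overrides of core-default.xml -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- placeholder NameNode address; adjust for your cluster -->
    <value>hdfs://namenode:9000</value>
  </property>
</configuration>
```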

All Hadoop jobs and the HDFS daemons use the parameters defined in these files, so by customizing the site-specific files we can tune these processes for our use case.

In the Hadoop API, the Configuration class loads these files (defaults first, then the site files) and provides the resulting values to jobs at run time.
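The layering behavior, where a site-specific value shadows the shipped default, can be sketched in plain Java. This is an illustration of the lookup rule only, not the real Hadoop Configuration API, and the map contents are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

public class ConfigDemo {
    // Simulated contents of core-default.xml (read-only defaults).
    static final Map<String, String> DEFAULTS = new HashMap<>();
    // Simulated contents of core-site.xml (site-specific overrides).
    static final Map<String, String> SITE = new HashMap<>();

    static {
        DEFAULTS.put("fs.defaultFS", "file:///");         // shipped default
        DEFAULTS.put("io.file.buffer.size", "4096");
        SITE.put("fs.defaultFS", "hdfs://namenode:9000"); // site override
    }

    // A site value, when present, shadows the default -- the same
    // precedence Configuration applies when resolving a key.
    static String lookup(String key) {
        return SITE.getOrDefault(key, DEFAULTS.get(key));
    }

    public static void main(String[] args) {
        System.out.println(lookup("fs.defaultFS"));        // overridden
        System.out.println(lookup("io.file.buffer.size")); // falls back
    }
}
```

The overridden key resolves to the site value, while keys absent from the site file fall back to the defaults.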
