Jan 11, 2020 in Big Data | Hadoop
Q: What are the important configuration files in Hadoop?

1 Answer

There are two types of important configuration files in a Hadoop cluster:

1. Default configuration: core-default.xml, hdfs-default.xml and mapred-default.xml hold the default settings for the Hadoop cluster. These files are read-only.

2. Custom configuration: the site-specific files core-site.xml, hdfs-site.xml and mapred-site.xml let us specify settings for our own cluster; any value set here overrides the corresponding default.
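For example, a minimal core-site.xml override might look like the fragment below (the fs.defaultFS value and hostname are illustrative; substitute your own NameNode address):

```xml
<?xml version="1.0"?>
<!-- core-site.xml: site-specific overrides for core-default.xml -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- Illustrative value; point this at your cluster's NameNode -->
    <value>hdfs://namenode.example.com:9000</value>
  </property>
</configuration>
```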

All jobs in Hadoop, as well as the HDFS implementation itself, use the parameters defined in the above-mentioned files. By customizing them we can tune these processes to our use case.

In the Hadoop API, the Configuration class loads these files and supplies the parameter values to jobs at run time.
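As a short sketch of that mechanism (assuming the Hadoop jars are on the classpath; the class name and printed key are illustrative), a job can read merged default-plus-site values like this:

```java
import org.apache.hadoop.conf.Configuration;

public class ConfDemo {
    public static void main(String[] args) {
        // Loads core-default.xml and then core-site.xml from the classpath;
        // site values override defaults with the same name.
        Configuration conf = new Configuration();

        // Additional site files can be added explicitly (illustrative resource name).
        conf.addResource("hdfs-site.xml");

        // get() returns the site-specific value if set, else the supplied fallback.
        String fsUri = conf.get("fs.defaultFS", "file:///");
        System.out.println("fs.defaultFS = " + fsUri);
    }
}
```

The same Configuration object is what MapReduce hands to each task, which is why per-job overrides set in code or in mapred-site.xml reach every mapper and reducer.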
