Jan 11 in Big Data | Hadoop
Q: What are the important configuration files in Hadoop?

1 Answer

Jan 11

There are two important configuration files in a Hadoop cluster:

1. Default Configuration: The files core-default.xml, hdfs-default.xml and mapred-default.xml hold the default configuration for a Hadoop cluster. These are read-only files and should not be edited.


2. Custom Configuration: The site-specific files core-site.xml, hdfs-site.xml and mapred-site.xml let us override the defaults with settings specific to our cluster.
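As an illustration, a site file overrides a default by redeclaring the same property name. The snippet below is a hypothetical core-site.xml fragment; the host name namenode and port 9000 are example values, not something mandated by Hadoop:

```xml
<configuration>
  <!-- Override the default filesystem (file:/// in core-default.xml)
       with the cluster's HDFS NameNode address. -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode:9000</value>
  </property>
</configuration>
```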

All Hadoop jobs and the HDFS implementation use the parameters defined in the above-mentioned files. By customizing them, we can tune these processes to our use case.

In the Hadoop API, the Configuration class loads these files and provides their values at run time to the various jobs.
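To make the load order concrete, here is a minimal sketch in Python of the "defaults first, site file second" behavior described above. This is not the real Hadoop Configuration API (which is Java); the XML strings and property values are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical contents of a *-default.xml and a *-site.xml file.
DEFAULT_XML = """
<configuration>
  <property><name>fs.defaultFS</name><value>file:///</value></property>
  <property><name>dfs.replication</name><value>3</value></property>
</configuration>
"""

SITE_XML = """
<configuration>
  <property><name>fs.defaultFS</name><value>hdfs://namenode:9000</value></property>
</configuration>
"""

def load_properties(xml_text):
    """Parse <name>/<value> pairs from a Hadoop-style configuration file."""
    root = ET.fromstring(xml_text)
    return {p.findtext("name"): p.findtext("value")
            for p in root.findall("property")}

def resolve(default_xml, site_xml):
    """Load defaults first, then overlay site-specific values,
    mirroring the order in which Hadoop reads these files."""
    conf = load_properties(default_xml)
    conf.update(load_properties(site_xml))
    return conf

conf = resolve(DEFAULT_XML, SITE_XML)
print(conf["fs.defaultFS"])     # site value wins: hdfs://namenode:9000
print(conf["dfs.replication"])  # untouched default survives: 3
```

The point of the sketch is the merge order: a property declared in the site file shadows the default, while any property left out of the site file keeps its default value.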

