Yes, for the following reasons:
Spark and other big data tools are built around the MapReduce paradigm, a programming model for distributed data processing. So it's essential to learn the model and how to break a problem down into a series of map and reduce tasks (see the sketch below).
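A minimal sketch of that model in plain Python (not Hadoop itself), counting words. The map phase emits key-value pairs, the shuffle phase groups them by key, and the reduce phase aggregates each group; on a real cluster these phases run in parallel across machines, here they run in sequence just to show the idea.

```python
from collections import defaultdict

def map_phase(documents):
    # map: emit (key, value) pairs -- here (word, 1) for every word
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    # shuffle: group all values by key
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped.items()

def reduce_phase(grouped):
    # reduce: aggregate the values for each key
    for key, values in grouped:
        yield (key, sum(values))

docs = ["the quick brown fox", "the lazy dog", "the fox"]
print(dict(reduce_phase(shuffle_phase(map_phase(docs)))))
# {'the': 3, 'quick': 1, 'brown': 1, 'fox': 2, 'lazy': 1, 'dog': 1}
```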
The Hadoop MapReduce model remains critical when the data grows beyond what fits in the cluster's memory, since it processes data from disk rather than requiring everything to be held in RAM.
Almost every other tool, like Hive or Pig, translates its queries into a series of MapReduce jobs. If you understand MapReduce, you can write queries that compile into more efficient jobs; the sketch after this point shows the idea for a simple GROUP BY.
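A sketch of how a query such as `SELECT department, SUM(salary) FROM employees GROUP BY department` could map onto a single MapReduce job. This is illustrative plain Python, not Hive's actual execution plan, and the table and column names are made up: the mapper emits the GROUP BY column as the key, and the reducer sums the values per key.

```python
from collections import defaultdict

# Hypothetical input rows standing in for an "employees" table.
employees = [
    {"department": "eng", "salary": 100},
    {"department": "sales", "salary": 80},
    {"department": "eng", "salary": 120},
]

def mapper(rows):
    # Emit (department, salary): the key is the GROUP BY column.
    for row in rows:
        yield (row["department"], row["salary"])

def reducer(pairs):
    # Sum salaries per department, i.e. SUM(salary) for each key.
    totals = defaultdict(int)
    for dept, salary in pairs:
        totals[dept] += salary
    return dict(totals)

print(reducer(mapper(employees)))  # {'eng': 220, 'sales': 80}
```

Knowing this mapping helps you see, for example, why an extra GROUP BY or JOIN in a Hive query can mean another full MapReduce job over the data.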