
Jan 11 in Big Data | Hadoop

Q: How will you create a custom Partitioner in a Hadoop job?

1 Answer

Jan 11

The partition phase runs between the Map and Reduce phases and is optional. We can create a custom partitioner by extending the org.apache.hadoop.mapreduce.Partitioner class. In this class, we have to override the getPartition(KEY key, VALUE value, int numPartitions) method.


This method takes three inputs. Here, numPartitions is the same as the number of reducers in the job. The key and value are passed in, and the method returns the number of the partition to which this (key, value) record will be assigned. There is one reducer per partition; that reducer then handles summarizing the data for its partition.
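
For illustration, here is a minimal sketch of such a class, assuming Text keys and IntWritable values; the class name and the routing rule are made up for the example, not part of the original answer:

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Partitioner;

public class CustomPartitioner extends Partitioner<Text, IntWritable> {

    @Override
    public int getPartition(Text key, IntWritable value, int numPartitions) {
        // The returned value must always fall in the range [0, numPartitions).
        String k = key.toString();
        if (k.isEmpty() || numPartitions == 1) {
            return 0;
        }
        // Illustrative rule: keys beginning with A-M go to partition 0;
        // everything else is spread over the remaining partitions by hash.
        char first = Character.toUpperCase(k.charAt(0));
        if (first >= 'A' && first <= 'M') {
            return 0;
        }
        return 1 + (k.hashCode() & Integer.MAX_VALUE) % (numPartitions - 1);
    }
}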

Once the custom Partitioner class is ready, we have to set it on the Hadoop job. We can use the following method to set it:

job.setPartitionerClass(CustomPartitioner.class);
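
For context, a sketch of where this call sits in a typical driver; the job name and reducer count below are assumptions for the example:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

// Fragment of a driver method: register the custom Partitioner and set the
// reducer count, which is what getPartition() receives as numPartitions.
Job job = Job.getInstance(new Configuration(), "custom-partitioner-demo");
job.setPartitionerClass(CustomPartitioner.class);
job.setNumReduceTasks(4);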
