Q:
What do you understand by reward maximization?

1 Answer


1) Reward maximization is a term used in reinforcement learning; it describes the goal of the reinforcement learning agent.

2) In RL, a reward is positive feedback the agent receives for taking an action that moves it from one state to another.

3) If the agent performs a good action by following an optimal policy, it receives a reward; if it performs a bad action, a reward is subtracted (a penalty).

4) The goal of the agent is to maximize the cumulative reward it collects by following an optimal policy; this is termed reward maximization. A small sketch follows the list below.
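To make the idea concrete, here is a minimal Python sketch of the quantity the agent tries to maximize: the cumulative (discounted) reward over a trajectory. The reward values and the discount factor gamma = 0.9 are illustrative assumptions, not part of the original answer or any specific environment.

```python
def discounted_return(rewards, gamma=0.9):
    """Cumulative discounted reward: G = sum over t of gamma^t * r_t.
    rewards: list of per-step rewards; gamma: discount factor in [0, 1]."""
    return sum((gamma ** t) * r for t, r in enumerate(rewards))

# Illustrative trajectories (assumed values):
# good actions earn positive rewards, bad actions subtract reward.
good_trajectory = [1, 1, 1, 1]     # agent keeps choosing good actions
bad_trajectory = [1, -1, -1, 1]    # some bad actions incur penalties

print(discounted_return(good_trajectory))  # higher return
print(discounted_return(bad_trajectory))   # lower return
```

An optimal policy is one whose trajectories yield the highest such return, which is exactly what "reward maximization" refers to.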

