Q: What is the Trade-off Between Bias and Variance?

1 Answer


The bias-variance decomposition breaks the expected error of a learning algorithm into three parts: the (squared) bias, the variance, and an irreducible error term caused by noise in the underlying data.
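For squared-error loss, this is commonly written as follows (a standard textbook form, not quoted from the original answer; \hat{f} denotes the learned model, f the true function, and \sigma^2 the noise variance):

\mathbb{E}\big[(y - \hat{f}(x))^2\big] = \big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2 + \mathrm{Var}\big[\hat{f}(x)\big] + \sigma^2

The first term is the squared bias, the second is the variance, and \sigma^2 is the irreducible error.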

Generally, as you make the model more complex and add more variables, you reduce bias but increase variance. To minimize the total error, you have to trade bias off against variance; neither high bias nor high variance is desirable.

High-bias, low-variance algorithms train models that are consistent but inaccurate on average.

High-variance, low-bias algorithms train models that are accurate on average but inconsistent. The sketch below illustrates both cases.
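You can see the tradeoff empirically with a short Python sketch (illustrative only; the true function sin, the noise level, and the polynomial degrees are assumptions chosen for the example). It resamples the training data many times, refits a simple and a complex polynomial model, and measures the squared bias and variance of their predictions.

import numpy as np

rng = np.random.default_rng(0)

def true_fn(x):
    # True underlying function (an assumption for this example).
    return np.sin(np.pi * x)

x_train = np.linspace(-1, 1, 30)
x_test = np.linspace(-1, 1, 200)
noise_sd = 0.3          # noise level of the observed targets (assumed)
n_datasets = 300        # number of resampled training sets

def bias_variance(degree):
    # Fit one polynomial per resampled dataset and collect test predictions.
    preds = np.empty((n_datasets, x_test.size))
    for i in range(n_datasets):
        y_train = true_fn(x_train) + rng.normal(0.0, noise_sd, x_train.size)
        coeffs = np.polyfit(x_train, y_train, degree)
        preds[i] = np.polyval(coeffs, x_test)
    avg_pred = preds.mean(axis=0)
    bias_sq = np.mean((avg_pred - true_fn(x_test)) ** 2)  # squared bias
    variance = np.mean(preds.var(axis=0))                 # variance across refits
    return bias_sq, variance

for degree in (1, 4, 12):   # simple -> complex models
    b2, var = bias_variance(degree)
    print(f"degree {degree:2d}: bias^2={b2:.4f}  variance={var:.4f}")

On a typical run, the degree-1 fit shows large squared bias and small variance, the degree-12 fit shows the reverse, and an intermediate degree gives the smallest total error, which is exactly the tradeoff described above.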

