Bias-variance Decomposition

Bias-variance decomposition is a tool for explaining the generalization performance of a learning algorithm in terms of its bias and its variance. It is defined as follows:

Assume that there are K data sets, each drawn independently from a distribution p(t, x), where t is the variable to be predicted and x is the feature variable.

Training on different data sets yields different models. The performance of the learning algorithm is then measured by the average performance of the K models trained on these K data sets, as written below.
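A conventional way to write this average error, assuming squared loss and denoting by $y^{(k)}(x)$ the model trained on the k-th data set, is:

$$
E_{\text{avg}}(x) \;=\; \frac{1}{K} \sum_{k=1}^{K} \bigl( y^{(k)}(x) - h(x) \bigr)^{2}
$$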

Here h(x) represents the true function that generates the data, that is, t=h(x).
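Letting $\bar{y}(x) = \frac{1}{K}\sum_{k=1}^{K} y^{(k)}(x)$ denote the average of the K models, this average error splits exactly into a squared-bias term and a variance term. A sketch of the decomposition, under the squared-loss assumption and notation introduced above:

$$
\frac{1}{K}\sum_{k=1}^{K}\bigl(y^{(k)}(x)-h(x)\bigr)^{2}
\;=\;
\underbrace{\bigl(\bar{y}(x)-h(x)\bigr)^{2}}_{\text{bias}^{2}}
\;+\;
\underbrace{\frac{1}{K}\sum_{k=1}^{K}\bigl(y^{(k)}(x)-\bar{y}(x)\bigr)^{2}}_{\text{variance}}
$$

The bias term measures how far the average model is from the true function h(x), while the variance term measures how much individual models fluctuate around that average. The Python sketch below checks the identity numerically on a hypothetical setup (h(x) = sin(2πx), polynomial models of fixed degree); the setup and parameter choices are illustrative, not taken from the article.

```python
# Minimal sketch: empirically verify the bias-variance identity on K data sets,
# each an independent sample of x values with t = h(x) as in the definition above.
import numpy as np

rng = np.random.default_rng(0)
K, N, degree = 100, 25, 3            # number of data sets, points per set, model degree
h = lambda x: np.sin(2 * np.pi * x)  # assumed "true" function (illustrative choice)
x_eval = np.linspace(0.0, 1.0, 200)  # points at which the decomposition is evaluated

# Train one polynomial model per data set and record its predictions on x_eval.
preds = np.empty((K, x_eval.size))
for k in range(K):
    x = rng.uniform(0.0, 1.0, N)
    t = h(x)                              # t = h(x), as in the definition
    coeffs = np.polyfit(x, t, degree)     # fit the k-th model
    preds[k] = np.polyval(coeffs, x_eval)

y_bar = preds.mean(axis=0)                            # average model ȳ(x)
avg_error = ((preds - h(x_eval)) ** 2).mean(axis=0)   # left-hand side
bias_sq = (y_bar - h(x_eval)) ** 2                    # squared bias
variance = ((preds - y_bar) ** 2).mean(axis=0)        # variance

# The identity avg_error = bias_sq + variance holds pointwise (up to rounding).
print(np.allclose(avg_error, bias_sq + variance))     # -> True
```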
