
No free lunch theorem

In mathematical folklore, the 'no free lunch' (NFL) theorem (sometimes pluralized) of David Wolpert and William Macready appears in the 1997 paper 'No Free Lunch Theorems for Optimization'. Wolpert had previously derived no free lunch theorems for machine learning (statistical inference). As the authors put it: 'We have dubbed the associated results NFL theorems because they demonstrate that if an algorithm performs well on a certain class of problems then it necessarily pays for that with degraded performance on the set of all remaining problems.' In 2005, Wolpert and Macready themselves indicated that the first theorem in their paper 'states that any two optimization algorithms are equivalent when their performance is averaged across all possible problems'. The 1997 theorems of Wolpert and Macready are mathematically technical. The folkloric NFL theorem is an easily stated and easily understood consequence of the theorems Wolpert and Macready actually prove; it is weaker than the proven theorems, and thus does not encapsulate them. Various investigators have extended the work of Wolpert and Macready substantively. See No free lunch in search and optimization for treatment of the research area. While some scholars argue that NFL conveys important insight, others argue that it is of little relevance to machine learning research.
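The averaging claim can be made concrete on a toy problem. The sketch below is a minimal illustration, not code from Wolpert and Macready: the domain size, the set of cost values, and the two fixed-order search strategies are all illustrative assumptions. It enumerates every possible objective function on a tiny finite search space and checks that two different deterministic, non-revisiting search strategies attain exactly the same average best-found cost after any number of evaluations.

```python
from itertools import product

# Tiny finite search space: |X| = 4 candidate points, costs drawn from {0, 1, 2}.
# (Sizes are illustrative assumptions, not from the 1997 paper.)
X_SIZE = 4
Y_VALUES = range(3)

def best_after(order, f, m):
    """Best (lowest) cost found after evaluating f at the first m points
    of a fixed, non-revisiting search order."""
    return min(f[x] for x in order[:m])

# Two different deterministic search strategies: forward and reverse scan.
algo_1 = [0, 1, 2, 3]
algo_2 = [3, 2, 1, 0]

# Enumerate every possible objective function f: X -> Y (3**4 = 81 of them).
all_functions = list(product(Y_VALUES, repeat=X_SIZE))

for m in range(1, X_SIZE + 1):
    avg_1 = sum(best_after(algo_1, f, m) for f in all_functions) / len(all_functions)
    avg_2 = sum(best_after(algo_2, f, m) for f in all_functions) / len(all_functions)
    # Averaged over all problems, the two strategies are indistinguishable.
    print(f"m={m}: average best cost {avg_1:.4f} vs {avg_2:.4f}")
```

The actual theorems also cover adaptive strategies that choose the next point based on the costs seen so far; the fixed visiting orders here are simply the easiest case to enumerate exhaustively.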

[ "Evolutionary computation", "Optimization problem" ]