Occam's Razor in Algorithmic Information Theory - a podcast by MCMP Team

Published 2015-02-20


Tom Sterkenburg (Amsterdam/Groningen) gives a talk at the MCMP Colloquium (15 January 2015) titled "Occam's Razor in Algorithmic Information Theory".

Abstract: Algorithmic information theory, also known as Kolmogorov complexity, is sometimes believed to offer us a general and objective measure of simplicity. The first variant of this simplicity measure to appear in the literature was in fact part of a theory of prediction: the central achievement of its originator, R.J. Solomonoff, was the definition of an idealized prediction method that is taken to implement Occam's razor by assigning greater probability to simpler hypotheses about the future. Moreover, in many writings on the subject an argument of the following sort takes shape: from (1) the definition of the Solomonoff predictor, which has a precise preference for simplicity, and (2) a formal proof that this predictor will generally lead us to the truth, it follows that (Occam's razor) a preference for simplicity will generally lead us to the truth. This would, sensationally, amount to a justification of Occam's razor. In this talk, I show why the argument fails. The key to its dissolution is a representation theorem that links Kolmogorov complexity to Bayesian prediction.
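For readers unfamiliar with the object under discussion, the following is a standard textbook rendering of Solomonoff's predictor (not taken from the episode itself): the algorithmic prior weights each hypothesis by the length of the shortest programs producing it, and prediction proceeds by conditioning on the observed data.

```latex
% Solomonoff's algorithmic prior over finite binary strings x,
% defined via a universal prefix (monotone) machine U:
%   sum over all programs p on which U outputs a string extending x.
M(x) \;=\; \sum_{p \,:\, U(p) = x\ast} 2^{-|p|}

% Shorter programs (simpler descriptions) contribute larger weight
% 2^{-|p|}; this is the sense in which the prior "prefers simplicity".

% Prediction of the next bit b after observing x is by conditioning:
M(b \mid x) \;=\; \frac{M(xb)}{M(x)}
```

The representation theorem mentioned in the abstract (due to Levin) shows that $M$ can equivalently be written as a Bayesian mixture of all lower-semicomputable semimeasures, which is what allows the talk's reframing of the predictor in Bayesian terms.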

Further episodes of MCMP – Philosophy of Science

Further podcasts by MCMP Team

Website of MCMP Team