The marginal likelihood of a model is one of the key quantities appearing throughout machine learning and statistics, since it provides an intuitive and natural objective function for model selection and parameter estimation. I recently read a new paper by Sumio Watanabe, "A Widely Applicable Bayesian Information Criterion" (WBIC)[cite key="watanabe2012widely"] (to appear in JMLR soon), which provides a new, theoretically grounded and easy-to-implement method of approximating the marginal likelihood. In this post I'll summarise some of the important aspects of the marginal likelihood, briefly describe the WBIC, and share some thoughts and questions on its use.
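To make the idea concrete before the details: Watanabe's WBIC approximates the log marginal likelihood by the posterior expectation of the log likelihood, taken under the posterior tempered to inverse temperature β = 1/log n. The sketch below is a hypothetical toy illustration (not an experiment from the paper): a conjugate Gaussian-mean model, x_i ~ N(μ, 1) with prior μ ~ N(0, 1), chosen because both the tempered posterior and the exact log marginal likelihood have closed forms, so the WBIC expectation can be computed analytically and compared against the truth.

```python
import numpy as np

def wbic_gaussian_mean(x):
    """WBIC = E[ log p(D | mu) ] under the posterior tempered to beta = 1/log n,
    for the toy model x_i ~ N(mu, 1) with prior mu ~ N(0, 1)."""
    n, S, Q = len(x), x.sum(), (x ** 2).sum()
    beta = 1.0 / np.log(n)            # Watanabe's inverse temperature
    var = 1.0 / (1.0 + beta * n)      # tempered posterior variance of mu
    mean = beta * S * var             # tempered posterior mean of mu
    # E[ sum_i (x_i - mu)^2 ] = Q - 2*mean*S + n*(mean^2 + var)
    expected_sq = Q - 2.0 * mean * S + n * (mean ** 2 + var)
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * expected_sq

def exact_log_marginal(x):
    """Closed-form log p(D) for the same conjugate Gaussian model."""
    n, S, Q = len(x), x.sum(), (x ** 2).sum()
    return (-0.5 * n * np.log(2 * np.pi)
            - 0.5 * np.log(n + 1)
            - 0.5 * (Q - S ** 2 / (n + 1)))

x = np.full(100, 0.5)  # 100 data points, all at 0.5, for a deterministic check
print(wbic_gaussian_mean(x), exact_log_marginal(x))
```

On this dataset the two quantities agree to within a fraction of a nat; in general models one would replace the analytic expectation with a Monte Carlo average of the log likelihood over samples drawn from the tempered posterior.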
Hello blogging world! The posts in this blog will be ways for me to explore my current understanding of concepts and topics in machine learning. Putting thoughts into writing really does force us to think clearly about what we believe and why, and I put this online in the hope that some of my explorations and thought experiments might be useful to others, and that I might learn other perspectives from those who read it and share their feedback. Plus, I do enjoy writing, so maintaining a blog seems a good way to keep writing regularly.