2.12 Bayesian bound
Question:
2.12 Bayesian bound. Let $H$ be a countable hypothesis set of functions mapping $X$ to $\{0, 1\}$, and let $p$ be a probability measure over $H$. This probability measure represents the prior probability over the hypothesis class, i.e., the probability that a particular hypothesis is selected by the learning algorithm. Use Hoeffding's inequality to show that for any $\delta > 0$, with probability at least $1 - \delta$, the following inequality holds:
\[
\forall h \in H, \quad R(h) \le \widehat{R}_S(h) + \sqrt{\frac{\log\frac{1}{p(h)} + \log\frac{1}{\delta}}{2m}}. \tag{2.26}
\]
Compare this result with the bound given in the inconsistent case for finite hypothesis sets. (Hint: you could use $\delta_0 = p(h)\,\delta$ as the confidence parameter in Hoeffding's inequality.)
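A sketch of the argument suggested by the hint (not the book's worked solution); the per-hypothesis deviation $\epsilon(h)$ is a symbol introduced only for this sketch:

% Sketch only; uses the standard one-sided Hoeffding inequality (amsmath required).
\begin{align*}
&\text{Fix } h \in H. \text{ For any } \epsilon > 0, \text{ Hoeffding's inequality gives} \\
&\qquad \Pr\!\big[R(h) - \widehat{R}_S(h) > \epsilon\big] \le e^{-2m\epsilon^2}. \\[4pt]
&\text{Setting the right-hand side to } \delta_0 = p(h)\,\delta \text{ and solving for } \epsilon \text{ yields} \\
&\qquad \epsilon(h) = \sqrt{\frac{\log\frac{1}{p(h)} + \log\frac{1}{\delta}}{2m}}. \\[4pt]
&\text{A union bound over the countable set } H \text{ then gives} \\
&\qquad \Pr\!\Big[\exists\, h \in H : R(h) - \widehat{R}_S(h) > \epsilon(h)\Big]
  \le \sum_{h \in H} p(h)\,\delta = \delta,
\end{align*}

so with probability at least $1 - \delta$, inequality (2.26) holds simultaneously for all $h \in H$. For the comparison: with a uniform prior $p(h) = 1/|H|$ over a finite $H$, the term $\log\frac{1}{p(h)}$ becomes $\log|H|$, and the bound takes the same form as the inconsistent-case bound for finite hypothesis sets.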
Related Book: Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar, Foundations of Machine Learning, 2nd Edition, ISBN 9780262351362.