8.7 Second-order regret bound

Question:

8.7 Second-order regret bound. Consider the randomized algorithm that differs from the RWM algorithm only by the weight update, i.e., $w_{t+1,i} \leftarrow (1 - (1-\beta)\, l_{t,i})\, w_{t,i}$, $t \in [T]$, which is applied to all $i \in [N]$, with $1/2 \le \beta < 1$. This algorithm can be used in a more general setting than RWM since the losses $l_{t,i}$ are only assumed to be in $[0, 1]$. The objective of this problem is to show that a similar upper bound can be shown for the regret.
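The update rule can be sketched in code as follows. This is a minimal illustration, not the book's implementation; the function name `rwm_beta` and the loss-matrix interface are assumptions for the example.

```python
import math
import random

def rwm_beta(losses, beta):
    """Modified RWM update: w_{t+1,i} <- (1 - (1-beta)*l_{t,i}) * w_{t,i}.

    losses: T x N matrix with entries in [0, 1]; beta in [1/2, 1).
    Returns the algorithm's expected cumulative loss and the final weights.
    """
    N = len(losses[0])
    w = [1.0] * N                      # w_{1,i} = 1, so W_1 = N
    expected_loss = 0.0
    for l_t in losses:
        W = sum(w)
        p = [wi / W for wi in w]       # sampling distribution p_{t,i} = w_{t,i}/W_t
        expected_loss += sum(pi * li for pi, li in zip(p, l_t))
        # multiplicative update; each factor is >= 1 - (1-beta) >= 1/2 > 0
        w = [(1 - (1 - beta) * li) * wi for wi, li in zip(w, l_t)]
    return expected_loss, w
```

Because $\beta \ge 1/2$ and $l_{t,i} \le 1$, every multiplicative factor stays in $[1/2, 1]$, so the weights remain positive.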

(a) Use the same potential $W_t$ as for the RWM algorithm and derive a simple upper bound on $\log W_{T+1}$:

$\log W_{T+1} \le \log N - (1-\beta)\, L_T$.

(Hint: Use the inequality $\log(1-x) \le -x$ for $x \in [0, 1/2]$.)
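One way the upper bound follows (a sketch, writing $p_{t,i} = w_{t,i}/W_t$ for the sampling distribution and $\ell_t = \sum_i p_{t,i}\, l_{t,i}$ for the algorithm's expected loss at round $t$, so $L_T = \sum_{t=1}^T \ell_t$; since $\beta \ge 1/2$ and $\ell_t \in [0,1]$, we have $(1-\beta)\ell_t \in [0,1/2]$ and the hint applies, with $W_1 = N$ from $w_{1,i} = 1$):

```latex
\begin{aligned}
W_{t+1} &= \sum_{i=1}^{N} \bigl(1 - (1-\beta)\, l_{t,i}\bigr) w_{t,i}
         = W_t \bigl(1 - (1-\beta)\,\ell_t\bigr), \\
\log W_{T+1} &= \log N + \sum_{t=1}^{T} \log\bigl(1 - (1-\beta)\,\ell_t\bigr)
  \le \log N - (1-\beta) \sum_{t=1}^{T} \ell_t
  = \log N - (1-\beta)\, L_T .
\end{aligned}
```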

(b) Prove the following lower bound on the potential, for all $i \in [N]$:

$\log W_{T+1} \ge -(1-\beta)\, L_{T,i} - (1-\beta)^2 \sum_{t=1}^{T} l_{t,i}^2$.

(Hint: Use the inequality $\log(1-x) \ge -x - x^2$, which is valid for all $x \in [0, 1/2]$.)
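A sketch of the lower bound: the potential dominates any single weight, and with $x = (1-\beta)\, l_{t,i} \in [0, 1/2]$ (again because $\beta \ge 1/2$ and $l_{t,i} \le 1$) the hinted inequality applies to each factor:

```latex
\begin{aligned}
\log W_{T+1} &\ge \log w_{T+1,i}
  = \sum_{t=1}^{T} \log\bigl(1 - (1-\beta)\, l_{t,i}\bigr) \\
&\ge \sum_{t=1}^{T} \Bigl( -(1-\beta)\, l_{t,i} - (1-\beta)^2\, l_{t,i}^2 \Bigr)
  = -(1-\beta)\, L_{T,i} - (1-\beta)^2 \sum_{t=1}^{T} l_{t,i}^2 .
\end{aligned}
```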

(c) Use the upper and lower bounds to derive the following regret bound for the algorithm: $R_T \le 2\sqrt{T \log N}$.
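Combining the bounds from (a) and (b) and using $l_{t,i}^2 \le 1$, so that $\sum_{t=1}^T l_{t,i}^2 \le T$, one can sketch the final step as follows, for any expert $i$:

```latex
\begin{aligned}
-(1-\beta)\, L_{T,i} - (1-\beta)^2 \sum_{t=1}^{T} l_{t,i}^2
  &\le \log N - (1-\beta)\, L_T \\
\Longrightarrow \quad L_T - L_{T,i}
  &\le \frac{\log N}{1-\beta} + (1-\beta) \sum_{t=1}^{T} l_{t,i}^2
   \le \frac{\log N}{1-\beta} + (1-\beta)\, T .
\end{aligned}
```

Choosing $1 - \beta = \sqrt{(\log N)/T}$ (which satisfies $\beta \ge 1/2$ when $T \ge 4 \log N$) makes both terms equal to $\sqrt{T \log N}$, yielding $R_T \le 2\sqrt{T \log N}$.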



Related Book:

Foundations Of Machine Learning

ISBN: 9780262351362

2nd Edition

Authors: Mehryar Mohri, Afshin Rostamizadeh
