
Question

HW5 Theory + SVM
PAC Learning and VC dimension (30 pts)
Let X = R^2. Let
C = H = { h_{r_1, r_2} : 0 <= r_1 <= r_2 }, where h_{r_1, r_2} = { (x_1, x_2) : r_1 <= x_1^2 + x_2^2 <= r_2 },
the set of all origin-centered rings.
(a) (8 pts) What is VC(H)? Prove your answer.
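As a warm-up for part (a), the following sketch checks empirically that rings can realize all four labelings of two points at distinct distances from the origin, which is consistent with VC(H) >= 2. The specific points and ring parameters are hand-picked for illustration; this is a sanity check, not the requested proof.

```python
# Empirical sanity check (not a proof): origin-centered rings realize all
# four labelings of two points at distinct squared radii.
from itertools import product

def ring(r1, r2):
    # h_{r1,r2}: labels a point positive iff r1 <= x1^2 + x2^2 <= r2
    return lambda p: r1 <= p[0]**2 + p[1]**2 <= r2

points = [(1.0, 0.0), (2.0, 0.0)]        # squared radii 1 and 4
candidates = [ring(r1, r2) for r1, r2 in
              [(0, 0.5), (0, 2), (2, 5), (0, 5)]]  # hand-picked rings

realized = {tuple(h(p) for p in points) for h in candidates}
all_labelings = set(product([False, True], repeat=2))
print(realized == all_labelings)  # True: this 2-point set is shattered
```

A full answer to (a) still needs an upper-bound argument showing which point sets rings cannot shatter.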
(b) (14 pts) Describe a polynomial-sample-complexity algorithm L that learns C using H. State the time complexity and the sample complexity of your suggested algorithm. Prove all your steps.
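One natural candidate for part (b), sketched below under stated assumptions rather than as the required solution: output the tightest ring consistent with the sample, i.e. take r_1 and r_2 to be the minimum and maximum squared radius over the positive examples. The helper names and the sample representation (a list of ((x1, x2), label) pairs) are illustrative choices, not from the assignment.

```python
# Sketch of a consistent learner for origin-centered rings: fit the
# tightest ring containing all positive examples. Runs in O(m) time
# for m samples.
def learn_ring(sample):
    """sample: list of ((x1, x2), label) pairs with boolean labels."""
    pos_sq = [x1**2 + x2**2 for (x1, x2), y in sample if y]
    if not pos_sq:
        # No positives: return a degenerate (r1 > r2) "empty" ring as a
        # convenience; formally the class requires 0 <= r1 <= r2.
        return (1.0, 0.0)
    return (min(pos_sq), max(pos_sq))    # tightest consistent (r1, r2)

def predict(h, point):
    r1, r2 = h
    return r1 <= point[0]**2 + point[1]**2 <= r2

h = learn_ring([((1, 1), True), ((0, 2), True), ((0.5, 0), False)])
print(h)  # (2, 4): squared radii of the innermost/outermost positives
```

Proving this has polynomial sample complexity would still require a consistency/PAC argument, which is what the question asks for.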
In class we saw a bound on the sample complexity when H is finite.
m >= (1/ε)(ln|H| + ln(1/δ))
When |H| is infinite, we have a different bound:
m >= (1/ε)(4 log_2(2/δ) + 8 VC(H) log_2(13/ε))
(c) (8 pts) You want, with 95% confidence, a hypothesis with at most 5% error. Calculate the sample complexity using the bound you found in part (b) and the above bound for infinite |H|. Which one gives a smaller m? Explain.
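For part (c), the infinite-|H| bound can be evaluated directly with ε = δ = 0.05. A small sketch, with VC(H) left as a parameter since part (a) asks you to derive it; the value 2 used in the call below is only an illustrative guess, not the confirmed answer.

```python
import math

def m_infinite(eps, delta, vc):
    # m >= (1/eps)(4 log2(2/delta) + 8 VC(H) log2(13/eps))
    return (1 / eps) * (4 * math.log2(2 / delta)
                        + 8 * vc * math.log2(13 / eps))

def m_finite(eps, delta, h_size):
    # Finite-|H| bound: m >= (1/eps)(ln|H| + ln(1/delta))
    return (1 / eps) * (math.log(h_size) + math.log(1 / delta))

eps = delta = 0.05
# vc=2 is a hypothetical value for illustration only.
print(math.ceil(m_infinite(eps, delta, vc=2)))  # -> 2993
```

Comparing this number against the bound derived in part (b) is what the question then asks you to explain.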

