
Question


6. Kernel Functions (50 pts, pages 8 & 9)

(a) [10 pts, page 8] If k1(x, z) and k2(x, z) are valid kernels and α, β > 0, show that k3(x, z) = α k1(x, z) + β k2(x, z) is also a valid kernel.

(b) [10 pts, page 8] Given x ∈ R^2 and z ∈ R^2, prove that k(x, z) = (x · z)^3 + 17 (x · z) is a valid kernel.

(c) [10 pts, page 8] Consider the m-of-n function, where x ∈ {0, 1}^n is an n-dimensional Boolean vector and f(x) = 1 if and only if at least m entries of x equal 1. We can write down a linear threshold function of the form f(x) = sgn(w · x − θ). Find w and θ.

(d) [20 pts, page 9] A support vector machine for a binary classification problem makes predictions by learning

y(x) = w^T x + b    (3)

where x ∈ R^M is the input feature vector, w and b are the parameters of the linear decision boundary (y(x) = 0), and the classification decision is made using sign(y(x)). The training data consists of N pairs {x_n, t_n} of inputs and their corresponding labels, where t_n ∈ {−1, +1}. For a linearly separable dataset, the parameters can be learned by solving the following constrained minimization problem:

arg min_{w,b} (1/2) ||w||^2    (4)

s.t. t_n (w^T x_n + b) ≥ 1,  n = 1, ..., N.    (5)

i. Plot a linearly separable dataset (in R^2) and a learned decision boundary that uses a linear kernel. Draw the margin, and circle the support vectors.

ii. Plot a linearly inseparable dataset (in R^2) and a learned decision boundary that uses a Gaussian (RBF) kernel. Draw the margin, and circle the support vectors.
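For parts (a) and (b), a quick numerical sanity check (not a proof — a full answer should argue positive semidefiniteness directly, e.g. via Mercer's condition or explicit feature maps) is to build Gram matrices on random points and confirm all eigenvalues are non-negative. This sketch assumes NumPy; the sample points, the base kernels k1 and k2, and the weights α = 2, β = 3 are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))  # 20 random points in R^2

def gram(kernel, X):
    """Gram matrix K[i, j] = kernel(x_i, x_j)."""
    n = len(X)
    return np.array([[kernel(X[i], X[j]) for j in range(n)] for i in range(n)])

# Two base kernels known to be valid: linear and Gaussian (RBF).
k1 = lambda x, z: x @ z
k2 = lambda x, z: np.exp(-np.sum((x - z) ** 2))

alpha, beta = 2.0, 3.0
k3 = lambda x, z: alpha * k1(x, z) + beta * k2(x, z)  # part (a)
k  = lambda x, z: (x @ z) ** 3 + 17 * (x @ z)         # part (b)

for ker in (k3, k):
    eigvals = np.linalg.eigvalsh(gram(ker, X))
    # A valid kernel yields a PSD Gram matrix: eigenvalues >= 0 (up to round-off).
    assert eigvals.min() > -1e-8
print("both Gram matrices are PSD on this sample")
```

A failed check would disprove validity on that sample; a passed check is only evidence, which is why the written proof still matters.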
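For part (c), one standard choice (among several that work) is w = (1, ..., 1) and θ = m − 1/2: the dot product w · x counts the ones in x, so the score is positive exactly when that count is at least m. A brute-force check over all Boolean vectors for the made-up case n = 5, m = 3, assuming sgn maps positive arguments to 1:

```python
import itertools

def m_of_n(x, m):
    """Ground truth: 1 iff at least m entries of x are 1."""
    return 1 if sum(x) >= m else 0

def threshold(x, m):
    """Linear threshold with w = all-ones vector, theta = m - 1/2."""
    score = sum(x) - (m - 0.5)  # w . x - theta with w = (1, ..., 1)
    return 1 if score > 0 else 0

n, m = 5, 3
for x in itertools.product([0, 1], repeat=n):
    assert threshold(x, m) == m_of_n(x, m)
print("w = all-ones, theta = m - 1/2 realizes the 3-of-5 function")
```

Any θ strictly between m − 1 and m works equally well, since w · x is always an integer; m − 1/2 is just the midpoint.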
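For part (d), the expected answer is a hand-drawn plot, but the geometry can be checked numerically. The sketch below uses a made-up four-point dataset in R^2; for it, w = (1/2, 1/2), b = −1 satisfies every constraint in (5), with equality exactly at the two support vectors (the points one would circle), and the margin width is 2/||w||:

```python
import numpy as np

# Toy linearly separable dataset in R^2 (invented for illustration).
X = np.array([[2.0, 2.0], [3.0, 3.0], [0.0, 0.0], [-1.0, 0.0]])
t = np.array([+1, +1, -1, -1])

# Candidate hard-margin solution for this particular dataset.
w = np.array([0.5, 0.5])
b = -1.0

margins = t * (X @ w + b)            # t_n (w^T x_n + b) for every point
assert np.all(margins >= 1 - 1e-12)  # constraints (5) all hold

# Support vectors are the points whose constraint is tight (margin exactly 1).
support = np.isclose(margins, 1.0)
print("support vectors:", X[support])                    # (2, 2) and (0, 0)
print("margin width 2/||w|| =", 2 / np.linalg.norm(w))   # 2 * sqrt(2)
```

Plotting the line w^T x + b = 0 (here x1 + x2 = 2) with the two parallel lines w^T x + b = ±1 reproduces the figure the question asks for in (d)i; the RBF case in (d)ii has no such closed-form boundary and is typically drawn as a curved contour around the inseparable class.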
