
Question

1 Approved Answer

Q1 (Homework 4: Artificial Neural Networks, September 30, 2023): Consider the same data we used in our lecture and answer the following questions. Using the gradient-descent procedure we discussed in the Week 5 lecture (see slides), train a perceptron model y = sign(w^T x + w0) with learning rate λ = 0.2 and initial weights w = [-0.5, -0.5, 0.5].
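The lecture's labeling rule for the three-input data (Y = 1 exactly when at least two of X1, X2, X3 equal 1, otherwise Y = -1) can be tabulated with a short sketch. This is illustrative only, and the variable names are my own:

```python
from itertools import product

# Lecture rule: Y = 1 if at least two of the three binary inputs are 1, else -1.
rows = [(x1, x2, x3, 1 if x1 + x2 + x3 >= 2 else -1)
        for x1, x2, x3 in product([0, 1], repeat=3)]

for row in rows:
    print(*row)   # columns: X1 X2 X3 Y
```

Of the eight combinations, exactly four get the label Y = 1.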

Summary of Our Previous Lecture (the slides Q1 refers to):

Three binary inputs X1, X2, X3 feed a "black box" with weights w1, w2, w3 and one output node Y; the output Y is 1 if at least two of the three inputs are equal to 1, and -1 otherwise. The model is y = sign(w^T x + w0), where x and w are D x 1 vectors, and

sign(v) = 1 if v > 0, -1 if v <= 0.

Lecture data (X1, X2, X3, Y):

X1 X2 X3  Y
 1  0  0 -1
 1  0  1  1
 1  1  0  1
 1  1  1  1
 0  0  1 -1
 0  1  0 -1
 0  1  1  1
 0  0  0 -1

Perceptron learning rule (λ is the learning rate; x0 = 1 is appended to each example so that w0 is learned along with the other weights):

e = (λ/2) [y_i - f(w(k), x_i)],    w(k+1) = w(k) + e x_i,    f(w(k), x_i) = sign(w(k)^T x_i)

Worked example from the slides, on the two-input data (X1, X2, Y) = (0,0,-1), (1,1,1), (1,0,1), (0,1,-1), with λ = 0.2 and w(0) = [0, 0, 0] (weight order [w1, w2, w0]). The examples are scanned in order; at the first conflict (f != y) the weights are updated and the scan restarts:

w(0) = [0, 0, 0]        conflict at (1,1): y = 1, f = -1  ->  w(1) = [0.2, 0.2, 0.2]
w(1) = [0.2, 0.2, 0.2]  conflict at (0,0): y = -1, f = 1  ->  w(2) = [0.2, 0.2, 0]
w(2) = [0.2, 0.2, 0]    conflict at (0,1): y = -1, f = 1  ->  w(3) = [0.2, 0, -0.2]
w(3) = [0.2, 0, -0.2]   conflict at (1,1): y = 1, f = -1  ->  w(4) = [0.4, 0.2, 0]
w(4) = [0.4, 0.2, 0]    conflict at (0,1): y = -1, f = 1  ->  w(5) = [0.4, 0, -0.2]

w(5) = [0.4, 0, -0.2] classifies all four examples correctly. Done!
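The slide-by-slide updates in the worked example can be reproduced in plain Python. This is a minimal sketch under two conventions taken from the slides: sign(0) counts as -1, and after each weight update the scan restarts from the first example; the function names are my own:

```python
def sign(v):
    # Slide convention: sign(v) = 1 if v > 0, else -1 (so sign(0) = -1)
    return 1 if v > 0 else -1

def train_perceptron(data, lam=0.2, w=(0.0, 0.0, 0.0)):
    """Perceptron rule from the slides:
    w(k+1) = w(k) + (lam/2) * (y_i - f(w(k), x_i)) * x_i,
    with x0 = 1 appended so the bias w0 is learned like any other weight.
    Weight order follows the slides: [w1, w2, w0]."""
    w = list(w)
    history = [tuple(w)]
    while True:
        for (x1, x2), y in data:
            xi = (x1, x2, 1)                   # append x0 = 1
            f = sign(sum(wj * xj for wj, xj in zip(w, xi)))
            if f != y:                         # conflict: update, rescan from the top
                e = (lam / 2) * (y - f)
                w = [wj + e * xj for wj, xj in zip(w, xi)]
                history.append(tuple(w))
                break
        else:                                  # full pass with no conflict: converged
            return w, history

data = [((0, 0), -1), ((1, 1), 1), ((1, 0), 1), ((0, 1), -1)]
w, hist = train_perceptron(data)
print(w)   # [0.4, 0.0, -0.2], matching w(5) in the slides
```

Note that a different starting point or scan order generally yields a different (but still separating) final weight vector, which is worth keeping in mind for Q1's w = [-0.5, -0.5, 0.5] starting point.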

Step by Step Solution


There are 3 steps involved.

Step: 1

Based on the information provided, it appears that you are attempting to train a perceptron mo...


Step: 2


Step: 3



