Question
The network takes inputs $x_1$ and $x_2$ (plus a bias input) and produces a single output. (The original question shows this as a diagram with the inputs on the left, weights $w_1, \dots, w_9$ on the edges, and the output on the right.) The weights are grouped as

$$W^{(1)} = [w_1, w_2, w_3, w_4, w_5, w_6], \qquad W^{(2)} = [w_7, w_8, w_9],$$

and the output is

$$\hat{Y} = \operatorname{Sigmoid}(Z), \qquad Z = W^{\top} X.$$

The training error is the mean negative log-likelihood (cross-entropy):

$$E = -\frac{1}{m} \sum_{i=1}^{m} \Big[\, Y_i \log(\hat{Y}_i) + (1 - Y_i) \log(1 - \hat{Y}_i) \,\Big].$$

Train with gradient descent using learning rates $\eta = 0.5,\ 0.3,\ 0.1,\ 0.05,\ 0.01$.
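For the single-layer unit that the code skeleton below actually implements (a weight vector $w$ applied directly to the three input columns), the gradient of $E$ has a standard closed form. This short derivation is added here for reference and is not part of the original hand-out:

$$\frac{\partial E}{\partial w} = \frac{1}{m} \sum_{i=1}^{m} (\hat{Y}_i - Y_i)\, X_i = \frac{1}{m}\, X^{\top} (\hat{Y} - Y),$$

which is exactly what the line `grad_e = np.mean(np.multiply((Y_hat - Y), X.T), axis=1)` in the skeleton computes.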
```python
# -*- coding: utf-8 -*-
# Assignment 2

# %% Import required libraries
import pandas as pd
import numpy as np
import scipy.special as sps

# %% Load data
data = pd.read_csv('./data.csv')
# Convert the dataframe to a numpy array.
data = np.array(data)

# Input and output arrays: the first three columns are X1, X2, and the
# bias column; the fourth column is the label.
X = data[:, :3]
Y = data[:, 3]

# Initialize w.
w = np.random.rand(3)

# Maximum number of iterations.
max_iter = 500

# Define an error vector to save all error values over all iterations.
error_all = []

# %% Learning rate for gradient descent.
eta = 0.5

# %% Gradient descent
for it in range(max_iter):
    Y_hat = sps.expit(np.dot(X, w))
    # Compute the error (negative log-likelihood).
    e = -np.mean(Y * np.log(Y_hat) + (1 - Y) * np.log(1 - Y_hat))
    # Add this error to the end of the error vector.
    error_all.append(e)
    # Gradient of the error.
    grad_e = np.mean(np.multiply((Y_hat - Y), X.T), axis=1)
    w_old = w
    w = w - eta * grad_e
    print('epoch {0:d}, negative log-likelihood {1:.4f}, w={2}'.format(it, e, w.T))

# %% Plot error over iterations
```
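One practical caveat, not raised in the original skeleton: with a large step size such as eta = 0.5, Y_hat can saturate to exactly 0.0 or 1.0 in floating point, and np.log then returns -inf. A common guard is to clip the predictions before taking logs:

```python
# Clip predictions away from 0 and 1 before taking logs (guard against log(0)).
eps = 1e-12  # small clipping constant; the exact value is an arbitrary choice
Y_hat_safe = np.clip(Y_hat, eps, 1 - eps)
e = -np.mean(Y * np.log(Y_hat_safe) + (1 - Y) * np.log(1 - Y_hat_safe))
```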
The question also lists the full contents of data.csv (roughly 190 rows). Each row has four columns: X1, X2, a bias column that is always 1, and a binary label Y; the first block of rows is labeled 0 and the remaining rows are labeled 1. A few representative rows:

X1        X2        bias  Y
1.9346    -0.96441  1     0
5.79882   0.55621   1     0
...
6.67875   -0.31466  1     1
7.15065   -0.87198  1     1
...
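The assignment asks for the error curve at each of the five learning rates. Below is a minimal sketch of that sweep, assuming matplotlib is available; it reuses np, sps, X, and Y from the skeleton above, and the train_logistic helper name is ours, not from the original hand-out:

```python
import matplotlib.pyplot as plt

def train_logistic(X, Y, eta, max_iter=500, seed=0):
    """Run the gradient-descent loop from the skeleton and return the per-iteration error."""
    rng = np.random.default_rng(seed)  # fixed seed so the curves are comparable
    w = rng.random(3)
    errors = []
    for _ in range(max_iter):
        Y_hat = sps.expit(np.dot(X, w))
        errors.append(-np.mean(Y * np.log(Y_hat) + (1 - Y) * np.log(1 - Y_hat)))
        w = w - eta * np.mean((Y_hat - Y) * X.T, axis=1)
    return errors

# Plot the error over iterations for each learning rate on one set of axes.
for eta in [0.5, 0.3, 0.1, 0.05, 0.01]:
    plt.plot(train_logistic(X, Y, eta), label='eta = {0}'.format(eta))
plt.xlabel('iteration')
plt.ylabel('negative log-likelihood')
plt.legend()
plt.show()
```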