using LinearAlgebra, Zygote, ForwardDiff, Printf
using CSV, DataFrames
using ProgressBars  ## provides ProgressBar, used in the unit tests below
using StatsBase: mean
using Parameters
using Distributions
using Random
using BSON
### Below you will implement your linear regression object from scratch.
###
### You will also implement some penalty methods.
#### ###
#### Before getting started you should write your student number in integer format.
const studentnumber::Int = 0  ## replace 0 by your student number
### ###
### This hw is a bit harder than the usual ones, so you may want to
### take a look at the official Julia website.
### Before you get started, you gotta look at the do...end syntax of Julia.
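## For reference, do...end creates an anonymous function and passes it as the
## first argument of the preceding call; a minimal illustration (the variable
## name squaresdemo is just for this demo):
squaresdemo = map(1:3) do x
    x^2
end
## squaresdemo == [1, 4, 9], i.e. the same as map(x -> x^2, 1:3)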
### Instead of mean squared loss we shall use the Huber loss, which
### sometimes performs better when your dataset contains some out-of-distribution points.
### We split it into two parts: the first one is for scalars, the second one is for vectors.
### Remember that you will use multiple dispatch!!!
function huberloss(ypred::T, ytrue::T; delta::Float64 = 1.0) where T <: Real
    return nothing
end
function huberloss(ypred::AbstractArray{T}, ytrue::AbstractArray{T}) where T <: Real
    return nothing
end
function unittesthuberloss()::Bool
    Random.seed!(0)
    @assert huberloss(1.0, 1.0) == 0.0
    @assert huberloss(3.0, 1.0; delta = 1.0) == 1.5
    ypred, ytrue = randn(10), randn(10)
    @assert isapprox(huberloss(ypred, ytrue), mean(huberloss.(ypred, ytrue)); atol = 1e-8)
    ypred, ytrue = randn(10), randn(10)
    @assert isapprox(huberloss(ypred, ytrue), mean(huberloss.(ypred, ytrue)); atol = 1e-8)
    @info "You can not stop now comrade!!! jump to the next exercise!!!"
    return true
end
## Let's see whether you have implemented huberloss well!!
unittesthuberloss()
### Create a roof (an abstract supertype) for the linear regression model.
abstract type LinearRegression end
mutable struct linearregression <: LinearRegression
    ## This part is given to you!!!
    ## Realize that we have fields: theta and bias.
    theta::AbstractVector
    bias::Real
    linearregression(n::Int) = new(randn(n), zero(Float64))
end
### Write the forward-pass function.
function (lr::linearregression)(X::Matrix{T}) where T <: Real
    ## This dude is the forward pass function!!!
    return nothing
end
function unittestforwardpass()::Bool
    try
        linearregression(10)(randn(20, 10))
    catch ERROR
        error("SEGFAULT!!!!")
    end
    @info "Real test started!!!!"
    for i in ProgressBar(1:100)
        sleep(0.01)
        lr = linearregression(10)
        x = randn(20, 10)
        @assert lr(x) == x * lr.theta .+ lr.bias
    end
    @info "Oki doki!!!"
    return true
end
### Let's give it a try!!!!!!
unittestforwardpass()
## We shall now implement the fit! method!!!
## Before we get ready, run the next lines to see that in this setting the grad
## function returns a NamedTuple of gradients (one entry per field):
#
lr = linearregression(10)
val, grad = Zygote.withgradient(lr) do lr
    norm(lr.theta) + lr.bias
end
#
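## To see the shape of that (value, gradient) pair on a concrete toy struct
## (hypothetical names; gradients worked out by hand, since
## d/dtheta norm(theta) = theta / norm(theta)):
mutable struct ZygoteDemoModel
    theta::AbstractVector
    bias::Real
end
demomodel = ZygoteDemoModel([3.0, 4.0], 1.0)
demoval, demograd = Zygote.withgradient(demomodel) do m
    norm(m.theta) + m.bias
end
## demoval == 6.0; demograd[1] is a NamedTuple with one entry per field:
## demograd[1].bias == 1.0 and demograd[1].theta == [0.6, 0.8]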
function updategrads!(lr::LinearRegression, learningrate::Float64, grads)
    ## Here you will implement updategrads!; this function returns nothing.
    ## Search for the setfield! and getfield functions.
    ## x -= learningrate * grads will happen here!!!
    return nothing
end
function fit!(lr::linearregression,
              X::AbstractMatrix,
              y::AbstractVector;
              learningrate::Float64 = 0.01,
              maxiter::Integer = 100,
              lambda::Float64 = 0.0)
    ##
    return nothing
end
## Let's give it a try!!!
lr = linearregression(10)
X = randn(100, 10)
y = randn(100)
fit!(lr, X, y; learningrate = 0.01, maxiter = 100)
### Things seem to work fine!!!
function unittestforfit()
    Random.seed!(0)
    lr = linearregression(10)
    X = randn(100, 10)
    y = randn(100)
    fit!(lr, X, y; learningrate = 0.01, maxiter = 1000, lambda = 0.1)
    @assert norm(lr.theta) + abs(lr.bias) < 1.0 "Your penalty method does not work!!!"
    @assert mean(abs2, lr(X) .- y) < 1.0 "You do not fit perfectly!!!!"
    @info "Okito dokito buddy!!!"
    return true
end
## Run the next line to see what happens??? ##
unittestforfit()
## ##
## No need to run below!!!
if abspath(PROGRAM_FILE) == @__FILE__
    G::Int = unittesthuberloss() + unittestforwardpass() + unittestforfit()
    dict = Dict("stdID" => studentnumber, "G" => G)
    try
        BSON.@save "$(studentnumber)res.bson" dict
    catch err
        println("something went wrong with ", err)
    end
end