Question
Complete the TODO by finishing the GaussianNaiveBayes class. Refer to TODOs 4-13 to recall what needs to be passed to each function!
TODOs for the fit() method
14.1 Compute the log priors using the compute_log_priors() function. Store the output in self.log_priors.
14.2 Compute the normal priors by undoing the log that was applied to self.log_priors (we need the normal priors for computing probabilities). Store the output in self.priors.
14.3 Compute the means and standard deviations for each class and feature using the compute_parameters() function. Store the outputs in self.means and self.stds.
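For reference, the two fit() helpers could be sketched as below. The exact signatures from TODOs 4-13 are not shown in this excerpt, so the argument lists here are assumptions:

```python
import numpy as np

def compute_log_priors(y, class_labels):
    # Log of each class's relative frequency in y (assumed signature)
    return np.log(np.array([np.mean(y == c) for c in class_labels]))

def compute_parameters(X, y, class_labels):
    # Per-class, per-feature mean and standard deviation (assumed signature)
    means = np.array([X[y == c].mean(axis=0) for c in class_labels])
    stds = np.array([X[y == c].std(axis=0) for c in class_labels])
    return means, stds
```

Undoing the log in step 14.2 is then just np.exp(self.log_priors).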
TODOs for the predict() method
14.4 Compute the log likelihoods for each class using the compute_log_likelihoods() function. Store the output in log_likelihoods.
14.5 Make the predictions for the passed data using the compute_predictions() function.
Hint: Recall that the fit() method defined our log priors self.log_priors and the class labels self.class_labels.
TODOs for the probabilities() method
14.6 Compute the log likelihoods for each class using the compute_log_likelihoods() function. Store the output in log_likelihoods.
14.7 Compute the likelihoods by undoing the log that was applied to log_likelihoods (we need the normal likelihoods for computing probabilities). Store the output in likelihoods.
14.8 Compute the probabilities using the compute_probabilities() function. Store the output in probs.
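The predict() and probabilities() helpers could likewise be sketched as follows; again, the signatures are guesses, since the earlier TODOs defining them are not part of this excerpt:

```python
import numpy as np

def compute_log_likelihoods(X, means, stds, log_priors):
    # One column per class: log p(c) plus the sum over features of the
    # Gaussian log-density log N(x_j | mu_cj, sigma_cj)
    cols = []
    for mu, sd, lp in zip(means, stds, log_priors):
        ll = -0.5 * np.log(2 * np.pi * sd**2) - (X - mu)**2 / (2 * sd**2)
        cols.append(ll.sum(axis=1) + lp)
    return np.column_stack(cols)

def compute_predictions(log_likelihoods, class_labels):
    # Pick the class with the highest joint log-likelihood per row
    return class_labels[np.argmax(log_likelihoods, axis=1)]

def compute_probabilities(likelihoods):
    # Normalize the (unlogged) joint likelihoods so each row sums to 1
    return likelihoods / likelihoods.sum(axis=1, keepdims=True)
```

Step 14.7's "undoing the log" is np.exp(log_likelihoods), after which compute_probabilities() turns the joint likelihoods into posterior probabilities.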
This is the test code (plot_decision_boundary, todo_check, and garbage_collect are assumed to be helpers defined earlier in the notebook):

from typing import List

import numpy as np
from sklearn.datasets import make_circles

def TEST_GaussianNaiveBayes():
    def nonlinear_data(
        n_samples: int = 100,
        balance: List = None,
        seed: int = 42
    ) -> List[np.ndarray]:
        X, y = make_circles(random_state=seed, factor=.5, noise=.05)
        return X, y

    X, y = nonlinear_data()

    class_names = {
        0: 'C1',
        1: 'C2',
    }

    gnb = GaussianNaiveBayes()
    gnb.fit(X, y)

    plot_decision_boundary(gnb, X, y, class_names)

    todo_check([
        (np.all(np.isclose(gnb.log_priors, np.array([-0.69314718, -0.69314718]), rtol=.1)), "gnb.log_priors has incorrect values"),
        (np.all(np.isclose(gnb.priors, np.array([0.5, 0.5]), rtol=.1)), "gnb.priors has incorrect values"),
        (np.all(np.isclose(gnb.means.flatten(), np.array([-0.00126599, -0.00426598, 0.00605525, 0.00029719]), rtol=.1)), "gnb.means has incorrect values"),
        (np.all(np.isclose(gnb.probabilities(X)[1], np.array([0.4478195, 0.5521805]), rtol=.01)), "probabilities has incorrect values"),
        (np.all(np.isclose(gnb.predict(X)[:3].flatten(), np.array([1, 1, 1]))), "wrong label predictions were detected.")
    ])

TEST_GaussianNaiveBayes()
garbage_collect(['TEST_GaussianNaiveBayes'])
class GaussianNaiveBayes():
    def __init__(self):
        pass

    def fit(self, X, y):
        self.class_labels = np.unique(y)
        # TODO 14.1
        self.log_priors =
        # TODO 14.2
        self.priors =
        # TODO 14.3
        self.means, self.stds =

    def predict(self, X):
        # TODO 14.4
        log_likelihoods =
        # TODO 14.5
        y_hat =
        return y_hat

    def probabilities(self, X):
        # TODO 14.6
        log_likelihoods =
        # TODO 14.7
        likelihoods =
        # TODO 14.8
        probs =
        return probs
The class skeleton above corresponds to TODO 14.
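Putting the pieces together, one possible completion of the class looks like the sketch below. The helper calls are inlined here because the exact signatures from TODOs 4-13 are not shown in this excerpt; treat the bodies as assumptions about what those helpers compute:

```python
import numpy as np

class GaussianNaiveBayes:
    def fit(self, X, y):
        self.class_labels = np.unique(y)
        # 14.1: log priors from class frequencies (inlined compute_log_priors)
        self.log_priors = np.log(np.array([np.mean(y == c) for c in self.class_labels]))
        # 14.2: undo the log to recover the plain priors
        self.priors = np.exp(self.log_priors)
        # 14.3: per-class, per-feature means and stds (inlined compute_parameters)
        self.means = np.array([X[y == c].mean(axis=0) for c in self.class_labels])
        self.stds = np.array([X[y == c].std(axis=0) for c in self.class_labels])

    def _log_likelihoods(self, X):
        # log p(c) + sum over features of the Gaussian log-density
        cols = []
        for mu, sd, lp in zip(self.means, self.stds, self.log_priors):
            ll = -0.5 * np.log(2 * np.pi * sd**2) - (X - mu)**2 / (2 * sd**2)
            cols.append(ll.sum(axis=1) + lp)
        return np.column_stack(cols)

    def predict(self, X):
        # 14.4-14.5: argmax over the joint log-likelihoods
        return self.class_labels[np.argmax(self._log_likelihoods(X), axis=1)]

    def probabilities(self, X):
        # 14.6-14.8: undo the log, then normalize each row to sum to 1
        likelihoods = np.exp(self._log_likelihoods(X))
        return likelihoods / likelihoods.sum(axis=1, keepdims=True)
```

On balanced data this reproduces the values the test checks for the priors: two equally frequent classes give priors of [0.5, 0.5] and log priors of about -0.693 each.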