Question

Complete the TODO by finishing the GaussianNaiveBayes class. Refer to TODOs 4-13 to recall what needs to be passed to each function!

TODOs for the fit() method

Compute the log priors using the compute_log_priors() function. Store the output into self.log_priors.

Compute the normal priors by undoing the log that was applied to self.log_priors (we need the normal priors for computing probabilities). Store the output into self.priors.

Compute the means and standard deviations for each class and feature using the compute_parameters() function. Store the outputs into self.means and self.stds.
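
Putting those three steps together, the finished fit() might look like the sketch below. The helper calls (compute_log_priors(y) and compute_parameters(X, y)) are assumptions about the functions written in TODOs 4-13, which are not shown here, so the exact arguments may differ in your notebook.

# inside class GaussianNaiveBayes (see the skeleton further down):
def fit(self, X, y):
    self.class_labels = np.unique(y)
    # TODO 14.1: one log prior per class, e.g. log(class count / number of samples)
    self.log_priors = compute_log_priors(y)
    # TODO 14.2: undo the log with exp() to recover the plain priors
    self.priors = np.exp(self.log_priors)
    # TODO 14.3: per-class, per-feature means and standard deviations
    self.means, self.stds = compute_parameters(X, y)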

TODOs for the predict() method

Compute the log likelihoods for each class using the compute_log_likelihoods() function. Store the output into log_likelihoods.

Make the predictions for the passed data by using the compute_predictions() function.

Hint: Recall that the fit() method defined our log priors self.log_priors and the class labels self.class_labels.
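
Under the same assumptions, predict() could be filled in as sketched below. Following the hint, compute_predictions() is assumed to take the stored self.log_priors and self.class_labels along with the log likelihoods; the exact argument order may differ in your notebook.

# inside class GaussianNaiveBayes:
def predict(self, X):
    # TODO 14.4: per-sample, per-class Gaussian log likelihoods
    log_likelihoods = compute_log_likelihoods(X, self.means, self.stds)
    # TODO 14.5: argmax of (log prior + log likelihood), mapped back to the class labels
    y_hat = compute_predictions(log_likelihoods, self.log_priors, self.class_labels)
    return y_hat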

TODOs for the probabilities() method

Compute the log likelihoods for each class using the compute_log_likelihoods() function. Store the output into log_likelihoods.

Compute the likelihoods by undoing the log that was applied to log_likelihoods (we need the normal likelihoods for computing probabilities). Store the output into likelihoods.

Compute the probabilities using the compute_probabilities() function. Store the output into probs.
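
And a sketch of probabilities() under the same assumptions; compute_probabilities() is assumed to apply Bayes' rule, weighting each likelihood by self.priors and normalizing across classes so every row of probs sums to 1.

# inside class GaussianNaiveBayes:
def probabilities(self, X):
    # TODO 14.6: per-sample, per-class Gaussian log likelihoods
    log_likelihoods = compute_log_likelihoods(X, self.means, self.stds)
    # TODO 14.7: undo the log with exp() to recover the plain likelihoods
    likelihoods = np.exp(log_likelihoods)
    # TODO 14.8: Bayes' rule: prior * likelihood, normalized over classes
    probs = compute_probabilities(likelihoods, self.priors)
    return probs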

This is the test code:

from sklearn.datasets import make_circles
from typing import List

import numpy as np

def TEST_GaussianNaiveBayes():
    def nonlinear_data(
        n_samples: int = 100,
        balance: List = None,
        seed: int = 42
    ) -> List[np.ndarray]:
        X, y = make_circles(random_state=seed, factor=.5, noise=.05)
        return X, y

    X, y = nonlinear_data()
    class_names = {
        0: 'C1',
        1: 'C2',
    }
    gnb = GaussianNaiveBayes()
    gnb.fit(X, y)
    # plot_decision_boundary(), todo_check(), and garbage_collect() are helpers
    # assumed to be provided by the course notebook.
    plot_decision_boundary(gnb, X, y, class_names)

    todo_check([
        (np.all(np.isclose(gnb.log_priors, np.array([-0.69314718, -0.69314718]), rtol=.1)), "gnb.log_priors has incorrect values"),
        (np.all(np.isclose(gnb.priors, np.array([0.5, 0.5]), rtol=.1)), "gnb.priors has incorrect values"),
        (np.all(np.isclose(gnb.means.flatten(), np.array([-0.00126599, -0.00426598, 0.00605525, 0.00029719]), rtol=.1)), "gnb.means has incorrect values"),
        (np.all(np.isclose(gnb.probabilities(X)[1], np.array([0.4478195, 0.5521805]), rtol=.01)), "probabilities has incorrect values"),
        (np.all(np.isclose(gnb.predict(X)[:3].flatten(), np.array([1, 1, 1]))), "wrong label predictions were detected.")
    ])

TEST_GaussianNaiveBayes()
garbage_collect(['TEST_GaussianNaiveBayes'])

and this is the starter class for TODO 14:

class GaussianNaiveBayes():
    def __init__(self):
        pass

    def fit(self, X, y):
        self.class_labels = np.unique(y)

        # TODO 14.1
        self.log_priors =
        # TODO 14.2
        self.priors =
        # TODO 14.3
        self.means, self.stds =

    def predict(self, X):
        # TODO 14.4
        log_likelihoods =

        # TODO 14.5
        y_hat =

        return y_hat

    def probabilities(self, X):
        # TODO 14.6
        log_likelihoods =

        # TODO 14.7
        likelihoods =

        # TODO 14.8
        probs =

        return probs
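
The class relies on the helper functions written in TODOs 4-13, which are not included in this question. For reference, here is one self-contained sketch of what those helpers could look like as plain NumPy functions; the names follow the question, but the bodies and signatures are assumptions rather than the notebook's actual code.

import numpy as np

def compute_log_priors(y):
    # log of each class's relative frequency in y
    _, counts = np.unique(y, return_counts=True)
    return np.log(counts / len(y))

def compute_parameters(X, y):
    # per-class, per-feature mean and standard deviation
    class_labels = np.unique(y)
    means = np.array([X[y == c].mean(axis=0) for c in class_labels])
    stds = np.array([X[y == c].std(axis=0) for c in class_labels])
    return means, stds

def compute_log_likelihoods(X, means, stds):
    # per-sample, per-class sum of per-feature Gaussian log densities
    n_samples, n_classes = X.shape[0], means.shape[0]
    log_likelihoods = np.zeros((n_samples, n_classes))
    for c in range(n_classes):
        log_pdf = (-0.5 * np.log(2.0 * np.pi * stds[c] ** 2)
                   - (X - means[c]) ** 2 / (2.0 * stds[c] ** 2))
        log_likelihoods[:, c] = log_pdf.sum(axis=1)
    return log_likelihoods

def compute_predictions(log_likelihoods, log_priors, class_labels):
    # pick the class with the largest log prior + log likelihood for each sample
    return class_labels[np.argmax(log_likelihoods + log_priors, axis=1)]

def compute_probabilities(likelihoods, priors):
    # Bayes' rule: prior * likelihood, normalized across classes for each sample
    joint = likelihoods * priors
    return joint / joint.sum(axis=1, keepdims=True)

With these definitions the sketches above line up with the test: make_circles generates balanced classes by default, so the priors come out to [0.5, 0.5] and the log priors to log(0.5) ≈ -0.693, matching the first two checks.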

