Question
Hi, I need help with the question below. I didn't pass it after my submission. Please help me with the coding. Thank you so much.
Question:
Part 2 [10 points]: Modify the class above to implement a KNN classifier. There are three methods that you need to complete:
predict: Given an m × p matrix of validation data with m examples each with p features, return a length-m vector of predicted labels by calling the classify function on each example.
classify: Given a single query example with p features, return its predicted class label as an integer using KNN by calling the majority function.
majority: Given an array of indices into the training set corresponding to the K training examples that are nearest to the query point, return the majority label as an integer. If there is a tie for the majority label using K nearest neighbors, reduce K by 1 and try again. Continue reducing K until there is a winning label.
Notes:
Don't even think about implementing nearest-neighbor search or any distance metrics yourself. Instead, go read the documentation for Scikit-Learn's BallTree object. You will find that its implemented query method can do most of the heavy lifting for you.
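For reference, BallTree's query method returns both the distances and the indices of the k nearest training points, already sorted nearest-first. A minimal sketch of how it is called (the array values here are made up for illustration):

```python
import numpy as np
from sklearn.neighbors import BallTree

# Toy training set: 5 points with 2 features each (illustrative values)
x_train = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0], [6.0, 5.0]])

tree = BallTree(x_train)

# Query the 3 nearest training points to a single example.
# query expects a 2-D array, so reshape the single point to (1, p).
query = np.array([0.1, 0.1]).reshape(1, -1)
dist, ind = tree.query(query, k=3)

print(ind[0])   # indices into x_train of the 3 nearest neighbors
print(dist[0])  # corresponding distances, sorted ascending
```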
Do not use Scikit-Learn's KNeighborsClassifier in this problem. We're implementing this ourselves.
My submission for the answer:
class KNN:
    """
    Class to store data for regression problems
    """
    def __init__(self, x_train, y_train, K=5):
        """
        Creates a kNN instance
        :param x_train: numpy array with shape (n_rows, 1) - e.g. [[1,2],[3,4]]
        :param y_train: numpy array with shape (n_rows,) - e.g. [1,-1]
        :param K: The number of nearest points to consider in classification
        """
        # Import and build the BallTree on training features
        from sklearn.neighbors import BallTree
        self.balltree = BallTree(x_train)
        # Cache training labels and parameter K
        self.y_train = y_train
        self.K = K

    def majority(self, neighbor_indices, neighbor_distances=None):
        """
        Given indices of nearest neighbors in training set, return the majority
        label. Break ties by considering 1 fewer neighbor until a clear winner
        is found.
        :param neighbor_indices: The indices of the K nearest neighbors in self.X_train
        :param neighbor_distances: Corresponding distances from query point to K nearest neighbors.
        """
        # your code here
        self.neighbor_indices = neighbor_indices
        self.neighbor_distances = None
        self.X_train = train_x
        neighbor_distance = list()  # []
        data = []
        for i in neighbor_indices:
            dist = euc_dist(neighbor_indices, i)
            neighbor_distance.append(dist)
            data.append(i)
        neighbor_distance = np.array(neighbor_distances)
        data = np.array(data)
        # Finding the index in ascending order
        index_dist = neighbor_distance.argsort()
        # Arranging data according to index
        data = data[index_dist]
        # slicing k value from number of data
        majority_label = data[:neighbor_distances]
        return majority_label

    def classify(self, x):
        """
        Given a query point, return the predicted label
        :param x: a query point stored as an ndarray
        """
        # your code here
        Classes = []
        for i in Neighbors:
            Classes.append(i[-1])
        predicted_label = max(Classes, key=Classes.count)
        return predicted_label

    def predict(self, X):
        """
        Given an ndarray of query points, return yhat, an ndarray of predictions
        :param X: an (m x p) dimension ndarray of points to predict labels for
        """
        # your code here
        predictions = []
        for i in range(len(val_x)):
            dist = np.array([euc_dist(val_x[i], x_t) for x_t in self.train_x])
            dist_sorted = dist.argsort()[:self.K]
            neigh_count = {}
            for idx in dist_sorted:
                if self.Y_train[idx] in neigh_count:
                    neigh_count[self.train_y[idx]] += 1
                else:
                    neigh_count[self.train_y[idx]] = 1
            sorted_neigh_count = sorted(neigh_count.items(), key=operator.itemgetter(1), reverse=True)
            predictions.append(sorted_neigh_count[0][0])
        return predictions
Please help me with the coding. Thank you.
Step by Step Solution
There are 3 steps involved.
Step: 1
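A sensible first step is to get majority working, since the assignment's tie-breaking rule ("reduce K by 1 and try again") is the only subtle part. A standalone sketch, assuming the labels arrive ordered nearest-first (which is how BallTree.query returns neighbors); the function name majority_vote and the sample labels are my own:

```python
import numpy as np

def majority_vote(labels):
    """Majority label among neighbor labels ordered nearest-first.
    On a tie, drop the farthest neighbor (reduce K by 1) and
    re-vote until exactly one label wins."""
    k = len(labels)
    while k > 0:
        values, counts = np.unique(labels[:k], return_counts=True)
        if (counts == counts.max()).sum() == 1:  # exactly one winner?
            return int(values[counts.argmax()])
        k -= 1  # tie: shrink the neighborhood and try again
```

With K = 1 there is always a single winner, so the loop is guaranteed to terminate.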
Step: 2
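Next, classify: query the BallTree for the K nearest training indices and vote on their labels. A self-contained sketch with made-up training data (in the full class, the voting lines at the end would delegate to the majority method instead):

```python
import numpy as np
from sklearn.neighbors import BallTree

# Illustrative training data (values are made up)
x_train = np.array([[0.0], [1.0], [2.0], [10.0], [11.0]])
y_train = np.array([0, 0, 0, 1, 1])
K = 3
tree = BallTree(x_train)

def classify(x):
    """Predict the label of a single query point x with p features."""
    # query wants a 2-D array, so wrap the single example as one row
    dist, ind = tree.query(x.reshape(1, -1), k=K)
    neighbor_labels = y_train[ind[0]]  # labels, nearest first
    # Plain majority vote here; the full solution delegates to majority()
    values, counts = np.unique(neighbor_labels, return_counts=True)
    return int(values[counts.argmax()])

print(classify(np.array([1.5])))   # nearest neighbors are 1.0, 2.0, 0.0 -> label 0
```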
Step: 3
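Finally, assemble everything into the class. The sketch below follows the interface the assignment describes (predict loops over rows and calls classify; classify queries the BallTree and calls majority; majority breaks ties by shrinking K) and leans on BallTree.query as the notes suggest. It is my own reading of what the grader expects, not the official solution:

```python
import numpy as np
from sklearn.neighbors import BallTree

class KNN:
    def __init__(self, x_train, y_train, K=5):
        # Build the BallTree once on the training features
        self.balltree = BallTree(x_train)
        self.y_train = y_train
        self.K = K

    def majority(self, neighbor_indices, neighbor_distances=None):
        """Majority label among the given training indices (ordered
        nearest-first); on a tie, reduce K by 1 and re-vote."""
        k = len(neighbor_indices)
        while k > 0:
            labels = self.y_train[neighbor_indices[:k]]
            values, counts = np.unique(labels, return_counts=True)
            if (counts == counts.max()).sum() == 1:  # unique winner
                return int(values[counts.argmax()])
            k -= 1  # tie: drop the farthest neighbor and try again

    def classify(self, x):
        """Predicted label for a single query point with p features."""
        dist, ind = self.balltree.query(x.reshape(1, -1), k=self.K)
        return self.majority(ind[0], dist[0])

    def predict(self, X):
        """Length-m vector of predictions for an (m x p) array X."""
        return np.array([self.classify(x) for x in X])
```

Note the student submission's main gaps that this closes: majority must vote on labels (not slice indices by a distance array), classify must actually query the BallTree rather than reference an undefined Neighbors variable, and predict should reuse classify instead of recomputing distances by hand.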