Question
1. Which data structure would you choose to maintain multiple key/value associations when a full ordering of keys or values is not important?
(a) a list (b) a stack (c) a queue (d) a hashtable (e) a heap
2. Which data structure would you choose to help track a collection of objects where only the object with the largest value needs to be readily accessible?
(a) a list (b) a stack (c) a queue (d) a hashtable (e) a heap
3. Which data structure would you choose to help track a collection of objects where only the object that has been tracked the longest needs to be readily accessible?
(a) a list (b) a stack (c) a queue (d) a hashtable (e) a heap
4. What is the run-time complexity of locating and retrieving the element in the middle (by index) of a circular, doubly-linked list with a sentinel head?
(a) O(1) (b) O(log N) (c) O(N) (d) O(N log N) (e) O(N²)
5. What is the run-time complexity of determining whether a specified key exists in a hashtable containing N key/value pairs?
(a) O(1) (b) O(log N) (c) O(N) (d) O(N log N) (e) O(N²)
6. What is the run-time complexity of locating (but not removing) the maximum element in a max-heap of N elements?
(a) O(1) (b) O(log N) (c) O(N) (d) O(N log N) (e) O(N²)
7. What is the run-time complexity of inserting a new element into a max-heap of N elements and ensuring the heap property is maintained?
(a) O(1) (b) O(log N) (c) O(N) (d) O(N log N) (e) O(N²)
8. A student has proposed the following method as a faster way of counting the number of elements in a doubly-linked list with a sentinel head node.
def fast_count(self):
    n = 0
    h = self.head.next
    while h is not self.head:
        if h.next is not self.head:
            n, h = n + 2, h.next.next
        else:
            n, h = n + 1, h.next
    return n
What is the time complexity of fast_count when run on a list with N elements?
(a) O(1) (b) O(log N) (c) O(N) (d) O(N log N) (e) O(N²)
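As an aside, the method in question 8 can be exercised directly. The sketch below embeds fast_count in a minimal circular, doubly-linked list with a sentinel head; the surrounding class and its append method are assumptions added only so the snippet runs. Note that although the loop advances two nodes per iteration when it can, it still visits on the order of N/2 nodes, so the complexity remains O(N).

```python
# Minimal circular, doubly-linked list with a sentinel head node.
# Only fast_count comes from the question; the rest is scaffolding.
class LinkedList:
    class Node:
        def __init__(self, val, prior=None, next=None):
            self.val = val
            self.prior = prior or self
            self.next = next or self

    def __init__(self):
        self.head = LinkedList.Node(None)  # sentinel node

    def append(self, val):
        # Splice a new node in just before the sentinel (end of list).
        n = LinkedList.Node(val, prior=self.head.prior, next=self.head)
        n.prior.next = n
        n.next.prior = n

    def fast_count(self):
        # Advances two nodes per iteration when possible: about N/2
        # iterations in total, which is still O(N), not O(log N).
        n = 0
        h = self.head.next
        while h is not self.head:
            if h.next is not self.head:
                n, h = n + 2, h.next.next
            else:
                n, h = n + 1, h.next
        return n

lst = LinkedList()
for i in range(7):
    lst.append(i)
print(lst.fast_count())  # -> 7
```

Halving the iteration count changes only the constant factor, not the asymptotic class.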
9. In what scenario will constructing a max-heap from N elements (added one at a time) be carried out most efficiently?
(a) when the elements are added in ascending order
(b) when the elements are added in descending order
(c) when the elements are added in random order
(d) when a separate min-heap is used to simultaneously keep track of the smallest element
(e) all the above yield the same runtime efficiency
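The insertion-order question can be checked empirically. The sketch below (not part of the original exam) builds a max-heap by repeated sift-up insertion and counts swaps: inserting in descending order leaves each new element at a leaf (zero swaps), while ascending order forces every new element to sift all the way to the root.

```python
# Count sift-up swaps while building an array-based max-heap,
# to compare ascending vs. descending insertion order.
def heap_push_count(heap, val):
    """Insert val into the max-heap `heap`; return the number of swaps."""
    heap.append(val)
    i, swaps = len(heap) - 1, 0
    while i > 0:
        parent = (i - 1) // 2
        if heap[parent] < heap[i]:
            heap[parent], heap[i] = heap[i], heap[parent]
            i, swaps = parent, swaps + 1
        else:
            break
    return swaps

def total_swaps(values):
    heap, total = [], 0
    for v in values:
        total += heap_push_count(heap, v)
    return total

n = 1024
print(total_swaps(range(n)))         # ascending: every insert sifts to the root
print(total_swaps(range(n, 0, -1)))  # descending: 0 swaps, already a max-heap
```

Descending-order insertion keeps every sift-up at O(1), making the overall build O(N); ascending order is the worst case at O(N log N).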
12. Consider the following definition of a hashtable with a partially implemented rehash method, which will either grow or shrink the buckets array to the provided value n_buckets (while keeping all existing key/value mappings).
class Hashtable:
    def rehash(self, n_buckets):
        new_buckets = [None] * n_buckets
        for b in self.buckets:
            while b:
                b_next = b.next
                ________________________________
                ________________________________
                ________________________________
                b = b_next
        self.buckets = new_buckets
Which correctly completes the rehash method?
(a) idx = hash(b.key) % n_buckets
    b.next = new_buckets[idx]
    new_buckets[idx] = b
(b) idx = hash(b.key) % len(self.buckets)
    b.next = new_buckets[idx]
    self.buckets[idx] = b
(c) idx = hash(b.key) % len(self.buckets)
    self.buckets[idx] = new_buckets[idx]
    b.next = new_buckets[idx]
(d) idx = hash(b.key) % n_buckets
    new_buckets[idx].next = b
    b.next = new_buckets[idx]
(e) idx = hash(b.key) % n_buckets
    b.next = new_buckets[idx].next
    new_buckets[idx].next = b