Question
This is from Cracking the Coding Interview by Gayle Laakmann McDowell. I have a question on her solution.
Suppose we had an algorithm that took in an array of strings, sorted each string, and then sorted the full array. What would the runtime be?
Let a be the length of the array and s be the length of the longest string.
- Sorting each string takes O(s log s) (in general, comparison-based sorting takes O(n log n))
- Since we have to sort each string, and there are a strings, we have O(a * s log s)
- Now, to sort the full array: each string comparison takes O(s) time and there are O(a log a) comparisons, so this takes O(a * s log a) time.
Adding both parts up: O(a * s log a + a * s log s) = O(a * s (log a + log s))
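For concreteness, here is a minimal Python sketch of the algorithm the analysis describes (the function name and test strings are my own, not from the book):

```python
def sort_strings_then_array(arr):
    # Phase 1: sort the characters of each string.
    # One string costs O(s log s); all a strings cost O(a * s log s).
    arr = ["".join(sorted(word)) for word in arr]

    # Phase 2: sort the array itself.
    # A comparison sort does O(a log a) comparisons, and each string
    # comparison costs O(s), for O(a * s log a) total.
    arr.sort()
    return arr

print(sort_strings_then_array(["banana", "apple", "cherry"]))
# ['aaabnn', 'aelpp', 'cehrry']
```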
My question is: why does she add the two runtimes at the end instead of multiplying them? For each string added to the array, there is one more string to sort, both initially (sorting its characters) and later (in the array sort). Thus, it seems the runtimes should be multiplied.
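To make the add-versus-multiply distinction concrete, here is a toy contrast (the placeholder loop bodies are my own, not from the book): runtimes add when the pieces of work run one after the other, and multiply when one runs inside the other.

```python
def phases_in_sequence(arr):
    for x in arr:      # phase 1: O(f)
        pass
    for x in arr:      # phase 2: O(g)
        pass           # total: O(f + g)

def phases_nested(arr):
    for x in arr:      # outer loop: O(f) iterations
        for y in arr:  # inner loop: O(g) work per iteration
            pass       # total: O(f * g)
```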