Question
Consider the following scheduling problem: we have two machines, and a set of jobs j1, j2, j3, ..., jn that we have to process. To process a job, we place it on a machine; each machine can only process one job at a time. Each job ji has an associated running time ri. The load on a machine is the sum of the running times of the jobs placed on it. The goal is to minimize the completion time, which is the maximum load over all machines. Suppose we adopt a greedy algorithm: each job ji is put on the machine with the minimum load after the first i - 1 jobs have been placed. (Ties can be broken arbitrarily.)
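For concreteness, here is a minimal sketch of the greedy rule just described, assuming the jobs are given as a plain list of running times and ties are broken in favor of the lowest-numbered machine (the function and variable names are illustrative, not part of the problem):

def greedy_schedule(running_times, num_machines=2):
    """Greedy list scheduling: place each job, in the given order, on the
    machine that currently has the minimum load; return the final loads."""
    loads = [0] * num_machines
    for r in running_times:
        # Index of the machine with the smallest current load
        # (min breaks ties toward the lowest index).
        k = min(range(num_machines), key=lambda m: loads[m])
        loads[k] += r
    return loads

# Example usage: the completion time is the maximum load over all machines.
loads = greedy_schedule([2, 3, 4, 1])
print(loads, max(loads))   # here this prints [6, 4] and completion time 6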
(a) For all n > 3, give an instance of this problem for which the output of the greedy algorithm is a factor of 3/2 away from the best possible placement of jobs.
(b) Prove that the greedy algorithm always yields a completion time within a factor of 3/2 of the best possible placement of jobs. (Hint: Think of the best possible placement of jobs. Even for the best placement, the completion time is at least as big as the biggest job, and at least as big as half the sum of the running times. You may want to use both of these facts; they are restated formally after part (d).)
(c) Suppose now that instead of 2 machines we have m machines. Prove the best upper bound you can, as a function of m, on the ratio of the performance of the greedy solution to that of the optimal solution.
(d) Give a family of examples (that is, one for each m; if they are very similar, it will be easier to write them down!) where the factor separating the optimal and the greedy solutions is as large as you can make it.
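The two facts mentioned in the hint for part (b) can be restated formally as follows, where OPT denotes the completion time of the best possible placement (OPT is a symbol introduced here for convenience, not part of the original statement):

\[
  \mathrm{OPT} \;\ge\; \max_{1 \le i \le n} r_i
  \qquad \text{and} \qquad
  \mathrm{OPT} \;\ge\; \frac{1}{2} \sum_{i=1}^{n} r_i .
\]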