Question

Consider the following scheduling problem: we have two machines and a set of jobs j1, j2, j3, ..., jn that we have to process one at a time. To process a job, we place it on a machine. Each job ji has an associated running time ri. The load on a machine is the sum of the running times of the jobs placed on it. The goal is to minimize the completion time, which is the maximum load over all machines.
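
To make the objective concrete, here is a minimal Python sketch (the function name and the input encoding are mine, not part of the problem) that computes the completion time of a given placement:

```python
def completion_time(assignment, num_machines):
    # Load of a machine = sum of the running times of the jobs on it;
    # the completion time is the maximum load over all machines.
    loads = [0] * num_machines
    for machine, running_time in assignment:
        loads[machine] += running_time
    return max(loads)

# Jobs of sizes 1, 1, 2: placing the first and third on machine 0 and
# the second on machine 1 gives loads [3, 1], i.e. completion time 3.
print(completion_time([(0, 1), (1, 1), (0, 2)], 2))  # prints 3
```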

Suppose we adopt a greedy algorithm: each job ji is put on the machine with the minimum load after the first i - 1 jobs have been placed. (Ties can be broken arbitrarily.) Show that this strategy yields a completion time within a factor of 3/2 of the best possible placement of jobs. (Hint: think about the best possible placement. Even for the best placement, the completion time is at least as big as the biggest single job, and at least as big as half the sum of all the running times. You may want to use both of these facts.) Give an example where the factor of 3/2 is achieved.
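
The greedy rule lends itself to a short sketch; the Python version below (names are mine) assumes ties go to the lowest-numbered machine, and the example at the end achieves the factor of 3/2 exactly:

```python
def greedy_schedule(running_times, num_machines=2):
    # Place each job, in order, on the machine whose current load is
    # minimum; list.index breaks ties toward the lowest machine index.
    loads = [0] * num_machines
    for r in running_times:
        i = loads.index(min(loads))
        loads[i] += r
    return max(loads)  # greedy completion time

# Tight example: jobs 1, 1, 2. Greedy puts the unit jobs on different
# machines, so the job of size 2 lands on top of one of them, giving
# completion time 3. The best placement pairs the two unit jobs, for
# completion time 2, so greedy is worse by exactly a factor of 3/2.
assert greedy_schedule([1, 1, 2]) == 3
```

Note how this example lines up with the hint: the optimum of 2 equals both the size of the biggest job and half the total running time (4/2), so both lower bounds are tight here at once.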

Suppose now that instead of two machines we have m machines. What is the performance of the greedy solution, compared to the optimal one, as a function of m? Give a family of examples (that is, one for each m; if they are very similar, they will be easier to write down!) where the factor separating the optimal and the greedy solutions is as large as you can make it.
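
One natural family, sketched below (this is the classic tight construction for greedy list scheduling; the helper name is mine): for each m, take m(m - 1) unit jobs followed by a single job of size m.

```python
def greedy_makespan(running_times, m):
    # Same greedy rule as above, on m machines.
    loads = [0] * m
    for r in running_times:
        loads[loads.index(min(loads))] += r
    return max(loads)

# For each m: m*(m-1) unit jobs, then one job of size m. Greedy levels
# the unit jobs (every machine reaches load m - 1) and then the big job
# lands on top, giving 2m - 1. The optimum runs the big job alone and
# packs m unit jobs on each of the other m - 1 machines, giving m.
for m in range(2, 6):
    jobs = [1] * (m * (m - 1)) + [m]
    print(m, greedy_makespan(jobs, m) / m)  # ratio 2 - 1/m: 1.5, 1.67, 1.75, 1.8
```

The ratio 2 - 1/m approaches 2 as m grows; 2 - 1/m is in fact the known worst-case guarantee for this greedy rule (Graham's bound), so this family is as bad as the separation can get.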
