Question
Fix the following C program so it compiles and runs.
#include stdio.h
#include unistd.h
#include stdlib.h
#include string.h
#include pthread.h
#include semaphore.h
void * thread_function(void *arg);
sem_f bin_sem;
#define party_size 1024
/ definition function /
char party_area[party_size];
int main()
{
int result;
pthread_f a_thread;
void *thread_result;
result= sem_init(&bin_sem,0,0);
{
if (result != 0)
{
perror(" the partier waits for the pledge");
exit(exit_failure);
}
printf(" the drink is not filled with keg");
if(result != 0)
{
perror(" the partier wake sup the pledge");
exit(exit_success);
}
printf(" the drink is filled with the new keg");
sem_wait (&bin_sem);
}
pthread_exit(null);
}
Answer all questions.

(a) Why is a shared second-level (L2) cache typically divided into multiple banks (banked) in a chip multiprocessor? [3 marks]
(b) In what situation might a shared second-level cache offer a performance advantage over a memory hierarchy for a chip multiprocessor with private L2 caches? [4 marks]
(c) A cache controller in a chip multiprocessor snoops the bus and observes a transaction that refers to a block that its cache contains. The block is held in State M (Modified). The bus transaction has been generated by a processor wishing to read the block. Assuming an MSI (write-back invalidate) cache coherence protocol, what actions will be taken by the cache controller? [6 marks]
(d) How does adopting an inclusion policy simplify the implementation of a cache coherence mechanism in a chip multiprocessor with private L1 and L2 caches? [4 marks]
(e) How might multiple buses be exploited to enable a greater number of processors to be supported by a snoopy cache coherence protocol? [3 marks]

(a) Let X and Y be two discrete random variables whose respective sets of possible outcomes {x} and {y} are described by probability distributions p(x) and p(y), and by a joint probability distribution p(x, y).
(i) Give an expression for the mutual information I(X; Y) between X and Y using only the probability distributions p(x), p(y), and p(x, y). [2 marks]
(ii) In case X and Y are independent random variables, what becomes of their mutual information, and why? [1 mark]
(iii) Let the marginal entropy of random variable X be H(X), and suppose that the two random variables X and Y are perfectly correlated with each other. In that case, prove that I(X; Y) = H(X). [2 marks]
(iv) What is I(X; X), the mutual information of a random variable with itself, in terms of H(X)? [1 mark]
(b) Prove that the information measure is additive: that the information gained from observing the combination of N independent events, whose probabilities are p_i for i = 1, ..., N, is the sum of the information gained from observing each one of these events separately and in any order. [3 marks]

Step by Step Solution
There are 3 steps involved in it.
Step 1: Fix the C program so it compiles and runs
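The program fails to compile for several reasons: the #include directives are missing the angle brackets around the header names, the types sem_f and pthread_f should be sem_t and pthread_t, the comment / definition function / has lost its asterisks, exit_failure and null must be the upper-case EXIT_FAILURE and NULL, and a stray brace after sem_init scrambles the control flow. Even once it compiles, sem_wait(&bin_sem) would block forever because nothing ever posts the semaphore, so the declared thread_function needs a definition that does the sem_post. Below is a minimal corrected sketch under those assumptions; attaching the second error check to pthread_create and having thread_function post the semaphore are editorial choices, not the only possible fix.

#include <stdio.h>
#include <unistd.h>
#include <stdlib.h>
#include <string.h>
#include <pthread.h>
#include <semaphore.h>

void *thread_function(void *arg);   /* the pledge: posts the semaphore */
sem_t bin_sem;                      /* sem_t, not sem_f */
#define party_size 1024
/* definition function */
char party_area[party_size];

int main()
{
    int result;
    pthread_t a_thread;             /* pthread_t, not pthread_f */
    void *thread_result;

    result = sem_init(&bin_sem, 0, 0);
    if (result != 0)
    {
        perror("the partier waits for the pledge");
        exit(EXIT_FAILURE);
    }
    printf("the drink is not filled with keg\n");

    /* create the thread so that someone eventually posts the semaphore */
    result = pthread_create(&a_thread, NULL, thread_function, NULL);
    if (result != 0)
    {
        perror("the partier wakes up the pledge");
        exit(EXIT_FAILURE);         /* the original's exit(exit_success) on an error path was itself a bug */
    }

    sem_wait(&bin_sem);             /* block until the pledge posts */
    printf("the drink is filled with the new keg\n");

    pthread_join(a_thread, &thread_result);
    sem_destroy(&bin_sem);
    return 0;
}

void *thread_function(void *arg)
{
    sem_post(&bin_sem);             /* wake the waiting partier */
    return NULL;
}

Compile with the pthread flag, e.g. gcc party.c -o party -pthread; the program prints both messages and exits cleanly.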
Step 2: The chip-multiprocessor cache questions
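In outline (these are the standard textbook points, not the exam's official model answers): a shared L2 is banked so that several cores can access different banks in the same cycle, increasing bandwidth and reducing contention (a); a shared L2 helps when cores share a working set, or when one core transiently needs more capacity than a fixed private slice would allow, avoiding replication of shared blocks (b); inclusion lets the L2 act as a snoop filter for its L1, since any block absent from the L2 is guaranteed absent from the L1, so most snoops never disturb the L1 (d); and interleaving coherence traffic across multiple buses by address multiplies the available snoop bandwidth, with each controller snooping only the bus responsible for a given address range (e). For part (c), the holder of the Modified block must supply the dirty data and downgrade to Shared. The sketch below models just that transition; the enum and the bus_read_snooped helper are illustrative names, not real hardware or a real API.

#include <stdio.h>

typedef enum { INVALID, SHARED, MODIFIED } msi_state;

/* Snoop handler for a bus read (BusRd) of a block this cache holds.
   Names are hypothetical; a real controller is hardware, not a function. */
msi_state bus_read_snooped(msi_state current)
{
    switch (current) {
    case MODIFIED:
        /* This cache has the only up-to-date copy, so it must:
           1. inhibit the memory response,
           2. flush (write back) the dirty block onto the bus so the
              requesting processor and main memory both get current data,
           3. downgrade from M to S, because two readers now share it. */
        printf("flush dirty block; memory and requester updated\n");
        return SHARED;
    case SHARED:
        return SHARED;      /* another reader changes nothing */
    default:
        return INVALID;     /* block not valid here: no action */
    }
}

int main(void)
{
    msi_state s = MODIFIED;
    s = bus_read_snooped(s);                       /* M -> S on a snooped read */
    printf("local state after snoop: %s\n",
           s == SHARED ? "SHARED" : "not SHARED");
    return 0;
}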
Step 3: The information-theory questions
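These parts follow directly from the definition of mutual information; the LaTeX derivation below is a sketch of the usual argument, not the exam's official solution.

\begin{align*}
\text{(i)}\quad & I(X;Y) \;=\; \sum_{x}\sum_{y} p(x,y)\,\log_2\!\frac{p(x,y)}{p(x)\,p(y)} \\
\text{(ii)}\quad & \text{if } X, Y \text{ independent, } p(x,y)=p(x)\,p(y), \text{ so every } \log_2 1 = 0 \text{ and } I(X;Y)=0 \\
\text{(iii)}\quad & I(X;Y) = H(X) - H(X \mid Y); \text{ perfect correlation means } Y \text{ determines } X, \\
& \text{so } H(X \mid Y) = 0 \text{ and hence } I(X;Y) = H(X) \\
\text{(iv)}\quad & I(X;X) = H(X) - H(X \mid X) = H(X) \\
\text{(b)}\quad & \text{for } N \text{ independent events, } I = -\log_2 \prod_{i=1}^{N} p_i = -\sum_{i=1}^{N} \log_2 p_i = \sum_{i=1}^{N} I_i
\end{align*}

Since addition is commutative, the total in (b) is the same whatever order the events are observed in, which is exactly the additivity claim.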