Question:

Consider a situation where there is a cost that is either incurred or not. It is incurred only if the value of some random input is less than a specified cutoff value. Why might a simulation of this situation give a very different average value of the cost incurred than a deterministic model that treats the random input as fixed at its mean? What does this have to do with the "flaw of averages"?
Step by Step Answer:

The deterministic model plugs the mean of the random input into the cost rule: if the mean is below the cutoff it reports the full cost, and if the mean is above the cutoff it reports zero cost. The simulation instead draws many values of the input and incurs the cost only on those draws that fall below the cutoff, so its average cost is approximately (probability the input is below the cutoff) × (cost of incurring it). Because the cost is a discontinuous, nonlinear function of the input, the expected cost is not the cost evaluated at the expected input; in symbols, E[f(X)] ≠ f(E[X]). This is the "flaw of averages": replacing an uncertain input by its mean and running a single deterministic calculation generally does not give the mean of the output. Here the deterministic model reports either the full cost or no cost at all, while the true expected cost lies somewhere in between.

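A minimal simulation sketch of this effect, written in Python with illustrative numbers that are not from the book (a normally distributed input with mean 100 and standard deviation 20, a cutoff of 80, and a fixed cost of $10,000), shows the gap between the two models:

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility (assumption)

# Illustrative numbers (assumptions, not from the book):
mean, sd = 100.0, 20.0   # the random input is assumed Normal(mean, sd)
cutoff = 80.0            # the cost is incurred only if the input < cutoff
cost = 10_000.0          # fixed cost when it is incurred

# Deterministic model: treat the input as fixed at its mean.
# Since 100 >= 80, this model reports a cost of 0.
deterministic_cost = cost if mean < cutoff else 0.0

# Simulation model: sample the input many times and average the incurred cost.
draws = rng.normal(mean, sd, size=100_000)
simulated_cost = np.where(draws < cutoff, cost, 0.0).mean()

# P(input < cutoff) is about 0.16 for these numbers, so the simulated average
# is roughly $1,600, while the deterministic model reports $0.
print(f"Deterministic cost: {deterministic_cost:,.0f}")
print(f"Simulated average cost: {simulated_cost:,.0f}")
```

With these assumed numbers the input falls below the cutoff about 16% of the time, so the simulated average is roughly $1,600 per run, whereas the deterministic model never shows any cost at all.
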
Related Book:

Business Analytics: Data Analysis and Decision Making, 6th Edition

Authors: S. Christian Albright, Wayne L. Winston
