Question
Consider one way of modeling a problem. Say you have a device and in any given year, there is a 1 in 6 chance that the device will fail. One question is, "On average, how long will it be before such a device fails?" This sort of problem is what will be modeled and analyzed here.
The problem above can be modeled as follows: roll a normal fair six-sided die. Rolling a 1 counts as "fail"; anything else counts as "not-fail". The random variable we will use is

X = number of rolls required until a 1 is rolled = "time until failure"
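One way to produce a single value of X is to keep rolling until a 1 appears and count the rolls. A minimal sketch in Python (one of the tools the problem allows); the function name and seed are illustrative choices, not part of the problem statement:

```python
import random

def time_until_failure(rng):
    """Roll a fair six-sided die until the first 1 appears.
    Returns the number of rolls required, i.e., one value of X."""
    count = 0
    while True:
        count += 1
        if rng.randint(1, 6) == 1:  # a roll of 1 counts as "fail"
            return count

rng = random.Random(42)  # arbitrary seed so the run is reproducible
x = time_until_failure(rng)
print("one simulated value of X:", x)
```

Each call draws independent rolls, so repeated calls give the independent repetitions the exercise asks for.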
Simulate 20 repetitions of obtaining a value for X; that is, roll until a 1 is rolled, record the number of rolls required, and repeat until you have 20 values. For example, if your first 1 appears on the fourth roll, record 4, since 4 rolls were required. You can use a physical die or simulate the rolls with Excel, Python, or some other option, e.g., Random.org.
Record your rolls and counts.
Compute the mean and standard deviation of your values for X.
Given the modeling problem we started with, interpret the mean and standard deviation you found in terms of how many years are expected before the device fails.
Excel may be helpful for this part, since it makes the calculations and the recording of data simple, but it is not required.
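Putting the steps above together, here is a minimal Python sketch that runs the 20 repetitions and computes the sample mean and standard deviation. The seed is an arbitrary choice made only so the run is reproducible; your recorded values will differ:

```python
import random
import statistics

def time_until_failure(rng):
    """Roll a fair six-sided die until the first 1; return the roll count (one value of X)."""
    count = 0
    while True:
        count += 1
        if rng.randint(1, 6) == 1:  # a roll of 1 counts as "fail"
            return count

rng = random.Random(1)  # arbitrary seed for reproducibility
values = [time_until_failure(rng) for _ in range(20)]  # 20 repetitions

print("recorded counts:", values)
print("mean:", statistics.mean(values))
print("sample std dev:", statistics.stdev(values))
```

Interpreting the output in terms of the original problem: the mean estimates how many years, on average, pass before the device fails, and the standard deviation measures how much that failure time typically varies from one device to another. (For a fair die, X is geometric with p = 1/6, so the long-run mean is 6; with only 20 repetitions your sample mean may land noticeably above or below that.)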