Question
Apply an exponential regression to real-world data. Beginning in the 1970s, computers have steadily decreased in size as they have grown in power. The ability to have more computing potential in a four-pound laptop than in a mainframe of the 1970s is a result of engineers squeezing more and more transistors onto silicon chips. The rate at which this miniaturization occurs is known as Moore's law, after Gordon Moore, one of the founders of Intel Corporation. His prediction, first articulated in 1965, was that the number of transistors per chip would double every eighteen months.
The following table lists some of the growth benchmarks, namely the number of transistors per chip, associated with the Intel chips marketed over the period from 1974 through 2016.
https://docs.google.com/spreadsheets/d/1RJmlQ5ntA4aHwMfLGlzo-WNj5vu13ZTZkyTf0mQm73Q/edit?usp=sharing
If we graph the data, we can see that the graph of transistors per chip (y) and years since 1974 (x) is clearly not linear. Moore's law claims that the number of transistors per chip would double every eighteen months. If this were true, then the data would fit the exponential model:
y = a · 2^(bx), where a and b are constants and b > 0.
FOLLOWING QUESTIONS: (c) Using the data, compute the values of a and b (rounded to 4 decimal places) and write the best-fit curve as an equation of the form y = a · 2^(bx). Based on your equation, how long does it take on average for the number of transistors per chip to double?
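One way to fit this model is to linearize it: taking log base 2 of both sides of y = a · 2^(bx) gives log2(y) = log2(a) + b·x, a straight line in x, so an ordinary least-squares line fit recovers b (slope) and log2(a) (intercept). Since the linked spreadsheet's values are not reproduced here, the sketch below uses a few approximate, commonly cited Intel transistor counts purely as placeholder data; the assignment's spreadsheet values (and hence the fitted a and b) may differ.

```python
import numpy as np

# Placeholder benchmark points (year, transistors per chip); approximate,
# commonly cited figures -- substitute the spreadsheet's data here.
data = [
    (1974, 4_500),        # Intel 8080
    (1982, 134_000),      # Intel 80286
    (1989, 1_200_000),    # Intel 80486
    (1993, 3_100_000),    # Pentium
    (2000, 42_000_000),   # Pentium 4
    (2008, 731_000_000),  # Core i7
]

x = np.array([year - 1974 for year, _ in data], dtype=float)
y = np.array([count for _, count in data], dtype=float)

# Linearize y = a * 2^(b x):  log2(y) = log2(a) + b * x,
# then fit a straight line by least squares.
b, log2_a = np.polyfit(x, np.log2(y), 1)
a = 2.0 ** log2_a

# Doubling time: a * 2^(b (x + T)) = 2 * a * 2^(b x)  =>  b * T = 1  =>  T = 1/b.
doubling_time = 1.0 / b

print(f"y = {a:.4f} * 2^({b:.4f} x)")
print(f"doubling time = {doubling_time:.2f} years")
```

Note that the doubling time falls out of the model directly: the output doubles when the exponent increases by 1, i.e. after T = 1/b years, so no further curve fitting is needed once b is known.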