Question
Suppose that a certain airport has one runway. Each airplane takes landingTime minutes to land and takeOffTime minutes to take off, and, on average, planes arrive at random instants of time. There are two queues: a queue of airplanes waiting to land and a queue of airplanes waiting to take off. Because it is more expensive to keep a plane airborne than to keep one waiting on the ground, we assume that airplanes in the landing queue have priority over those in the takeoff queue.
Write a program to simulate this airport's operation. You might assume a simulated clock that advances in five-minute intervals. For each tick, generate two random numbers. If the first is less than landingRate, a landing arrival has occurred and is added to the landing queue; if the second is less than takeOffRate, a takeoff arrival has occurred and is added to the takeoff queue. Your program should also calculate the average queue length and the average time that an airplane spends in a queue.
Use a linked list implementation of a queue in Java
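Below is a minimal sketch of one way to approach this in Java. It assumes landingTime and takeOffTime are measured in whole five-minute ticks, that the runway is modeled with a simple busy-until counter, and that the concrete values of landingRate, takeOffRate, and the length of the simulated day are placeholders to replace with your assignment's figures. First, a FIFO queue backed by a singly linked list:

// A minimal FIFO queue backed by a singly linked list.
class LinkedQueue<T> {
    private static class Node<T> {
        T item;
        Node<T> next;
        Node(T item) { this.item = item; }
    }

    private Node<T> front, rear;
    private int size;

    public void enqueue(T item) {
        Node<T> node = new Node<>(item);
        if (rear == null) { front = rear = node; }  // first element
        else { rear.next = node; rear = node; }
        size++;
    }

    public T dequeue() {
        if (front == null) throw new IllegalStateException("queue is empty");
        T item = front.item;
        front = front.next;
        if (front == null) rear = null;             // queue became empty
        size--;
        return item;
    }

    public boolean isEmpty() { return front == null; }
    public int size() { return size; }
}

The simulation loop below draws two random numbers per tick, gives the landing queue priority whenever the runway is free, and accumulates the counts needed for the two averages:

import java.util.Random;

public class AirportSimulation {
    public static void main(String[] args) {
        // Assumed parameters: arrival probabilities per tick and service
        // times in ticks; substitute the values from your assignment.
        double landingRate = 0.3, takeOffRate = 0.3;
        int landingTime = 1, takeOffTime = 1;   // runway occupancy in ticks
        int totalTicks = 1000;                  // length of the simulated day

        LinkedQueue<Integer> landingQueue = new LinkedQueue<>();  // stores arrival tick
        LinkedQueue<Integer> takeOffQueue = new LinkedQueue<>();
        Random rng = new Random();

        int runwayBusyUntil = 0;   // first tick at which the runway is free again
        long queueLengthSum = 0;   // summed queue lengths, for the average length
        long waitSum = 0;          // total ticks planes spent waiting
        int planesServed = 0;

        for (int tick = 0; tick < totalTicks; tick++) {
            // Two random numbers per tick: one for a landing arrival, one for a takeoff arrival.
            if (rng.nextDouble() < landingRate) landingQueue.enqueue(tick);
            if (rng.nextDouble() < takeOffRate) takeOffQueue.enqueue(tick);

            // If the runway is free, landings have priority over takeoffs.
            if (tick >= runwayBusyUntil) {
                if (!landingQueue.isEmpty()) {
                    waitSum += tick - landingQueue.dequeue();
                    runwayBusyUntil = tick + landingTime;
                    planesServed++;
                } else if (!takeOffQueue.isEmpty()) {
                    waitSum += tick - takeOffQueue.dequeue();
                    runwayBusyUntil = tick + takeOffTime;
                    planesServed++;
                }
            }

            queueLengthSum += landingQueue.size() + takeOffQueue.size();
        }

        double avgQueueLength = (double) queueLengthSum / totalTicks;
        double avgWaitMinutes = planesServed == 0 ? 0
                : 5.0 * waitSum / planesServed;   // each tick represents five minutes

        System.out.printf("Average queue length: %.2f planes%n", avgQueueLength);
        System.out.printf("Average time in queue: %.2f minutes%n", avgWaitMinutes);
    }
}

Each queue stores the tick at which a plane arrived, so the waiting time of a plane is simply the current tick minus its stored arrival tick; since one tick stands for five minutes, the reported average wait multiplies by five.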