Question:
Minima in permutations. Write a program that takes an integer command-line argument n, generates a random permutation, prints the permutation, and prints the number of left-to-right minima in the permutation (the number of times an element is the smallest seen so far). Then write a program that takes two integer command-line arguments m and n, generates m random permutations of length n, and prints the average number of left-to-right minima in the permutations generated. Extra credit: Formulate a hypothesis about the number of left-to-right minima in a permutation of length n, as a function of n.
Step by Step Answer:
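The exercise calls for two programs. Below is a minimal sketch of the first, assuming the permutation is over the integers 0 to n-1 and generated uniformly at random with the Knuth (Fisher-Yates) shuffle; the class name Minima is my own choice (the exercise does not specify one), and plain System.out is used instead of the book's StdOut library so the sketch is self-contained.

// Minima.java (name assumed, not given by the exercise)
// Usage: java Minima n
// Prints a random permutation of {0, 1, ..., n-1} and the number
// of left-to-right minima it contains.
public class Minima {
    public static void main(String[] args) {
        int n = Integer.parseInt(args[0]);

        // Start with the identity permutation.
        int[] a = new int[n];
        for (int i = 0; i < n; i++) a[i] = i;

        // Knuth shuffle: swap a[i] with a uniformly random
        // entry in a[i..n-1], yielding a uniform permutation.
        for (int i = 0; i < n; i++) {
            int r = i + (int) (Math.random() * (n - i));
            int t = a[i]; a[i] = a[r]; a[r] = t;
        }

        // Print the permutation; count positions where a[i]
        // is smaller than every element before it.
        int minima = 0;
        int min = Integer.MAX_VALUE;
        for (int i = 0; i < n; i++) {
            System.out.print(a[i] + " ");
            if (a[i] < min) {
                min = a[i];
                minima++;
            }
        }
        System.out.println();
        System.out.println(minima + " left-to-right minima");
    }
}

The second program repeats the same experiment m times and averages the counts. Again the class name MinimaAverage is assumed:

// MinimaAverage.java (name assumed)
// Usage: java MinimaAverage m n
// Generates m random permutations of length n and prints the
// average number of left-to-right minima.
public class MinimaAverage {
    public static void main(String[] args) {
        int m = Integer.parseInt(args[0]);
        int n = Integer.parseInt(args[1]);

        long total = 0;
        for (int trial = 0; trial < m; trial++) {
            // Identity permutation, then Knuth shuffle.
            int[] a = new int[n];
            for (int i = 0; i < n; i++) a[i] = i;
            for (int i = 0; i < n; i++) {
                int r = i + (int) (Math.random() * (n - i));
                int t = a[i]; a[i] = a[r]; a[r] = t;
            }
            // Count this permutation's left-to-right minima.
            int min = Integer.MAX_VALUE;
            for (int i = 0; i < n; i++) {
                if (a[i] < min) {
                    min = a[i];
                    total++;
                }
            }
        }
        System.out.println((double) total / m);
    }
}

Extra credit: the element in position i (counting from 1) of a uniformly random permutation is a left-to-right minimum exactly when it is the smallest of the first i elements, which happens with probability 1/i. By linearity of expectation, the expected number of left-to-right minima is the harmonic number H(n) = 1 + 1/2 + ... + 1/n, which is approximately ln n + 0.577. The averages printed by MinimaAverage should approach this value as m grows.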
Related Book:
Introduction to Programming in Java: An Interdisciplinary Approach, 2nd Edition, by Robert Sedgewick and Kevin Wayne. ISBN: 9780672337840.