Question:
In the context of the Web, the main applications include building meta-search engines, combining ranking functions, and selecting documents based on multiple criteria. When we use the Google search engine to find information, Google shows us a long list of links. Suppose Google considers combining ranking results from different sources. At the beginning, it treats each source equally: it simply sums the ranks each web page receives from the various sources and uses that sum to generate the combined rank. However, Google wants to investigate the reliability of each source and assign a higher weight to the most reliable ones in future rank combinations. Here we simply define a source's reliability as inversely proportional to the number of inversions between its ranking and the combined ranking. That is, a source is more reliable if it has fewer inversions.
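The inversion count described above can be computed in O(n log n) by piggybacking on merge sort: during each merge, every time an element is taken from the right half while elements remain in the left half, each of those remaining left elements forms an inversion with it. The sketch below (class and method names are my own, not part of the assignment) counts inversions in a single rank array:

```java
import java.util.Arrays;

public class InversionCount {
    // Counts inversions in `a`: pairs (i, j) with i < j but a[i] > a[j].
    // Applied to a source's ranks listed in combined-rank order, this
    // measures how much the source disagrees with the combined ranking.
    static long countInversions(int[] a) {
        return sort(Arrays.copyOf(a, a.length), new int[a.length], 0, a.length - 1);
    }

    static long sort(int[] a, int[] tmp, int lo, int hi) {
        if (lo >= hi) return 0;
        int mid = (lo + hi) / 2;
        long inv = sort(a, tmp, lo, mid) + sort(a, tmp, mid + 1, hi);
        // Merge step: taking a[j] from the right half before the remaining
        // left-half elements a[i..mid] contributes (mid - i + 1) inversions.
        int i = lo, j = mid + 1, k = lo;
        while (i <= mid && j <= hi) {
            if (a[i] <= a[j]) tmp[k++] = a[i++];
            else { inv += mid - i + 1; tmp[k++] = a[j++]; }
        }
        while (i <= mid) tmp[k++] = a[i++];
        while (j <= hi) tmp[k++] = a[j++];
        System.arraycopy(tmp, lo, a, lo, hi - lo + 1);
        return inv;
    }

    public static void main(String[] args) {
        int[] ranks = {2, 4, 1, 3, 5};
        System.out.println(countInversions(ranks)); // prints 3: (2,1), (4,1), (4,3)
    }
}
```

The same count can also be obtained from bubble sort or insertion sort (the number of swaps/shifts equals the number of inversions), which is presumably why the assignment asks for several sorting algorithms.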
Requirements:
Sorting is one of the most broadly used fundamental operations in data processing, and numerous sorting algorithms are available. In this project, you are asked to adapt existing algorithms to a real-world application. You are required to modify six sorting algorithms to solve this problem: two O(n log n) algorithms (merge sort and quick sort), plus bubble sort, insertion sort, heap sort, and inversion sort.
source1.txt:
678
787
8798
8799
8987
887
source2.txt:
57
5654
678
665
7876
7876
source3.txt:
685
385
574
45
374
35
source4.txt:
4556
787
898
797
98
776
source5.txt:
465
678
67
7886
765
You may write this program in Java or C++, and please add comments.
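As a starting point, the "treat each source equally" step can be sketched as follows. This is a minimal illustration under my own assumptions (it takes rank arrays indexed by page rather than reading the source files, and all names are hypothetical):

```java
import java.util.Arrays;

public class CombinedRank {
    // ranks[s][p] = rank that source s assigns to page p (1 = best).
    // Sum each page's ranks across sources, then order pages by that sum
    // (smaller sum = better combined rank).
    static Integer[] combine(int[][] ranks) {
        int n = ranks[0].length;
        long[] sum = new long[n];
        for (int[] src : ranks)
            for (int p = 0; p < n; p++) sum[p] += src[p];
        Integer[] pages = new Integer[n];
        for (int p = 0; p < n; p++) pages[p] = p;
        Arrays.sort(pages, (x, y) -> Long.compare(sum[x], sum[y]));
        return pages; // page indices in combined-rank order, best first
    }

    public static void main(String[] args) {
        // Three sources ranking three pages.
        int[][] ranks = {{1, 2, 3}, {2, 1, 3}, {1, 3, 2}};
        // Rank sums: page 0 -> 4, page 1 -> 6, page 2 -> 8.
        System.out.println(Arrays.toString(combine(ranks))); // prints [0, 1, 2]
    }
}
```

Each source's inversion count against this combined order would then determine its reliability weight for the next round of combination.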
