CS 117
Midterm 2
Ondich
Due 12:30PM, Monday, March 10, 1997

This is an open-book, open-computer exam. You may use your textbook, your notes, your programs, and whatever information you find on the course Web page. You may not consult other books or on-line resources. Do not discuss this exam with people other than Jeff Ondich, who will be glad to answer your questions if he feels it appropriate to do so. If you have questions, ask them. It's not noble to suffer needlessly.

Please submit your exam on paper.

  1. (15 points) What's wrong with the function below? Fix it. (Assume nodePtr is defined as a pointer to a record type whose next field is itself a nodePtr.)

    
    	{============================================================
    		Concatenate hooks the linked list whose head is list2
    		onto the end of the linked list whose head is list1,
    		and returns a pointer to the head of the combined list.
    	 ============================================================}
    	function Concatenate( list1, list2 : nodePtr ) : nodePtr;
    	var	current : nodePtr;
    	begin
    		if list1 = nil then
    			Concatenate := list2
    		else
    		begin
    			current := list1;
    			while current <> nil do
    				current := current^.next;
    				
    			current^.next := list2;
    			Concatenate := list1
    		end
    	end;
    

    To figure out what's going on with these linked lists, I recommend that you draw pictures.
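
    If it helps to have a known-good traversal to compare against, here is a minimal sketch of a helper that walks a non-empty list and returns a pointer to its final node. (The name LastNode is only an illustration; it is not part of the exam.)

        { Returns a pointer to the last node of a non-empty list. }
        function LastNode( list : nodePtr ) : nodePtr;
        var current : nodePtr;
        begin
            current := list;
            while current^.next <> nil do
                current := current^.next;
            LastNode := current
        end;

    Compare how this loop decides when to stop with how the loop in Concatenate does.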

  2. (10 points) Write a recursive function that takes an integer N as input (via a parameter, not via a read or readln) and prints out the integers 1 to N in reverse order. That is, your function should print out a countdown.
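
    For a point of reference (not the answer itself), here is a sketch of a recursive procedure that prints the integers 1 through N in increasing order; your job is the opposite direction. The name CountUp is only an illustration.

        { Prints 1, 2, ..., n in increasing order, one integer per line. }
        procedure CountUp( n : integer );
        begin
            if n >= 1 then
            begin
                CountUp( n - 1 );
                writeln( n )
            end
        end;

    Notice where the writeln sits relative to the recursive call; that placement is what controls the order of the output.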

  3. (3 points) Tell me a joke, please.

  4. (15 points) The Selection Sort algorithm has a loop inside a loop.
    
        for i := 1 to N-1 do
        begin
            { Find the index of the largest element in a[i..N]... }
            indexOfMax := i;
            for j := i + 1 to N do
            begin
                if a[indexOfMax] < a[j] then
                    indexOfMax := j;
            end;

            { ...and swap that element into position i. }
            temp := a[indexOfMax];
            a[indexOfMax] := a[i];
            a[i] := temp
        end
    

    When we analyzed the time complexity of Insertion Sort, it became clear that the vast majority of the execution time was spent inside its inner loop, especially when the number of items we were sorting (N) was large. Similarly, with Selection Sort, the number of times j gets incremented or a[indexOfMax] gets compared to a[j] is much larger than the number of times i gets incremented or temp gets changed. Thus, the number of times the body of the inner loop gets executed gives a good indication of the amount of time this algorithm will take to execute.
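
    If you want to check that claim empirically, one sketch (assuming an integer variable, here called innerCount, declared alongside the sort) is to increment a counter in the body of the inner loop and print it when the sort finishes:

        innerCount := 0;
        for i := 1 to N-1 do
        begin
            indexOfMax := i;
            for j := i + 1 to N do
            begin
                innerCount := innerCount + 1;   { one more execution of the inner loop body }
                if a[indexOfMax] < a[j] then
                    indexOfMax := j;
            end;
            { ... swap a[i] and a[indexOfMax] as before ... }
        end;
        writeln( 'Inner loop body executed ', innerCount, ' times' );

    The printed count should match what you would predict by counting the loop bounds by hand.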

    Consider now the Merge Sort algorithm in mergesort.p. This recursive algorithm is convoluted enough that it is not immediately obvious where the bulk of the computation time is being spent. In this problem, you're going to investigate this matter to determine whether Merge Sort might offer an improvement over Selection Sort.

    I have written the main program of mergesort.p to allow you to sort arrays of more or less any length you specify. Try running the program a couple of times with different input to see it work.

    Now it's time to get down to work. First of all, it is possible that most of the time spent in Merge Sort is spent just calling procedures. Thus, we'll want to know how many times Merge() and MergeSort() are called for various values of N, and whether there's a pattern to these numbers.

    On the other hand, it's possible that the bulk of the execution time takes place inside Merge(), which is a relatively complicated procedure. Once you get to know this procedure, you will agree that the number of operations performed by Merge() is approximately proportional to the "width" of the interval from bottom to top (that is, top - bottom + 1). For example, Merge() will take about three times as long to do its work when the width is 1500 as when it is 500, and so on. Thus, we'll want to know the sum of all the widths Merge() gets called with during execution of the overall MergeSort() algorithm.

    Here's what I want you to do. Modify the mergesort.p program to report to you the number of times Merge() gets called, the number of times MergeSort() gets called, and the sum of all the widths in all the calls to Merge(). Try this modified program out for a variety of values of N (I recommend powers of 2 for N to make some of the patterns that arise easier to see). Collect enough data in a table like the one below so you can see patterns forming. While you're at it, collect similar data for Selection Sort.

                        M  E  R  G  E     S  O  R  T               Selection Sort
                 # times           # times            total         # times body
                MergeSort()        Merge()         widths that      of inner loop
        N       gets called      gets called       get merged       gets executed 
    -----------------------------------------------------------------------------
       16           ??               ??                ??                ??
       32           ??               ??                ??                ??
       64           ??               ??                ??                ??
      etc.
    

    Give me a formula for each of the four columns of this table in terms of N. When N gets really large, which column dominates the execution time for Merge Sort? Which sorting algorithm would you expect to be faster for large N, or is there no noticeable difference?
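
    In case it helps to see the shape of the modification, here is a sketch. The procedure headers below are only guesses (intArray stands in for whatever array type mergesort.p declares), so keep the headers the program already has; the idea is simply to declare three global counters, set them to zero in the main program before sorting, bump them at the top of each procedure, and writeln them once the sort finishes.

        var
            mergeSortCalls, mergeCalls, totalWidth : integer;   { global counters }

        procedure Merge( var a : intArray; bottom, top : integer );
        begin
            mergeCalls := mergeCalls + 1;
            totalWidth := totalWidth + (top - bottom + 1);   { width of this merge }
            { ... the merging code already in mergesort.p, unchanged ... }
        end;

        procedure MergeSort( var a : intArray; bottom, top : integer );
        begin
            mergeSortCalls := mergeSortCalls + 1;
            { ... the recursive code already in mergesort.p, unchanged ... }
        end;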

    Finally, have a race between Merge Sort and Selection Sort on an array of, say, 10000 integers. Which one wins? By how much?