I am confused as to why merging for a list of size 2^k = n takes n steps.
If you have a list of n = 4 elements, 3, 9, 8, 10, and you sort from least to greatest:
3 9 8 10
(3,9) (8,10): 2 units of work to compare 3 with 9 and 8 with 10.
(3,8,9,10): 3 units of work to compare 3 with 8, 8 with 9, and 9 with 10.
That's a total of 5 units of work, not 4. Every definition out there says it takes n steps to merge, when it obviously takes 5 steps in the example given.
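The counts above can be checked with a short script (a sketch; `merge_count` is an illustrative helper I'm introducing here, not a library function):

```python
def merge_count(left, right):
    """Merge two sorted lists, counting element comparisons."""
    result, comparisons = [], 0
    i = j = 0
    while i < len(left) and j < len(right):
        comparisons += 1
        if left[i] <= right[j]:
            result.append(left[i]); i += 1
        else:
            result.append(right[j]); j += 1
    result.extend(left[i:])   # leftovers are copied without comparing
    result.extend(right[j:])
    return result, comparisons

# First pass: two pair-merges, one comparison each -> 2 comparisons.
m1, c1 = merge_count([3], [9])
m2, c2 = merge_count([8], [10])
# Second pass: merge the two pairs -> 3 more comparisons.
m3, c3 = merge_count(m1, m2)
print(m3, c1 + c2 + c3)  # [3, 8, 9, 10] 5
```

So the 5 counted comparisons are spread over two levels of merging: 2 at the first level, 3 at the second.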
Mergesort for size 2^n
2 Replies - 2699 Views - Last Post: 23 March 2014 - 12:35 PM
Replies To: Mergesort for size 2^n
#2
Re: Mergesort for size 2^n
Posted 22 March 2014 - 01:09 PM
Edit:
The first line should be: I am confused as to why merging for a list of size 2^k=n takes n steps.
#3
Re: Mergesort for size 2^n
Posted 23 March 2014 - 12:35 PM
Moved to Computer Science.
Merging is linear. We are given two arrays: arr1 and arr2, and a target array result.
Let's look at some pseudo-code:
merge(arr1, arr2)
    result := array[arr1.length + arr2.length]
    leftIndex := 0
    rightIndex := 0
    resultIndex := 0
    while leftIndex < arr1.length AND rightIndex < arr2.length
        while leftIndex < arr1.length AND arr1[leftIndex] <= arr2[rightIndex]
            result[resultIndex++] := arr1[leftIndex++]
        end while
        while leftIndex < arr1.length AND rightIndex < arr2.length AND arr2[rightIndex] < arr1[leftIndex]
            result[resultIndex++] := arr2[rightIndex++]
        end while
    end while
    while leftIndex < arr1.length
        result[resultIndex++] := arr1[leftIndex++]
    end while
    while rightIndex < arr2.length
        result[resultIndex++] := arr2[rightIndex++]
    end while
    return result
end function
Here arr1.length is generally equal to arr2.length, give or take an element. The outer while loop places one element into result per step, so it takes O(n) time, where n is the length of the result array. The inner loops never re-traverse any elements, so they add no extra complexity. Then look at the two trailing loops: only one of them will execute, depending on the values of leftIndex and rightIndex after the outer while loop, and it copies the remainder of the result array. Hence, O(n) complexity.
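The pseudo-code translates almost line-for-line into Python (a sketch; the trailing copy loops become slice `extend`s):

```python
def merge(arr1, arr2):
    """Merge two sorted lists into one sorted list in O(n) time."""
    result = []
    left = right = 0
    # Main loop: exactly one element is placed per iteration, so it
    # runs at most len(arr1) + len(arr2) times in total.
    while left < len(arr1) and right < len(arr2):
        if arr1[left] <= arr2[right]:
            result.append(arr1[left]); left += 1
        else:
            result.append(arr2[right]); right += 1
    # At most one of these is non-empty: copy whichever side remains.
    result.extend(arr1[left:])
    result.extend(arr2[right:])
    return result

print(merge([3, 9], [8, 10]))  # [3, 8, 9, 10]
```

Running it on the example from the question confirms the merged result and the linear amount of work: every element of the output is touched exactly once.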