# Calculating Time Complexity


## 14 Replies - 9095 Views - Last Post: 04 November 2009 - 02:29 PM

### #1 Ted_

Reputation: 1
• Posts: 24
• Joined: 19-May 09

# Calculating Time Complexity

Posted 30 October 2009 - 07:34 AM

Hi all,

Could you check if my calculations are OK?

```
public static final int F(int n) {
    if (n < 1)
        return 0;
    if (n == 1)
        return 1;
    else
        return 2*F(n-1) + F(n-2);
}
```

My calculations were based on this PDF (first example). I assumed the recursive case performs 3 operations (*, -, +) before calling the function again, and did my calculations here (MathBin LaTeX page).

I've come to the conclusion that the growth is exponential. If anything is wrong or unclear, please let me know.
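As a sanity check, I also instrumented the function with a call counter (the counter is my own quick hack, not part of the algorithm) and watched how the number of calls grows:

```java
public class CallCounter {
    static long calls = 0;

    // The same F as above, instrumented with a call counter.
    // The counter only measures growth; it does not change the result.
    public static int F(int n) {
        calls++;
        if (n < 1)
            return 0;
        if (n == 1)
            return 1;
        else
            return 2 * F(n - 1) + F(n - 2);
    }

    public static void main(String[] args) {
        long prev = 1;
        for (int n = 2; n <= 20; n++) {
            calls = 0;
            F(n);
            // Ratio of successive call counts hints at the base of the exponent.
            System.out.printf("n=%2d calls=%6d ratio=%.4f%n",
                    n, calls, (double) calls / prev);
            prev = calls;
        }
    }
}
```

At least in this little experiment, the ratio of successive call counts settles around 1.6, so the growth looks exponential, though with a base below 2.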

Thank you very much,
Ted.


## Replies To: Calculating Time Complexity

### #2 MrCupOfT

Reputation: 2298
• Posts: 9,535
• Joined: 29-May 08

## Re: Calculating Time Complexity

Posted 30 October 2009 - 07:44 AM

Me I say it would be ~n^2

### #3 Ted_

Reputation: 1
• Posts: 24
• Joined: 19-May 09

## Re: Calculating Time Complexity

Posted 30 October 2009 - 07:52 AM

AdamSpeight2008, on 30 Oct, 2009 - 06:44 AM, said:

Me I say it would be ~n^2

### #4 LaFayette

• D.I.C Regular

Reputation: 43
• Posts: 326
• Joined: 24-November 08

## Re: Calculating Time Complexity

Posted 31 October 2009 - 04:39 AM

In the part where you conclude that you have found a pattern, the constraint on k should probably be k > 2. All in all it looks good though.

### #5 Vestah

Reputation: 21
• Posts: 86
• Joined: 15-October 09

## Re: Calculating Time Complexity

Posted 01 November 2009 - 04:19 AM

It looks as if you are calculating the value of the nth Pell number.
I admit I don't fully understand what you do.

Typically when calculating the time complexity of an algorithm you deal with the asymptotic running time, using big O notation for the worst case running time and omega for the best case running time. Theta is used when the best and worst case running times coincide, i.e. for a tight bound.
My guess is that you first have to state the running time of each line and then derive the running time of the whole algorithm.
```
if (n < 1)                   O(1)
    return 0;                O(1)
if (n == 1)                  O(1)
    return 1;                O(1)
else
    return 2*F(n-1)+F(n-2);  T(n-1) + T(n-2) + O(1)
```

If n is less than 1 the running time of the algorithm is O(1)+O(1) = O(1)
If n == 1 then the running time is O(1)+O(1)+O(1) = O(1)
Otherwise it is O(1)+O(1)+T(n-1)+T(n-2)+O(1) = T(n-1)+T(n-2)+O(1)

T is a recursive function I have introduced which takes a single number n:
T(n) = O(1) when n <= 1
T(n) = T(n-1)+T(n-2)+O(1) when n > 1

Now the task in this algorithm is to calculate the value of that recursive function.
I suggest that you do it by induction.
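For instance, if we take the O(1) term to be exactly 1 (an assumption just for the experiment), the recurrence becomes T(0) = T(1) = 1 and T(n) = T(n-1) + T(n-2) + 1, which we can tabulate and check against a guessed closed form. The closed form T(n) = 2*Fib(n+1) - 1 below is my own guess to be verified, not something derived so far:

```java
public class RecurrenceCheck {
    // Iterative Fibonacci with Fib(1) = Fib(2) = 1.
    public static long fib(int n) {
        long a = 0, b = 1;
        for (int i = 1; i < n; i++) {
            long t = a + b;
            a = b;
            b = t;
        }
        return b;
    }

    public static void main(String[] args) {
        // T(0) = T(1) = 1; iterate T(n) = T(n-1) + T(n-2) + 1.
        long tPrev = 1, tCur = 1;
        for (int n = 2; n <= 30; n++) {
            long tNext = tCur + tPrev + 1;
            tPrev = tCur;
            tCur = tNext;
            // Compare against the guessed closed form 2*Fib(n+1) - 1.
            if (tCur != 2 * fib(n + 1) - 1)
                throw new AssertionError("guess fails at n=" + n);
        }
        System.out.println("T(n) = 2*Fib(n+1) - 1 holds for n = 2..30");
    }
}
```

If the guess survives the check, an induction proof of it is straightforward, and it ties T directly to the Fibonacci numbers.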

### #6 Ted_

Reputation: 1
• Posts: 24
• Joined: 19-May 09

## Re: Calculating Time Complexity

Posted 02 November 2009 - 05:03 AM

Hi Vestah,

let me try your way. Please check my steps and forgive me as I will once more use recurrence relations instead of induction.

OK, we need to figure out the growth of the recursive call T(n) = 2*T(n-1)+T(n-2)+O(1). Note that

T(n-1) = 2*T(n-2)+T(n-3)+O(1)
T(n-2) = 2*T(n-3)+T(n-4)+O(1)

Now we know that T(n) = 2*T(n-1)+T(n-2)+O(1), but we also know that T(n-1) = 2*T(n-2)+T(n-3)+O(1). Thus, we can write

T(n) = 2*(2*T(n-2)+T(n-3)+O(1))+T(n-2)+O(1) = 5*T(n-2)+2*T(n-3) + 3*O(1)

again, we now know that T(n) = 5*T(n-2)+2*T(n-3) + 3*O(1) and we know T(n-2) = 2*T(n-3)+T(n-4)+O(1). Therefore

T(n) = 5*(2*T(n-3)+T(n-4)+O(1))+2*T(n-3) + 3*O(1) = 12*T(n-3) + 5*T(n-4) + 8*O(1)

You can probably see the pattern now. The calculation continues as I explained in the first post, so the result is going to be the same, that is,
the recursion is O(2^n).

Thank you for your post, it cleared up some of my doubts. Do you think my calculations are correct?

### #7 Vestah

Reputation: 21
• Posts: 86
• Joined: 15-October 09

## Re: Calculating Time Complexity

Posted 02 November 2009 - 06:22 AM

If the task is to show what the worst case running time of the algorithm is, then I think what you have done is sufficient.
If the task is to prove the worst case running time, then I think you have to do the induction more formally.

To give you a feel, I will solve a similar albeit different recursive equation, where n is a non-negative integer (a natural number or 0):
f(n) = f(n-1) + f(n-1) + b if n > 0
f(n) = a if n = 0

The solution is this:
f(n) = (a+b)*(2^n) - b

I verify the correctness by induction. That is, I prove that the base case holds and that the inductive step holds (if the formula holds for f(n), then it also holds for f(n+1)).
For the base case:
f(0) = (a+b)*(2^0) - b
     = (a+b)*1 - b
     = a

So the base case holds; now I will prove the inductive step (notice that I use the recursive equation to expand f(n+1)):
f(n+1) = f(n) + f(n) + b
       = ((a+b)*(2^n) - b) + ((a+b)*(2^n) - b) + b
       = (a+b)*(2^n) + (a+b)*(2^n) - b
       = 2*(a+b)*(2^n) - b
       = (a+b)*(2^(n+1)) - b

Since both the base case and the inductive step hold, we can conclude that f(n) = (a+b)*(2^n) - b is indeed the same function as the recursive definition.
For getting the formula in the first place I relied on my memory. Unfortunately, I don't really remember any way other than guessing.
I do hope this helps you if you have to prove the worst case running time.

Note that if you have solved it for the Fibonacci numbers, then you practically have the solution.
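A quick mechanical check of that closed form against the recursive definition (the values of a and b below are arbitrary samples I picked):

```java
public class ClosedFormCheck {
    // The closed form f(n) = (a+b)*2^n - b.
    public static long closed(int n, long a, long b) {
        return (a + b) * (1L << n) - b;
    }

    // The recursive definition f(n) = f(n-1) + f(n-1) + b, f(0) = a.
    public static long recursive(int n, long a, long b) {
        if (n == 0)
            return a;
        return recursive(n - 1, a, b) + recursive(n - 1, a, b) + b;
    }

    public static void main(String[] args) {
        long a = 3, b = 5; // arbitrary sample constants
        for (int n = 0; n <= 15; n++) {
            if (closed(n, a, b) != recursive(n, a, b))
                throw new AssertionError("mismatch at n=" + n);
        }
        System.out.println("closed form matches the recurrence for n = 0..15");
    }
}
```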

Best regards,
Thomas

Edit: Darned smilies

This post has been edited by Vestah: 02 November 2009 - 06:27 AM

### #8 LaFayette

• D.I.C Regular

Reputation: 43
• Posts: 326
• Joined: 24-November 08

## Re: Calculating Time Complexity

Posted 02 November 2009 - 01:37 PM

Quote

If the task is to show what the worst case running time of the algorithm is, then I think what you have done is sufficient.
If the task is to prove the worst case running time, then I think you have to do the induction more formally.

How are the two cases any different?

### #9 Vestah

Reputation: 21
• Posts: 86
• Joined: 15-October 09

## Re: Calculating Time Complexity

Posted 03 November 2009 - 02:19 AM

It may just have been my place of education, but how formal the argument has to be depends on whether you have to show it or prove it.

### #10 LaFayette

• D.I.C Regular

Reputation: 43
• Posts: 326
• Joined: 24-November 08

## Re: Calculating Time Complexity

Posted 03 November 2009 - 08:16 AM

I'd say that unrolling the recurrence and making an inductive argument are equally good here. The first method, even though mechanical, might be harder to get formally "correct"; the second, on the other hand, demands that you have a good guess to begin with.

### #11 Vestah

Reputation: 21
• Posts: 86
• Joined: 15-October 09

## Re: Calculating Time Complexity

Posted 04 November 2009 - 01:25 AM

Perhaps you are right, but I'd still argue that it's easier to make an error when unrolling the recurrence.
The relationship to the golden ratio is missing from that approach, for example.


## Re: Calculating Time Complexity

Posted 04 November 2009 - 12:30 PM

The relationship to the golden ratio is there. The Fibonacci-like structure of the equation should have been the early sign of that. What we're trying to solve is a simple inhomogeneous linear second-order recurrence with constant coefficients (whew, I hope I got that name right). What do we know about these recurrences? Their solution is just the sum of the general solution to the corresponding homogeneous equation and a particular solution to the inhomogeneous equation. The equation is:

F(0) = 1 (single if-comparison is made)
F(1) = 2 (two if-comparisons are made)
F(n) = F(n-1) + F(n-2) + 3 (two if-comparisons and a multiplication. I ignored the addition, because multiplication dominates it. But the choice doesn't really matter.)

Let's put the equation into standard form. What we get is F(n) - F(n-1) - F(n-2) = 3. Its homogeneous solution is almost identical to that of the Fibonacci numbers, except that it has slightly different constant coefficients, due to the fact that our initial conditions are different. Its particular solution is given by the equation c - c - c = 3, which simply means that c = -3. Because we can ignore all the constants, the growth of F(n) is the same as the growth of the Fibonacci numbers! And what is the dominant term of the Fibonacci numbers? The golden ratio, φ. Ergo, F(n) belongs to O(φ^n).
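To see the golden ratio numerically, iterate the recurrence above and look at the ratio of successive terms (a rough sketch, not a proof):

```java
public class GoldenRatioCheck {
    // Iterate F(0)=1, F(1)=2, F(n)=F(n-1)+F(n-2)+3 and return F(n)/F(n-1).
    public static double ratio(int n) {
        long prev = 1, cur = 2;
        for (int i = 2; i <= n; i++) {
            long next = cur + prev + 3;
            prev = cur;
            cur = next;
        }
        return (double) cur / prev;
    }

    public static void main(String[] args) {
        double phi = (1 + Math.sqrt(5)) / 2; // ~1.618034
        // The ratio of successive terms converges to phi.
        System.out.printf("F(40)/F(39) = %.6f, phi = %.6f%n", ratio(40), phi);
    }
}
```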

### #13 LaFayette

• D.I.C Regular

Reputation: 43
• Posts: 326
• Joined: 24-November 08

## Re: Calculating Time Complexity

Posted 04 November 2009 - 01:51 PM

Nice, except that's the wrong recurrence. The original recurrence was F(N) = 2*F(N-1) + F(N-2), which belongs to O(δ^n) where δ = 1 + sqrt(2).
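To be concrete about the growth of the values of that recurrence (the values themselves, not the number of calls), the ratio of successive terms tends to 1 + sqrt(2); a quick numeric sketch:

```java
public class SilverRatioCheck {
    // Iterate F(0)=0, F(1)=1, F(n)=2*F(n-1)+F(n-2) and return F(n)/F(n-1).
    public static double ratio(int n) {
        long prev = 0, cur = 1;
        for (int i = 2; i <= n; i++) {
            long next = 2 * cur + prev;
            prev = cur;
            cur = next;
        }
        return (double) cur / prev;
    }

    public static void main(String[] args) {
        // The ratio of successive terms converges to delta = 1 + sqrt(2).
        System.out.printf("F(40)/F(39) = %.6f, 1+sqrt(2) = %.6f%n",
                ratio(40), 1 + Math.sqrt(2));
    }
}
```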


## Re: Calculating Time Complexity

Posted 04 November 2009 - 02:16 PM

F(N) = 2*F(N-1) + F(N-2) is an incorrect recurrence that does not describe the algorithm the OP has presented: it assumes that F(n-1) is called twice, which is not the case.

There are 4 operations performed in the recursive case:

1. Calculate F(n-1)
2. Calculate F(n-2)
3. Multiply the result of #1 by 2
4. Add the results of #2 and #3.

Therefore the number of operations that F(n) performs equals F(n-1) + F(n-2) + a constant.


## Re: Calculating Time Complexity

Posted 04 November 2009 - 02:29 PM

Ted made an error in his original recurrence: multiplication should be counted as a single operation. I noticed it right away but did not say anything because I thought that all of you had realized it by now, seeing how Vestah used the correct recurrence.

Now, step back a little and look at the algorithm. You should notice the tree recursion right away: every function call spawns 2 calls to itself. Such recursions are bounded by O(2^n), so it makes no sense for the complexity to be O((1 + √2)^n), because 1 + √2 > 2. If you think about it some more, you'll notice that the recursion produces an unbalanced tree, because one of the recursive calls goes 2 levels back. Therefore it is perfectly logical to expect the complexity to be O(y^n) where y is slightly less than 2. That, and the fact that it's almost identical to the Fibonacci recursion.

This post has been edited by Danil: 04 November 2009 - 02:41 PM