# Time complexity for linked lists

## 8 Replies - 5911 Views - Last Post: 19 April 2013 - 03:34 AM

### #1

# Time complexity for linked lists

Posted 17 April 2013 - 03:03 PM

a- O(n) linear time

b- O(log n) logarithmic time

c- O(1) constant time

d- O(n²) polynomial time

5- Consider a queue implemented using a doubly linked list, what is the time complexity of enqueue() assuming the start of the queue is the start of the list?

a- O(n) linear time

b- O(log n) logarithmic time

c- O(1) constant time

d- O(n²) polynomial time

I'm kind of confused. Isn't a linked list just a list whose links can go in either one direction or two? How does that affect complexity? For a queue, when you enqueue and the front of the list is at position 0 you have O(1), and when you dequeue you have O(n). Why aren't these the same answers for a linked list?

Answers are A and C, thank you

## Replies To: Time complexity for linked lists

### #2

## Re: Time complexity for linked lists

Posted 17 April 2013 - 05:31 PM

medaa, on 17 April 2013 - 03:03 PM, said:


I'm kind of confused. Isn't a linked list just a list whose links can go in either one direction or two? How does that affect complexity? For a queue, when you enqueue and the front of the list is at position 0 you have O(1), and when you dequeue you have O(n). Why aren't these the same answers for a linked list?

Answers are A and C, thank you

Big-O notation can be very hard to understand; I have taken many classes and it can still be hard to get. But let's take a look here.

When implementing a queue you can choose to insert in two ways, but no matter which way you choose, either the insert or the remove will be affected.

1.) If you want to have constant time insertion:

Inserting at the front of a linked list is constant time because you already have a pointer to the front of the list: you just make a new node and fix up the pointers.

```java
public void insert(int val) { // inserts at the beginning of the list
    Node newNode = new Node(val);
    newNode.next = first;
    first = newNode;
}
```

Above is code for inserting at the front of the linked list; if you analyze it you will see all the operations can be done in constant time. If you choose to do this, your dequeue will be affected, because deletion is no longer constant: you have to traverse the linked list to do the removal. The following is code to do this.

```java
public Node delete(int val) { // removes the node holding val; O(n) traversal
    Node current = first;
    Node previous = first;
    while (current.item != val) {
        if (current.next == null)
            return null; // val not found
        previous = current;
        current = current.next;
    }
    if (current == first)
        first = first.next;
    else
        previous.next = current.next;
    return current;
}
```

As you can see, this code has a loop that may run N times, because you have to traverse the list to do the removal.

2.) If you choose to make removal a constant time operation, you just reverse these functions; now you have to traverse the list to find the end, and then do some more pointer fix-ups to hook up the list.

This means the way you implement the queue on a linked list matters: no matter which way you choose, you have to deal with one O(n) operation, on either insertion or removal.

If you want a queue whose enqueue and dequeue are both bounded by a constant (in the amortized sense), you can implement the queue using two stacks.
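A minimal sketch of that two-stack idea (the class and names are my own illustration, not from any particular library): enqueue pushes onto an "inbox" stack, and dequeue pops from an "outbox" stack, refilling the outbox from the inbox only when it runs empty. Each element is pushed and popped at most twice, so both operations are amortized O(1).

```python
class TwoStackQueue:
    """Queue built from two stacks; enqueue/dequeue are amortized O(1)."""

    def __init__(self):
        self.inbox = []   # receives enqueued items
        self.outbox = []  # serves dequeued items in FIFO order

    def enqueue(self, val):
        self.inbox.append(val)

    def dequeue(self):
        if not self.outbox:
            # Refill: reversing the inbox restores FIFO order.
            while self.inbox:
                self.outbox.append(self.inbox.pop())
        return self.outbox.pop()
```

Note that a single dequeue can still take O(n) when a refill happens; it is the total cost over a whole sequence of operations that averages out to constant per operation.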

Here is a little chart showing some running times:

Please let me know if you would like some more explanation on this.

### #3

## Re: Time complexity for linked lists

Posted 18 April 2013 - 04:14 AM

medaa, on 18 April 2013 - 12:03 AM, said:

If by two directions you mean that each node knows its predecessor and its successor (as opposed to knowing just one of the two, which would be "one direction"), then yes, that's true.

Quote

If you have a reference to the last node of the list, then you can delete that node by setting its predecessor's successor to None. If the node knows its predecessor, that's just the following code:

```python
def dequeue(this):
    second_to_last_node = this.last_node.predecessor
    second_to_last_node.successor = None
    this.last_node = second_to_last_node
```

If the node doesn't know its predecessor, you'd need to loop through the entire list to find the second to last node.
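For contrast, here is what that full traversal looks like when nodes only know their successor (a sketch using the same `last_node`/`successor` naming as above, plus an assumed `first_node` head reference on the list; it assumes at least two nodes):

```python
def dequeue(this):
    # No predecessor links: walk from the front to find the
    # second-to-last node before unhooking the last one -> O(n).
    node = this.first_node  # assumed head reference
    while node.successor is not this.last_node:
        node = node.successor
    node.successor = None
    this.last_node = node
```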

Quote

The assignment's comparing a queue that's implemented as a singly linked list versus a queue that's implemented as a doubly linked list. You're talking like it's comparing a queue with a linked list. I'm not sure what you mean by that because both kinds of queues that it's comparing are linked lists.

Romeo1, on 18 April 2013 - 02:31 AM, said:

That's not true. When using a *doubly* linked list, both enqueue and dequeue are O(1) - no matter which end of the list you define to be the start of the queue. The direction only matters for singly linked lists.

Also, a singly linked list can implement both enqueue and dequeue in O(1) if we define the beginning of the queue to be the beginning of the list and keep a reference to the last node. That is, adding to the end of a singly linked list and removing from the beginning of a singly linked list can both be implemented in O(1) time.
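A sketch of that singly linked queue (a minimal implementation of my own, not from the thread): keep references to both the first and last node; enqueue links a new node after the last one, and dequeue advances the first.

```python
class Node:
    def __init__(self, val):
        self.val = val
        self.next = None

class LinkedQueue:
    """Singly linked queue: the front of the queue is the head of the list."""

    def __init__(self):
        self.first = None  # dequeue end (head of the list)
        self.last = None   # enqueue end (tail of the list)

    def enqueue(self, val):   # O(1): uses the tail reference
        node = Node(val)
        if self.last is None:
            self.first = self.last = node
        else:
            self.last.next = node
            self.last = node

    def dequeue(self):        # O(1): unlink the head
        node = self.first
        self.first = node.next
        if self.first is None:
            self.last = None
        return node.val
```

No predecessor links are needed, because neither operation ever has to walk backwards through the list.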

This post has been edited by **sepp2k**: 18 April 2013 - 04:18 AM

### #4

## Re: Time complexity for linked lists

Posted 18 April 2013 - 07:19 AM

Take the queue a, b, c, d, where a is at the front of the queue (i.e., the next item to be removed) and d is at the tail (you need four dequeue operations to get to d).

Suppose first that this queue is built on a singly-linked list, and the start of the queue is at the head of the list. What do you have to do to enqueue node e? What nodes do you have to touch? How about for the dequeue operation? Then assume that the queue is built on a doubly-linked list, and ask the same questions.

Again, it helps to draw a picture. (A whiteboard is an invaluable tool for thinking about code.)
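To make the counting concrete, here is a small sketch (my own illustration, not from the thread) that appends e to the singly linked queue a, b, c, d with no tail pointer and reports how many existing nodes were touched:

```python
class Node:
    def __init__(self, val, next=None):
        self.val = val
        self.next = next

def enqueue_counting(head, val):
    """Append val to a singly linked list with no tail pointer,
    returning how many existing nodes were visited."""
    touched = 1
    node = head
    while node.next is not None:  # walk to the current tail
        node = node.next
        touched += 1
    node.next = Node(val)
    return touched

# Build the queue a -> b -> c -> d, with a at the front.
head = Node('a', Node('b', Node('c', Node('d'))))
```

Appending e this way touches all four existing nodes, while dequeueing touches only a; keeping a tail reference (or using a doubly linked list with one) brings the enqueue down to a constant number of touched nodes.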

This post has been edited by **jon.kiparsky**: 18 April 2013 - 07:19 AM

### #5

## Re: Time complexity for linked lists

Posted 18 April 2013 - 04:26 PM

sepp2k, on 18 April 2013 - 04:14 AM, said:


Romeo1, on 18 April 2013 - 02:31 AM, said:

That's not true. When using a *doubly* linked list, both enqueue and dequeue are O(1) - no matter which end of the list you define to be the start of the queue. The direction only matters for singly linked lists.

Also, a singly linked list can implement both enqueue and dequeue in O(1) if we define the beginning of the queue to be the beginning of the list and keep a reference to the last node. That is, adding to the end of a singly linked list and removing from the beginning of a singly linked list can both be implemented in O(1) time.

A doubly linked list does not have constant-time insertion at the end unless you keep a pointer to the last element; without that, all the list has is an extra pointer pointing back down the list, which makes removal easier.

### #6

## Re: Time complexity for linked lists

Posted 18 April 2013 - 05:06 PM

Romeo1, on 19 April 2013 - 01:26 AM, said:

And why wouldn't you do that? I've never seen an implementation of a doubly linked list that did not keep a pointer to the last node.

Note that even the chart you've posted agrees that insertion at the end is O(1) for linked lists.

This post has been edited by **sepp2k**: 18 April 2013 - 05:08 PM

### #7

## Re: Time complexity for linked lists

Posted 18 April 2013 - 05:12 PM

sepp2k, on 18 April 2013 - 05:06 PM, said:

Romeo1, on 19 April 2013 - 01:26 AM, said:

And why wouldn't you do that? I've never seen an implementation of a doubly linked list that did not keep a pointer to the last node.

Note that even the chart you've posted agrees that insertion at the end is O(1) for linked lists.

Yes, my chart assumes a correctly implemented linked list, but these questions say nothing about that, so I took the worst-case scenario.

### #8

## Re: Time complexity for linked lists

Posted 18 April 2013 - 05:15 PM

### #9

## Re: Time complexity for linked lists

Posted 19 April 2013 - 03:34 AM