What are doubly linked lists again?
Doubly linked lists are one of the first data structures taught in any computer science curriculum. Their main advantages are:
- iterating from both ends
- adding an item in constant time
- removing an item in constant time
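A minimal sketch of these operations (the names `Node`, `insert_after`, and `unlink` are my own): once you hold a reference to a node, inserting after it or removing it only touches a constant number of pointers.

```python
class Node:
    """A node of a doubly linked list."""
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None

def insert_after(node, new):
    """Insert `new` right after `node` in O(1)."""
    new.prev = node
    new.next = node.next
    if node.next is not None:
        node.next.prev = new
    node.next = new

def unlink(node):
    """Remove `node` from its list in O(1), given the node itself."""
    if node.prev is not None:
        node.prev.next = node.next
    if node.next is not None:
        node.next.prev = node.prev
    node.prev = node.next = None
```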
Disadvantages of doubly linked lists
However, doubly linked lists require allocating 2 pointers per element, and looping on linked lists is much less cache-efficient than looping on arrays.
Another catch of doubly linked lists is that constant-time operations only work when you already know where the node you want to operate on is. In the figure above, you first need to access the node with value 99 to be able to remove it.
Fortunately, the most common use case is appending or removing at the beginning or the end of the list, and you always keep a pointer to those elements. However, in that case, you actually want a double-ended queue, or deque. It can be efficiently implemented using a circular buffer, and most languages provide a deque in the standard library anyway (Python, C++).
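For instance, Python's `collections.deque` gives O(1) appends and pops at both ends:

```python
from collections import deque

d = deque([2, 3])
d.appendleft(1)       # O(1) push at the front -> deque([1, 2, 3])
d.append(4)           # O(1) push at the back  -> deque([1, 2, 3, 4])
first = d.popleft()   # O(1) pop at the front  -> 1
last = d.pop()        # O(1) pop at the back   -> 4
```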
In a nutshell, if you are considering a doubly linked list as a container, in most cases you should use another data structure, and there is almost no good use case for C++'s std::list, which implements a doubly linked list.
An algorithmic problem
Let’s take a break and consider the following problem:
Given a permutation a[0], ..., a[n-1] of 0, ..., n-1, compute lo[i], the smallest number located after position i and greater than a[i]. For example, a = [1, 0, 4, 5, 2, 3] gives lo = [2, 2, 5, None, 3, None].
An immediate solution is to use an ordered set. Going from right to left, read the number x, output the smallest number in the set greater than x, and add x to the set. This gives a O(n log n) complexity.
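A sketch of this approach (function name is mine) using a sorted list with the standard `bisect` module. Note that `insort` itself is O(n) per insertion; a balanced tree or a sorted-set library would make each step O(log n), matching the stated bound, but the logic is the same.

```python
import bisect

def lo_with_sorted_set(a):
    seen = []                 # elements to the right of i, kept sorted
    lo = [None] * len(a)
    for i in range(len(a) - 1, -1, -1):    # right to left
        x = a[i]
        j = bisect.bisect_right(seen, x)   # index of first element > x
        if j < len(seen):
            lo[i] = seen[j]
        bisect.insort(seen, x)             # add x for positions to the left
    return lo
```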
Let us make an observation: given lo computed both on a and on the reversed a, one can sort a in O(n) (left as an exercise to the reader). If a were not a permutation, we know that sorting it with a comparison-based sort would require at least O(n log n) operations. Therefore, O(n log n) is already the best one can achieve on the generalized problem where a is not a permutation.
A linear-time solution based on doubly linked lists
I claim that this problem can be solved in linear time.
The idea is the following:
- maintain a doubly linked list l of the numbers from 0 to n-1, in increasing order
- remove the elements from l, going from a[0] to a[n-1]
- when removing an element, look at its right pointer in l: that is lo[i]

It works because at every step, the elements present in l are exactly the ones not yet processed, i.e. a[i], ..., a[n-1]. Moreover, l is always ordered, so the right pointer of an element points to the smallest larger element still in the list. Since we remove the elements from left to right, the right pointer of a[i] in l corresponds exactly to the smallest larger element located on the right.
```python
def main(a):
    n = len(a)
    left = [None] + list(range(n - 1))
    right = list(range(1, n)) + [None]
    lo = []
    for i, x in enumerate(a):
        l = left[x]
        r = right[x]
        if l is not None:
            right[l] = r
        if r is not None:
            left[r] = l
        lo.append(r)
    return lo
```
The trick of this solution is that we only ever remove elements from the doubly linked list. This allows us to store it in two plain arrays and obtain constant-time access to any element. At the moment, this is the only example I know of a problem that needs the doubly linked list data structure to achieve the optimal complexity (see below for an update).
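As a sanity check, here is a sketch comparing the linked-list solution (restated so the block runs standalone) against a brute-force restatement of the problem definition on random permutations:

```python
import random

def main(a):
    # restated from the snippet above
    n = len(a)
    left = [None] + list(range(n - 1))
    right = list(range(1, n)) + [None]
    lo = []
    for x in a:
        l, r = left[x], right[x]
        if l is not None:
            right[l] = r
        if r is not None:
            left[r] = l
        lo.append(r)
    return lo

def brute_force(a):
    # direct O(n^2) restatement of the definition of lo
    return [min((y for y in a[i + 1:] if y > a[i]), default=None)
            for i in range(len(a))]

assert main([1, 0, 4, 5, 2, 3]) == [2, 2, 5, None, 3, None]
for _ in range(100):
    a = list(range(8))
    random.shuffle(a)
    assert main(a) == brute_force(a)
```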
The problem MAT from CEOI 2011 uses this trick among others. I encourage you to do it!
Update (20/12/2021): problems that need and don’t need linked lists
Two friends (Christoph and Ismael) reacted to this post, pointing me to other problems where linked lists are useful.
The best known is the Dancing Links technique used in Knuth's Algorithm X, which solves the exact cover problem. Dancing Links is a very useful technique for backtracking easily. However, it is also possible to implement Algorithm X without any linked list, as shown in Ali Assaf's code.
Another common data structure that requires linked lists is the LRU cache, or any other structure that requires reordering existing elements. A good implementation is Python's collections.OrderedDict, which you can find here.
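A minimal LRU cache sketch (class name is mine) on top of `collections.OrderedDict`, which internally keeps a doubly linked list of its keys; `move_to_end` and `popitem(last=False)` are the O(1) reordering operations:

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()   # keys ordered from least to most recently used

    def get(self, key, default=None):
        if key not in self.data:
            return default
        self.data.move_to_end(key)  # O(1): mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # O(1): evict least recently used
```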
Partition refinement is often implemented with linked lists, but it does not have to be: see Eppstein's implementation. However, I think Lex-BFS needs doubly linked lists (on top of a partition refinement data structure) to work. You can find Eppstein's implementation there.