## What are doubly linked lists again?

Doubly linked lists are one of the first data structures taught in any computer science curriculum.

They allow:

- iterating from both ends
- adding an item *in constant time*
- removing an item *in constant time*
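For reference, here is a minimal sketch of such a node and its constant-time removal (illustrative names, not a full container):

```python
class Node:
    """A node of a doubly linked list."""
    def __init__(self, value):
        self.value = value
        self.prev = None
        self.next = None

def remove(node):
    # Unlinking only rewires the two neighbors: O(1),
    # but we must already hold a reference to `node`.
    if node.prev is not None:
        node.prev.next = node.next
    if node.next is not None:
        node.next.prev = node.prev

# Build 1 <-> 2 <-> 3, then drop the middle node.
a, b, c = Node(1), Node(2), Node(3)
a.next, b.prev = b, a
b.next, c.prev = c, b
remove(b)  # now 1 <-> 3
```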

## Disadvantages of doubly linked lists

However, doubly linked lists require allocating two pointers per element, and looping over a linked list is much less cache-efficient than looping over an array.

Another catch of doubly linked lists is that constant-time operations only work when you already know where the node you want to operate on is. In the figure above, you first need to access the node with value `99` to be able to remove it.

Fortunately, the most common use case is appending or removing at the beginning or the end of the list, and you always keep a pointer to those elements. However, in that case, what you actually want is a double-ended queue, or *deque*. It can be efficiently implemented with a circular buffer, and most languages provide a deque in their standard library anyway (Python's `collections.deque`, C++'s `std::deque`).
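As a quick sketch, Python's `collections.deque` covers that use case directly, with constant-time operations at both ends:

```python
from collections import deque

d = deque([1, 2, 3])
d.appendleft(0)      # O(1) push at the front
d.append(4)          # O(1) push at the back
front = d.popleft()  # O(1) pop at the front
back = d.pop()       # O(1) pop at the back
```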

In a nutshell, if you are considering a doubly linked list as a container, in most cases you should use another data structure, and there is *almost* no good use case for C++'s `std::list`, which implements a doubly linked list.

## An algorithmic problem

Let's take a break and consider the following problem:

Given a permutation `a[0], ..., a[n-1]` of `0, ..., n-1`, compute `lo[i]`, the smallest number located after position `i` and greater than `a[i]`, for all `i`.

For example, `a = [1, 0, 4, 5, 2, 3]` gives `lo = [2, 2, 5, None, 3, None]`.

An immediate solution is to use an ordered set. Going from right to left, read the number `x`, output the smallest number in the set greater than `x`, and add `x` to the set. This gives `O(n log n)` complexity.
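A sketch of this solution follows. Python's standard library has no balanced ordered set, so this version keeps a sorted list with `bisect`; insertion into a list is `O(n)`, so a balanced tree (e.g. the third-party `sortedcontainers.SortedList`) would be needed to actually reach `O(n log n)`:

```python
import bisect

def lo_with_ordered_set(a):
    seen = []              # sorted list of the numbers read so far
    lo = [None] * len(a)
    for i in range(len(a) - 1, -1, -1):
        # First element of `seen` strictly greater than a[i], if any.
        j = bisect.bisect_right(seen, a[i])
        if j < len(seen):
            lo[i] = seen[j]
        bisect.insort(seen, a[i])
    return lo

lo_with_ordered_set([1, 0, 4, 5, 2, 3])  # → [2, 2, 5, None, 3, None]
```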

Let us make an observation: given `lo` for `a` and for the reversed `a`, one can sort `a` in `O(n)` (left as an exercise to the reader).
If `a` were not a permutation, we know that sorting it with a comparison-based sort would require at least `O(n log n)` operations.
Hence, `O(n log n)` is the best one can achieve on the generalized problem where `a` is not a permutation.
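One way to see the reduction (a sketch; the function names are mine): `lo` on the reversed array gives, for each position, the smallest larger value *before* it, so the overall successor of `a[i]` in sorted order is the smaller of the two `lo` values, and following successors from the minimum yields the sorted sequence in `O(n)`:

```python
def sort_from_lo(a, lo_right, lo_left):
    # lo_right[i]: smallest value after position i greater than a[i]
    # lo_left[i]:  smallest value before position i greater than a[i]
    #              (obtained by running lo on the reversed array)
    succ = {}
    for i, x in enumerate(a):
        candidates = [v for v in (lo_right[i], lo_left[i]) if v is not None]
        succ[x] = min(candidates) if candidates else None
    out = [min(a)]  # one O(n) scan for the minimum
    while succ[out[-1]] is not None:
        out.append(succ[out[-1]])
    return out
```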

## A linear-time solution based on doubly linked lists

I claim that this problem can be solved in linear time.

The idea is the following:

- maintain a doubly linked list `l` of the numbers from `0` to `n-1`
- remove the elements of `a` from `l`, going from `a[0]` to `a[n-1]`
- when removing an element, look at its right pointer in `l`
It works because at every step, the elements still present in `l` are exactly the elements of `a` not yet removed. `l` is always ordered, so the right pointer of an element points to the smallest larger element remaining. Since we remove the elements from left to right, the right pointer read at the moment of removal corresponds exactly to the smallest larger element located after position `i`.
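The steps above can be traced on the earlier example. This sketch simulates `l` as a plain sorted Python list, which makes each removal `O(n)`; the point of the linked list in the next section is to make each step `O(1)`:

```python
a = [1, 0, 4, 5, 2, 3]
l = [0, 1, 2, 3, 4, 5]  # the ordered list of remaining elements
lo = []
for x in a:
    j = l.index(x)
    # The right neighbor of x in l is the smallest remaining
    # element greater than x.
    lo.append(l[j + 1] if j + 1 < len(l) else None)
    l.remove(x)
print(lo)  # [2, 2, 5, None, 3, None]
```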

## Implementation

```python
def main(a):
    n = len(a)
    # left[x] and right[x] are the neighbors of x in a doubly linked
    # list containing 0, ..., n-1 in increasing order.
    left = [None] + list(range(n - 1))
    right = list(range(1, n)) + [None]
    lo = []
    for x in a:
        l = left[x]
        r = right[x]
        # Unlink x in constant time.
        if l is not None:
            right[l] = r
        if r is not None:
            left[r] = l
        # x's right neighbor is the smallest remaining element
        # greater than x.
        lo.append(r)
    return lo
```

## Conclusion

The trick of this solution is to only *remove* elements from the doubly linked list.
This allows storing the list in two arrays and gives constant-time access to any element.
At the moment, this is the only example I know of a problem that needs the doubly linked list data structure to achieve the optimal complexity.

The problem MAT from CEOI 2011 uses this trick among others. I encourage you to try it!