6 Queues

Everyone is familiar with waiting in line. Whether you are in line to buy tickets to your favorite movie or waiting for a printer to print a file, these real-life scenarios mimic the queue data structure.

Queues use FIFO (first-in-first-out) ordering, meaning the first element that was added will always be the first to be removed. Queues are handy when you need to maintain the order of your elements to process later.

In this chapter, you’ll learn all the common operations of a queue, go over various ways to implement a queue, and look at the time complexity of each approach.

Common Operations

The following interface defines what a queue needs to do:

abstract class Queue<E> {
  bool enqueue(E element);
  E? dequeue();
  bool get isEmpty;
  E? get peek;
}

These are the meanings of the core operations:

  • enqueue: Insert an element at the back of the queue. Return true if the operation was successful.
  • dequeue: Remove the element at the front of the queue and return it.
  • isEmpty: Check if the queue is empty.
  • peek: Return the element at the front of the queue without removing it.

Notice that the queue only cares about removal from the front and insertion at the back. You don’t need to know what the contents are in between. If you did, you would probably just use a list.

Go ahead and open the starter project. Add a file called queue.dart to the lib folder. Then add the Queue abstract class from the beginning of this section to the top of the file. You’ll use this interface later in the chapter when implementing a queue.

Note

Normally it doesn’t matter if you start with any fresh Dart project, but in this chapter, the starter project contains some additional data structure classes that you’ll use later on. So you’ll have an easier time if you actually do use the starter project. If you’re using DartPad, you’ll need to copy those classes to the bottom of the code window.

Example of a Queue

The easiest way to understand how a queue works is to see an example. Imagine a group of people waiting in line to buy a movie ticket.

A queue of people waiting in line: Ray is at the front and Mic is at the back. New arrivals enqueue at the back while the person at the front dequeues; isEmpty is false.

The queue currently holds Ray, Brian, Sam and Mic. Once Ray has received his ticket, he moves out of the line. By calling dequeue, Ray is removed from the front of the queue.

Calling peek will return Brian since he is now at the front of the line.

Now comes Vicki, who just joined the line to buy a ticket. By calling enqueue('Vicki'), Vicki gets added to the back of the queue.
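
Here is how that same sequence of events looks in code, using the QueueList implementation you'll build later in this chapter:

final queue = QueueList<String>();
queue.enqueue('Ray');
queue.enqueue('Brian');
queue.enqueue('Sam');
queue.enqueue('Mic');

queue.dequeue();        // Ray gets his ticket and leaves the front.
queue.peek;             // Brian is now at the front of the line.
queue.enqueue('Vicki'); // Vicki joins at the back.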

In the following sections, you’ll learn to create a queue using four different internal data structures:

  • List
  • Doubly linked list
  • Ring buffer
  • Two stacks

List-Based Implementation

The Dart core library comes with a set of highly optimized data structures that you can use to build higher-level abstractions. One of them that you’re already familiar with is List, the data structure that stores a contiguous, ordered collection of elements. In this section, you’ll use a list to create a queue.

A simple Dart list can be used to model the queue, with Ray at the front and Sam at the back.

In lib/queue.dart, add the following code below your Queue interface:

class QueueList<E> implements Queue<E> {
  final _list = <E>[];

  @override
  bool enqueue(E element) => throw UnimplementedError();

  @override
  E? dequeue() => throw UnimplementedError();

  @override
  bool get isEmpty => throw UnimplementedError();

  @override
  E? get peek => throw UnimplementedError();
}

This sets up a private list to hold the elements of the queue. You’ve also added the methods required by the Queue interface that you defined earlier. Trying to access them now would throw an UnimplementedError, but you’ll implement them in the following sections.

Leveraging Lists

Replace isEmpty and peek in your QueueList with the following:

@override
bool get isEmpty => _list.isEmpty;

@override
E? get peek => (isEmpty) ? null : _list.first;

Using the features of List, you get the following for free:

  1. Check if the queue is empty.
  2. Return the element at the front of the queue, or null if the queue is empty.

These operations are both O(1).

Enqueue

Enqueuing an element at the back of the queue is easy. Just add an element to the list. Replace enqueue with the following:

@override
bool enqueue(E element) {
  _list.add(element);
  return true;
}

Enqueuing an element is, on average, an O(1) operation. This is because the list has empty space at the back.

Enqueuing Mic at the back of a list that already holds Ray, Brian and Sam.

In the example above, notice that, once you add Mic, the list has two empty spaces.

After adding multiple elements, the list will eventually be full. When you want to use more than the allocated space, the list must resize to make additional room.

The list is full with Ray, Brian, Sam, Mic, Vicki and Greg, so enqueuing Eric forces the list to resize.

As a review of what you learned in an earlier chapter, appending to a list is an O(1) operation even though resizing is an O(n) operation. Resizing, after all, requires the list to allocate new memory and copy all existing data over to the new list. The key is that this doesn't happen very often because the capacity doubles each time the list runs out of space. If you work out the amortized cost of the operation (the average cost per append), enqueuing is only O(1). That said, the worst-case performance is O(n), which occurs when a copy is performed.
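
To see why the amortized cost works out, consider appending n elements to a list whose capacity doubles whenever it fills up. Copies only happen at the doubling points, so the total number of element copies is 1 + 2 + 4 + ... up to the last power of two below n, which is less than 2n in total, or fewer than two copies per append on average. The sketch below simulates this bookkeeping; it's a hypothetical model of the doubling strategy, not Dart's actual List internals, which don't expose their capacity:

void main() {
  var capacity = 1;
  var length = 0;
  var totalCopies = 0;
  const appends = 1000000;

  for (var i = 0; i < appends; i++) {
    if (length == capacity) {
      // Simulate a resize: copy every existing element to the new storage.
      totalCopies += length;
      capacity *= 2;
    }
    length++;
  }

  print('Total copies: $totalCopies');                  // 1048575
  print('Copies per append: ${totalCopies / appends}'); // About 1.05
}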

Dequeue

Removing an item from the front requires a bit more work. Replace dequeue with the following:

@override
E? dequeue() => (isEmpty) ? null : _list.removeAt(0);

If the queue is empty, dequeue simply returns null. If not, it removes the element from the front of the list and returns it.

Dequeuing Ray shifts Brian, Sam and Mic one position toward the front of the list.

Removing an element from the beginning of a list is always a linear-time operation because it requires all the remaining elements in the list to be shifted in memory.

Testing the List-Based Implementation

Add a new method to override toString in QueueList so that you can see the results of your operations:

@override
String toString() => _list.toString();

Then open bin/starter.dart and add the following code to main:

final queue = QueueList<String>();
queue.enqueue('Ray');
queue.enqueue('Brian');
queue.enqueue('Eric');
print(queue);

queue.dequeue();
print(queue);

queue.peek;
print(queue);

You’ll need to import your queue.dart file:

import 'package:starter/queue.dart';

Change starter to your own project name if you're using something other than the starter project that came with this book.

Run the code and you should see the following in the console:

[Ray, Brian, Eric]
[Brian, Eric]
[Brian, Eric]

This code puts Ray, Brian and Eric in the queue, then removes Ray and peeks at Brian but doesn’t remove him.

Performance

Here’s a summary of the algorithmic and storage complexity of the list-based queue implementation:

Operations          Average case    Worst case
enqueue             O(1)            O(n)
dequeue             O(n)            O(n)
Space Complexity    O(n)            O(n)

List-Based Queue

You’ve seen how easy it is to implement a list-based queue by leveraging a Dart List. Enqueue is on average very fast, thanks to the O(1) add operation. There are some shortcomings to the implementation, though. Removing an item from the front of the queue can be inefficient, as removal causes all elements to shift by one. This makes a difference for very large queues. Once the list gets full, it has to resize and may have unused space. This could increase your memory footprint over time.

Is it possible to address these shortcomings? Compare this one to the linked-list-based implementation in the next section.

Doubly Linked List Implementation

In the lib folder, you’ll find a file called doubly_linked_list.dart that contains a DoublyLinkedList class. You should already be familiar with linked lists from Chapter 5, “Linked Lists”. A doubly linked list is simply a linked list in which nodes also contain a reference to the previous node.

Note

Feel free to use the singly linked list you made in Chapter 5 if you prefer. Just remember the performance characteristics, though. Since removeLast is O(n), you should avoid using that method. However, you can enqueue with append and dequeue with pop, both of which are O(1). For a doubly linked list it doesn’t matter as much since removeLast is also O(1).
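
If you do go the singly-linked-list route, a minimal sketch could look like the one below. It assumes the Chapter 5 LinkedList exposes append, pop, isEmpty and a head node, and it's only an alternative to the doubly-linked-list version you're about to build:

class QueueSinglyLinkedList<E> implements Queue<E> {
  final _list = LinkedList<E>();

  @override
  bool enqueue(E element) {
    // append adds a node at the tail in O(1) time.
    _list.append(element);
    return true;
  }

  // pop removes the head node in O(1) time.
  @override
  E? dequeue() => _list.pop();

  @override
  bool get isEmpty => _list.isEmpty;

  @override
  E? get peek => _list.head?.value;
}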

Start by adding a generic QueueLinkedList to queue.dart as shown below.

First, import doubly_linked_list.dart at the top of the file:

import 'doubly_linked_list.dart';

Then, add the following code after the QueueList class.

class QueueLinkedList<E> implements Queue<E> {
  final _list = DoublyLinkedList<E>();

  @override
  bool enqueue(E element) => throw UnimplementedError();

  @override
  E? dequeue() => throw UnimplementedError();

  @override
  bool get isEmpty => throw UnimplementedError();

  @override
  E? get peek => throw UnimplementedError();
}

This implementation is similar to QueueList, but instead of using List, the internal data structure is DoublyLinkedList. Take a minute to browse the source code of DoublyLinkedList and compare it to the LinkedList you made earlier.

Next, you’ll start implementing the methods of the Queue interface.

Enqueue

To add an element to the back of the queue, simply replace enqueue with the following:

@override
bool enqueue(E element) {
  _list.append(element);
  return true;
}

Behind the scenes, the doubly linked list will update the tail node’s previous and next references to the new node. This is an O(1) operation.

Enqueuing Vicki appends a new node after Mic and updates the tail of the doubly linked list.

Dequeue

To remove an element from the queue, replace dequeue with the following:

@override
E? dequeue() => _list.pop();

pop does exactly what you want dequeue to do, so you can use it directly.

Removing from the front of a linked list is also an O(1) operation. Compared to the List implementation, you don’t have to shift elements one by one. Instead, as shown in the diagram below, you simply update the pointers for the first two nodes of the linked list:

Dequeuing Ray updates the head pointer so that Brian becomes the new head node.

Checking the State of a Queue

Similar to the List implementation, you can implement peek and isEmpty using the properties of DoublyLinkedList.

Replace isEmpty and peek with the following:

@override
bool get isEmpty => _list.isEmpty;

@override
E? get peek => _list.head?.value;

Testing the Linked-List-Based Implementation

Override toString in QueueLinkedList so that you can see the results of your operations:

@override
String toString() => _list.toString();

Then replace the contents of main with the following code:

final queue = QueueLinkedList<String>();
queue.enqueue('Ray');
queue.enqueue('Brian');
queue.enqueue('Eric');
print(queue); // [Ray, Brian, Eric]

queue.dequeue();
print(queue); // [Brian, Eric]

queue.peek;
print(queue); // [Brian, Eric]

Run the code and you’ll see the same results as your QueueList implementation:

[Ray, Brian, Eric]
[Brian, Eric]
[Brian, Eric]

Performance

Here is a summary of the algorithmic and storage complexity of the doubly-linked-list-based queue implementation.

Operations          Average case    Worst case
enqueue             O(1)            O(1)
dequeue             O(1)            O(1)
Space Complexity    O(n)            O(n)

Linked-List-Based Queue

One of the main problems with QueueList was that dequeuing an item took linear time. With the linked list implementation, you reduced it to a constant operation, O(1). All you needed to do was update the node’s pointers.

The main weakness with QueueLinkedList is not apparent from the table. Despite O(1) performance, it suffers from high overhead. Each element has to have extra storage for the forward and back references. Moreover, every time you create a new element, it requires a relatively expensive dynamic allocation of memory for the new node. By contrast, QueueList does bulk allocation, which is faster.

Can you eliminate allocation overhead and maintain O(1) dequeues? If you don’t have to worry about your queue ever growing beyond a maximum fixed size, then the answer is yes! What you are looking for is a ring buffer. For example, you might have a game of Monopoly with four players. You can use a ring-buffer-based queue to keep track of whose turn is coming up next. You’ll take a look at the ring buffer implementation next.

Ring Buffer Implementation

A ring buffer, also known as a circular buffer, is a fixed-size list. This data structure strategically wraps around to the beginning when there are no more items to remove at the end.

Example

What follows is a simple example of how a queue can be implemented using a ring buffer:

An empty ring buffer: the read and write pointers both start at the first slot.

You first create a ring buffer that has a fixed size of 4. The ring buffer has two pointers that keep track of two things:

  1. The read pointer keeps track of the front of the queue.
  2. The write pointer keeps track of the next available slot so that you can overwrite existing elements that have already been read.

The image below shows the read and write pointers after you enqueue an item:

After enqueuing Chris, the write pointer is one slot ahead of the read pointer.

Each time that you add an item to the queue, the write pointer increments by one. Here is what it looks like after adding a few more elements:

After also enqueuing Pablo and Manda, the write pointer is three slots ahead of the read pointer.

Notice that the write pointer moved two more spots and is ahead of the read pointer. This means that the queue is not empty.

Next, dequeue two items:

After dequeuing twice, the read pointer has advanced past Chris and Pablo.

Dequeuing is the equivalent of reading a ring buffer. Notice how the read pointer moved twice.

Now, enqueue one more item:

After enqueuing Vicki, the write pointer wraps around to the start of the buffer.

Since the write pointer reached the end, it simply wraps around to the starting index again. This is why the data structure is known as a circular buffer.

Finally, dequeue the two remaining items:

After dequeuing Manda and Vicki, the read pointer wraps around and meets the write pointer; the queue is empty.

The read pointer wraps to the beginning, as well. Whenever the read and write pointers are at the same index, that means the queue is empty.

Note

You might wonder what happens if you enqueue too many items and the write pointer loops around and catches up with the read pointer. One option is to throw an error. Alternatively, you could overwrite the unread data and push the read pointer ahead of the write pointer. The ring buffer in the starter project implements the first option, but you could modify it for the second option if losing data at the front of the queue is acceptable in your application.

Implementation

Now that you have a better understanding of how ring buffers can be used to make a queue, it’s time to implement one!

You’ll find a file called ring_buffer.dart in the lib folder of the starter project. This file includes the RingBuffer implementation that you’ll use in the next section.

For the adventurous: If you’d like a little extra challenge before proceeding, try to implement a ring buffer yourself based on the description above. Check the source code in the starter project if you get stuck.
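
If you'd like a reference point, here's a minimal ring buffer sketch that matches the description above and the API used by QueueRingBuffer in the next section (write, read, peek, isEmpty and isFull). The RingBuffer in the starter project may be implemented differently:

class RingBuffer<E> {
  RingBuffer(int length)
      : _list = List.filled(length, null, growable: false);

  final List<E?> _list;
  int _writeIndex = 0;
  int _readIndex = 0;
  int _size = 0;

  bool get isEmpty => _size == 0;
  bool get isFull => _size == _list.length;

  // The element at the front of the queue, without removing it.
  E? get peek => isEmpty ? null : _list[_readIndex];

  // Store an element at the write pointer and advance it, wrapping around.
  void write(E element) {
    if (isFull) throw Exception('Buffer is full');
    _list[_writeIndex] = element;
    _writeIndex = (_writeIndex + 1) % _list.length;
    _size++;
  }

  // Remove and return the element at the read pointer, advancing it.
  E? read() {
    if (isEmpty) return null;
    final element = _list[_readIndex];
    _list[_readIndex] = null;
    _readIndex = (_readIndex + 1) % _list.length;
    _size--;
    return element;
  }

  @override
  String toString() {
    final values = <E>[];
    for (var i = 0; i < _size; i++) {
      values.add(_list[(_readIndex + i) % _list.length] as E);
    }
    return values.toString();
  }
}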

Add a generic QueueRingBuffer to queue.dart as shown below:

import 'ring_buffer.dart';

class QueueRingBuffer<E> implements Queue<E> {
  QueueRingBuffer(int length)
    : _ringBuffer = RingBuffer<E>(length);

  final RingBuffer<E> _ringBuffer;

  @override
  bool enqueue(E element) => throw UnimplementedError();

  @override
  E? dequeue() => throw UnimplementedError();

  @override
  bool get isEmpty => _ringBuffer.isEmpty;

  @override
  E? get peek => _ringBuffer.peek;
}

There are a couple of points to pay attention to:

  • You must include a length parameter since the ring buffer has a fixed size.
  • isEmpty and peek are already implemented. Both of these are O(1) operations.

You’ll implement enqueue and dequeue in the following sections.

Enqueue

Replace enqueue with the method below:

@override
bool enqueue(E element) {
  if (_ringBuffer.isFull) {
    return false;
  }
  _ringBuffer.write(element);
  return true;
}

To append an element to the queue, you simply call write on the _ringBuffer. This increments the write pointer by one.

Since the queue has a fixed size, you must now return true or false to indicate whether the element has been successfully added. enqueue is still an O(1) operation.

Dequeue

Next replace dequeue with the following:

@override
E? dequeue() => _ringBuffer.read();

To remove an item from the front of the queue, you simply call read on the _ringBuffer. Behind the scenes, it checks if the ring buffer is empty and, if so, returns null. If not, it returns the item at the read index of the buffer and then increments the index by one.

Testing the Ring-Buffer-Based Implementation

Override toString in QueueRingBuffer so that you can see the results of your operations:

@override
String toString() => _ringBuffer.toString();

This creates a string representation of the queue by delegating to the underlying ring buffer.

That’s all there is to it! Test your ring-buffer-based queue by running the following code in main:

final queue = QueueRingBuffer<String>(10);
queue.enqueue("Ray");
queue.enqueue("Brian");
queue.enqueue("Eric");
print(queue); // [Ray, Brian, Eric]

queue.dequeue();
print(queue); // [Brian, Eric]

queue.peek;
print(queue); // [Brian, Eric]

This test code works just like the previous examples, dequeuing Ray and peeking at Brian.

Performance

How does the ring-buffer implementation compare? Have a look at a summary of the algorithmic and storage complexity.

Operations          Average case    Worst case
enqueue             O(1)            O(1)
dequeue             O(1)            O(1)
Space Complexity    O(n)            O(n)

Ring-Buffer-Based Queue

The ring-buffer-based queue has the same time complexity for enqueue and dequeue as the linked-list implementation. The space complexity for a ring-buffer-based queue, while still O(n), doesn’t require new memory allocation internally when enqueuing new elements like the linked-list implementation does. However, the ring buffer has a fixed size, which means that enqueue can fail.

So far, you’ve seen three implementations of a queue: a simple list, a doubly linked list and a ring buffer. These were all useful in their own ways, but next you’ll make a queue implemented with two stacks. Its spatial locality is superior to the linked list, and it also doesn’t need a fixed size like a ring buffer.

Double-Stack Implementation

Add a generic QueueStack to queue.dart as shown below:

class QueueStack<E> implements Queue<E> {
  final _leftStack = <E>[];
  final _rightStack = <E>[];

  @override
  bool enqueue(E element) => throw UnimplementedError();

  @override
  E? dequeue() => throw UnimplementedError();

  @override
  bool get isEmpty => throw UnimplementedError();

  @override
  E? get peek => throw UnimplementedError();
}

You’re using lists rather than the Stack class that you made in Chapter 4, “Stacks”. The reason for this is that you’re going to leverage a few List properties that your Stack doesn’t currently have: first, last and reversed.

The idea behind using two stacks is simple. Whenever you enqueue an element, it goes in the right stack.

Elements 1, 2 and 3 are enqueued onto the right stack; the left stack, where dequeuing happens, is empty.

When you need to dequeue an element, you reverse the right stack, place it in the left stack, and remove the top element.

To dequeue, the right stack is reversed into the left stack, so element 1 ends up on top of the left stack.

This reversing operation is only required when the left stack is empty, making most enqueue and dequeue operations constant-time.

Leveraging Lists

Implement the common features of a queue, starting with the following:

@override
bool get isEmpty => _leftStack.isEmpty && _rightStack.isEmpty;

To check if the queue is empty, simply check that both the left and right stacks are empty. This means that there are no elements left to dequeue and no new elements have been enqueued.

Next, replace peek with the following:

@override
E? get peek => _leftStack.isNotEmpty
    ? _leftStack.last
    : _rightStack.first;

You know that peeking looks at the top element. If the left stack is not empty, the element on top of this stack is at the front of the queue. If the left stack is empty, the right stack will be reversed and placed in the left stack. In this case, the element at the bottom of the right stack is next in the queue.

Note that the two properties isEmpty and peek are still O(1) operations.

Enqueue

Next replace enqueue with the method below:

@override
bool enqueue(E element) {
  _rightStack.add(element);
  return true;
}

Recall that the right stack is used to enqueue elements.

You simply push to the stack by appending to the list. From implementing the QueueList previously, you know that appending an element is an O(1) operation.

Enqueuing 4 pushes it onto the right stack while the earlier elements wait on the left stack.

Dequeue

Removing an item in a two-stack-based implementation of a queue is tricky. Add the following method:

@override
E? dequeue() {
  if (_leftStack.isEmpty) {
    // 1
    _leftStack.addAll(_rightStack.reversed);
    // 2
    _rightStack.clear();
  }
  if (_leftStack.isEmpty) return null;
  // 3
  return _leftStack.removeLast();
}

The following explanations refer to the numbered comments in the code above:

  1. If the left stack is empty, set it as the reverse of the right stack:

Dequeue step 1: the contents of the right stack are reversed and moved onto the left stack.

  2. Invalidate your right stack. Since you have transferred everything to the left, just clear the right:

Dequeue step 2: the right stack is cleared; all the elements now live on the left stack.

  3. Remove the last element from the left stack:

Dequeue step 3: the element on top of the left stack, the front of the queue, is removed.

Remember, you only transfer the elements in the right stack when the left stack is empty!

Note

Yes, reversing the contents of a list is an O(n) operation. However, the overall dequeue cost is still amortized O(1). Imagine having a large number of items in both the left and right stacks. The reverse copy is only required infrequently when the left stack becomes empty.

Testing the Double-Stack-Based Implementation

As usual, override toString in QueueStack so that you can print the results:

@override
String toString() {
  final combined = [
    ..._leftStack.reversed,
    ..._rightStack,
  ].join(', ');
  return '[$combined]';
}

Here, you simply combine the reverse of the left stack with the right stack using the spread operator.

Try out the double-stack implementation:

final queue = QueueStack<String>();
queue.enqueue("Ray");
queue.enqueue("Brian");
queue.enqueue("Eric");
print(queue); // [Ray, Brian, Eric]

queue.dequeue();
print(queue); // [Brian, Eric]

queue.peek;
print(queue); // [Brian, Eric]

Just like all of the examples before, this code enqueues Ray, Brian and Eric, dequeues Ray and then peeks at Brian.

Performance

Here is a summary of the algorithmic and storage complexity of your two-stack-based implementation.

Operations          Average case    Worst case
enqueue             O(1)            O(n)
dequeue             O(1)            O(n)
Space Complexity    O(n)            O(n)

Double-Stack-Based Queue

Compared to the list-based implementation, by leveraging two stacks, you were able to transform dequeue into an amortized O(1) operation.

Moreover, your two-stack implementation is fully dynamic and doesn’t have the fixed size restriction that your ring-buffer-based queue implementation has. Worst-case performance is O(n) when the right stack needs to be reversed or runs out of capacity. Running out of capacity doesn’t happen very often thanks to the fact that Dart doubles the capacity every time it fills up.

Finally, it beats the linked list in terms of spatial locality. This is because list elements are next to each other in memory blocks. So a large number of elements will be loaded in a cache on first access. Even though a list requires O(n) for simple copy operations, it’s a very fast O(n) happening close to memory bandwidth.

Compare the two images below:

A list has its data stored contiguously in memory.

The data for a linked list, on the other hand, could be all over the place. This non-locality could lead to more cache misses, which will increase access time.

Challenges

Think you have a handle on queues? In this section, you’ll explore four different problems related to queues. They’ll serve to solidify your fundamental knowledge of data structures in general. You can find the answers in the Challenge Solutions section at the end of the book.

Challenge 1: Stack vs. Queue

Explain the difference between a stack and a queue. Provide two real-life examples for each data structure.

Challenge 2: Step-by-Step Diagrams

Given the following queue where the left is the front of the queue and the right is the back:

SPEED

Provide step-by-step diagrams showing how the following series of commands affects the queue internally:

queue.enqueue('D');
queue.enqueue('A');
queue.dequeue();
queue.enqueue('R');
queue.dequeue();
queue.dequeue();
queue.enqueue('T');

Do this for each of the following queue implementations:

  1. List
  2. Linked list
  3. Ring buffer
  4. Double stack

Assume that the list and ring buffer have an initial size of 5.

Challenge 3: Whose Turn Is It?

Imagine that you are playing a game of Monopoly with your friends. The problem is that everyone always forgets whose turn it is! Create a Monopoly organizer that always tells you whose turn it is. Below is an extension method that you can implement:

extension BoardGameManager<E> on QueueRingBuffer<E> {
  E? nextPlayer() {
    // TODO
  }
}

Challenge 4: Double-Ended Queue

A double-ended queue, better known as a deque, is, as its name suggests, a queue where elements can be added or removed from the front or the back.

  • A queue (FIFO order) allows you to add elements to the back and remove from the front.
  • A stack (LIFO order) allows you to add elements to the back and remove from the back.

A deque can be considered both a queue and a stack at the same time.

Your challenge is to build a deque. Below is a simple Deque interface to help you build your data structure. The enum Direction describes whether you are adding or removing an element from the front or back of the deque. You can use any data structure you prefer to construct a Deque.

enum Direction {
  front,
  back,
}

abstract class Deque<E> {
  bool get isEmpty;
  E? peek(Direction from);
  bool enqueue(E element, Direction to);
  E? dequeue(Direction from);
}

Key Points

  • Queue takes a FIFO strategy: an element added first must also be removed first.
  • Enqueue adds an element to the back of the queue.
  • Dequeue removes the element at the front of the queue.
  • Elements in a list are laid out in contiguous memory blocks, whereas elements in a linked list are more scattered with the potential for cache misses.
  • A ring-buffer-based implementation is good for queues with a fixed size.
  • Compared to a single list-based implementation, leveraging two stacks improves the dequeue time complexity to an amortized O(1) operation.
  • The double-stack implementation beats the linked-list implementation in terms of spatial locality.