Computer Science / Software Engineering Notes Network

Algorithmics

Good luck today!🍀

Matthew Barnes & Stanley Modrák (last revised 2021)

Contents

Know how long a program takes
Travelling Salesperson Problem
Sorting
Running time
Time complexity
Functions comparison
Big-O (O)
Big-Omega (Ω)
Big-Theta (ϴ)
Disadvantages of Big-Theta Notation
Declare your intentions (not your actions)
ADTs
Stacks
Queues
Priority queues
Lists
Sets
Maps
Linked lists
Point to where you are going: links
Contiguous data structures
Arrays
Variable-length Arrays
Non-contiguous data structures
Linked lists
Singly Linked
Doubly Linked
Skip Lists
Recurse!
What is recursion (Divide-and-Conquer)?
Induction
Factorial
Analysis
Integer power
Analysis
GCD
Fibonacci sequence
Towers of Hanoi
Analysis
Cost
Make friends with trees
Trees
Levels
Binary trees
Implementation
Binary Search Trees
Comparable interface
Search
Sets
Tree iterators
Keep trees balanced
Tree depth
Complete trees
Rotations
AVL trees
Performance
Red-black trees
Performance
TreeSet
TreeMap
Sometimes it pays not to be binary
Multiway trees
B-Trees
B+ Trees
Tries (Prefix Trees)
Suffix trees
References
Use heaps!
Heaps
Addition
Removal
Array implementation
Implementation
Time complexity
Priority queues
Heap Sort
Standard Heap Sort
Other heaps
Make a hash of it
Hash functions
Collision resolution
Separate chaining
Open addressing/hashing
Linear probing
Quadratic probing
Double hashing
Removing items
Lazy remove
The Union-Find Problem
Example: dynamic connectivity
Equivalence classes, relations
Union-Find
Quick-Find
Implementation
Quick-Union
Implementation
Improvements
1. Weighted Quick-Union
2. Path compression
Use them together
Analyse!
Algorithm Analysis
Search
Sequential search
Binary search
Simple Sort
Bubble Sort
Insertion Sort
Selection Sort
Lower Bound
Sort wisely
In-place and Stable
In-place
Stable
Merge sort
Merge Sort with Insertion Sort
Quick sort
Quicksort with Insertion Sort
Shuffling
Knuth/Fisher-Yates Shuffle
Partitioning
Lomuto scheme
Hoare scheme
Multi-pivoting
Dual-pivoting
3-pivot
Radix Sort
Radix Sort with Counting Sort
Counting Sort
Think graphically
Graph Theory
Graphs
Connected graphs
Paths, walks and cycles
Trees
Planar graphs
Simple graphs
Terminology & theorems
Vertex degree
Trees
Applications of Graph Theory
Implementations
Adjacency matrix
Adjacency list
Graph traversal
Breadth First Search (BFS)
Depth First Search (DFS)
Walk enumeration
Number of △s in an undirected graph
Degree matrix
Laplacian matrix
Famous Graph Theory problems
Shortest path problems
Minimum spanning tree problems
Graph partitioning
Graph isomorphism
Vertex cover
Graph colouring
Eulerian graphs
Determination
Fleury’s algorithm
The Seven Bridges of Königsberg problem
Hamiltonian graphs
Learn to traverse graphs
Depth First Search
Breadth First Search
Graph (greedy) algorithms
Minimum Spanning Tree
Prim’s Algorithm
Kruskal’s Algorithm
Shortest Path
Dijkstra’s Algorithm
Bellman-Ford Algorithm
Shortest Path on DAGs
Dynamic programming
Dynamic Programming
Principle of Optimality
Applications
Dijkstra’s Algorithm
Proof that shortest path problems have principle of optimality
Edit Distance
Limitation
Know how to search
Search Trees
Backtracking
Branch and Bound
Search in AI
Settle for good solutions
Heuristic Search
Constructive algorithms
Neighbourhood search
Simulated Annealing
Evolutionary Algorithms
Single-point crossover
Multi-point crossover
Uniform crossover
Bit simulated crossover
Galinier and Hao’s crossover operator
Tabu search
Use linear programming
What is linear programming
The three conditions
Example
Maximise function
Constraint 1
Constraint 2
Constraint 3
Solve the problem
Direction that minimises the cost and the constraints
Normal form
Solving linear programs
Simplex Method
Basic Feasible Solutions (not needed for exam, but interesting)

Know how long a program takes

Travelling Salesperson Problem

Sorting

Running time

Time complexity

Syntax      Description
1           Constant
log(n)      Logarithmic
n           Linear
n log(n)    Quasilinear / log-linear
n²          Quadratic
n^k         Polynomial
k^n         Exponential
n!          Factorial

Functions comparison

Big-O (O)

Big-Omega (Ω)

Big-Theta (ϴ)

Disadvantages of Big-Theta Notation

Declare your intentions (not your actions)

ADTs

Stacks

Queues

Priority queues

Lists

Sets

Maps

Linked lists

Point to where you are going: links

Contiguous data structures

Arrays

Variable-length Arrays

Non-contiguous data structures

Linked lists

Singly Linked

Doubly Linked


Skip Lists

Recurse!

What is recursion (Divide-and-Conquer)?

Induction

Factorial

Analysis

T(n) = T(n-1) + 1 for n > 0, T(0) = 0

then

T(n) = T(n-1) + 1 = T(n-2) + 2 = … = T(0) + n = n = Θ(n)
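The recurrence corresponds to one multiplication per level of recursion. A minimal Java sketch (the class and method names are ours, not from the notes):

```java
public class Factorial {
    // T(n) = T(n-1) + 1: one multiplication per recursive call, so Theta(n)
    public static long factorial(int n) {
        if (n == 0) return 1;          // base case: T(0) = 0
        return n * factorial(n - 1);   // recursive case: T(n) = T(n-1) + 1
    }
}
```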

Integer power

Analysis
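A standard divide-and-conquer sketch of integer power (the concrete code is ours): x^n = (x^⌊n/2⌋)², times an extra x when n is odd, giving T(n) = T(n/2) + O(1) = Θ(log n) multiplications.

```java
public class Power {
    // Divide-and-conquer: T(n) = T(n/2) + O(1), so Theta(log n) multiplications.
    // Assumes n >= 0.
    public static long power(long x, int n) {
        if (n == 0) return 1;
        long half = power(x, n / 2);                       // solve the half-size problem once
        return (n % 2 == 0) ? half * half : half * half * x;
    }
}
```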

GCD

do gcd(A,B) = gcd(B,A mod B)

until gcd(A,0) = A

gcd(70,25) = gcd(25,20) = gcd(20,5) = gcd(5,0) = 5
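The rule above is Euclid's algorithm; as a Java sketch (names ours):

```java
public class Gcd {
    // Euclid's algorithm: gcd(A, B) = gcd(B, A mod B), with gcd(A, 0) = A
    public static int gcd(int a, int b) {
        return (b == 0) ? a : gcd(b, a % b);
    }
}
```

The test below reproduces the worked example gcd(70, 25) = 5.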

Fibonacci sequence
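A sketch (names ours) contrasting the naive recursive definition, whose recurrence T(n) = T(n-1) + T(n-2) + 1 is exponential, with an iterative version that runs in Θ(n):

```java
public class Fib {
    // Naive recursion: recomputes the same subproblems, exponential time
    public static long fibRec(int n) {
        return (n < 2) ? n : fibRec(n - 1) + fibRec(n - 2);
    }

    // Iterative version: keeps only the last two values, Theta(n) time
    public static long fibIter(int n) {
        long a = 0, b = 1;
        for (int i = 0; i < n; i++) {
            long t = a + b;
            a = b;
            b = t;
        }
        return a;
    }
}
```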

Towers of Hanoi

   

hanoi(n, A, B, C) {
    if (n > 0) {
        hanoi(n-1, A, C, B);
        move(A, C);
        hanoi(n-1, B, A, C);
    }
}

Analysis

Cost
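For reference, the standard unrolling of the Hanoi cost recurrence (T(n) counts moves; this derivation is ours, in the same style as the Factorial analysis):

T(n) = 2T(n-1) + 1 for n > 0, T(0) = 0

then

T(n) = 2T(n-1) + 1 = 4T(n-2) + 3 = … = 2^n T(0) + (2^n - 1) = 2^n - 1 = Θ(2^n)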


Make friends with trees

Trees

Levels

Binary trees

Implementation

Binary Search Trees

Comparable interface

Search

Sets

Tree iterators

Keep trees balanced

Tree depth

Complete trees

Rotations

Step 1 - Start

Step 2 - Make A’s right child equal to C’s left child

Step 3 - Make C’s left child equal to A

Step 4 - Finish

Step 1 - Start

Step 2 - Make A’s left child equal to B’s right child

Step 3 - Make B’s right child equal to A

Step 4 - Finish

AVL trees

Performance

Red-black trees

  1. the inserted element is the root; in which case, it is recoloured black
  2. the inserted element’s parent is black; in which case, the new red node violates no rule and nothing needs to be done
  3. the inserted element’s parent is red and its uncle (the parent’s sibling) is red; in which case, the parent and uncle are recoloured black, the grandparent is recoloured red, and the fix-up is repeated from the grandparent
  4. the inserted element’s parent is red and its uncle is black (or non-existent); in which case, rotations are used to put the parent (or, after a first rotation, the inserted node itself) in the grandparent’s place, followed by recolouring

Performance

TreeSet

TreeMap

Sometimes it pays not to be binary

Multiway trees

B-Trees

B+ Trees

Tries (Prefix Trees)

Suffix trees

References

Introduction of B-Tree

Compressed Tries

Use heaps!

Heaps

Addition

Step 1 - We want to add the number ‘6’

Step 2 - Append it to the next available space on the tree

Step 3 - compare - 6 is smaller than its parent 13, so swap 6 and 13

Step 4 - compare - 6 is smaller than its parent 12, so swap 6 and 12. 6 is bigger than its parent 2, so we’re finished.

Removal

  1. pop the root, this is the value we want to return

Step 1 - we want to get the highest priority element (smallest number)

Step 2 - the root will always have the smallest number, so pop the root off

  2. replace the root with the last element in the heap

Step 3 - we need to replace the root’s value with something

Step 4 - pop off the last element and put it at the root

  3. percolate the new root down the tree until the heap constraint is met

Step 5 - compare - 14 is bigger than its child 5, so swap 14 and 5.

Note: always swap with the smaller child!

Step 6 - compare - 14 is bigger than its child 7, so swap 14 and 7.

Step 7 - compare - 14 is bigger than its child 7, so swap 14 and 7. 14 no longer has children, so we’re finished.

Array implementation

Implementation

import java.util.*;

public class HeapPQ<T> implements PQ<T>
{
    private List<T> list;

    public HeapPQ(int initialCapacity)
    {
        list = new ArrayList<T>(initialCapacity);
    }

    public int size() { return list.size(); }
    public boolean isEmpty() { return list.size() == 0; }
    public T getMin() { return list.get(0); }

    public void add(T element)
    {
        list.add(element);
        percolateUp();
    }

    private void percolateUp()
    {
        int child = list.size() - 1;        // index of the new element
        while (child > 0) {                 // not the root
            int parent = (child - 1) >> 1;  // floor((child-1)/2) via bitshift
            if (compare(child, parent) >= 0)
                break;                      // child value >= parent value: heap is fixed
            swap(parent, child);
            child = parent;                 // continue checking from the parent
        }
    }

    public T removeMin()
    {
        T minElem = list.get(0);                 // save min value
        list.set(0, list.get(list.size() - 1));  // replace the root by the last element
        list.remove(list.size() - 1);            // remove the last element
        percolateDown(0);                        // fix the heap property
        return minElem;
    }

    private void percolateDown(int parent)
    {
        int child = (parent << 1) + 1;           // 2*parent + 1
        while (child < list.size()) {            // left child exists
            if (child + 1 < list.size() && compare(child, child + 1) > 0)
                child++;                         // right child exists and is smaller
            if (compare(child, parent) >= 0)
                break;                           // smallest child is not below the parent
            swap(parent, child);
            parent = child;
            child = (parent << 1) + 1;           // continue percolating down
        }
    }

    private int compare(int el1, int el2) { return 0; /* trivial: compare list.get(el1) with list.get(el2) */ }
    private void swap(int el1, int el2) { /* trivial: swap the two list entries */ }
}

Time complexity

Priority queues

Heap Sort

  1. add all the elements to sort into a heap
  2. take out all of the elements from the heap

Standard Heap Sort

  1. start with a non-sorted array
  2. transform it into a max-heap without using additional storage
  3. sort the resulting array by repeatedly removing the maximum from the current heap

Other heaps


Make a hash of it

Hash functions

Collision resolution

Separate chaining

Open addressing/hashing

Linear probing

Quadratic probing

address = h(x) + dᵢ, where h(x) is the hash function and dᵢ = i²

Double hashing

address = h(x) + dᵢ, where h(x) is the hash function and dᵢ = i · h₂(x)

A good choice is h₂(x) = R - (x mod R), where R is a prime smaller than the table size. h₂(x) must never be a divisor of the table size, which is why the table size is usually chosen to be prime.

Removing items

1: We’ve got a normal hash table, with 4 keys.

2: Now we’ve deleted key 2, making that location ‘null’.

3: Now the hash table is trying to find key ‘3’. Key ‘3’ maps to the location of key ‘1’, but that’s taken, so it’s moving across to find key ‘3’.

4: Uh oh! It’s spotted a null location, so now it thinks key ‘3’ doesn’t exist because it thinks it’s the end of the cluster, when really it’s not the end and key ‘3’ exists!

Lazy remove

1: We’ve got a normal hash table, with 4 keys.

2: Now we’ve deleted key 2, so let’s flag that location.

3: Now the hash table is trying to find key ‘3’. Key ‘3’ maps to the location of key ‘1’, but that’s taken, so it’s moving across to find key ‘3’.

4: It’s spotted the flagged location! Since it knows that something used to be here, it knows that this isn’t the end of the cluster, so now it’ll continue through and find key ‘3’.
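The walkthrough above can be sketched as a linear-probing table with a tombstone flag for lazily removed slots. Everything below (the class name, the `TOMBSTONE` sentinel, keys as ints) is our own illustration; keys equal to `Integer.MIN_VALUE` are not supported in this sketch.

```java
public class LazyHashTable {
    private static final Integer TOMBSTONE = Integer.MIN_VALUE; // flag: "something used to be here"
    private final Integer[] table;

    public LazyHashTable(int capacity) { table = new Integer[capacity]; }

    private int hash(int key) { return Math.floorMod(key, table.length); }

    public void add(int key) {
        int i = hash(key);
        // Linear probing: step forward until a free or flagged slot is found
        while (table[i] != null && !table[i].equals(TOMBSTONE)) i = (i + 1) % table.length;
        table[i] = key;
    }

    public boolean contains(int key) {
        int i = hash(key);
        // A tombstone does NOT end the cluster, so we probe straight past it
        while (table[i] != null) {
            if (table[i].equals(key)) return true;
            i = (i + 1) % table.length;
        }
        return false; // a genuinely empty slot ends the search
    }

    public void remove(int key) {
        int i = hash(key);
        while (table[i] != null) {
            if (table[i].equals(key)) { table[i] = TOMBSTONE; return; } // lazy remove
            i = (i + 1) % table.length;
        }
    }
}
```

In the test, three keys collide into one cluster; removing the middle one must not hide the key stored after it.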


The Union-Find Problem

Example: dynamic connectivity

Equivalence classes, relations

Union-Find

Quick-Find

Index  0  1  2  3  4
ID     1  1  2  3  4

After a union joining the components with IDs 1 and 2 (every entry whose ID is 1 is rewritten to 2):

Index  0  1  2  3  4
ID     2  2  2  3  4

Implementation

Quick-Union

Here the ID array stores each element’s parent, so 1 and 2 point at the root 0:

Index  0  1  2  3  4
ID     0  0  0  3  4

After a union joining the trees rooted at 0 and 3 (root 0 is simply attached under 3):

Index  0  1  2  3  4
ID     3  0  0  3  4

Implementation

Improvements

1. Weighted Quick-Union

2. Path compression

Use them together
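A sketch combining both improvements, weighted union plus path compression (here the halving variant); the class layout and names are ours:

```java
public class UnionFind {
    private final int[] parent;
    private final int[] size; // number of elements in each root's tree

    public UnionFind(int n) {
        parent = new int[n];
        size = new int[n];
        for (int i = 0; i < n; i++) { parent[i] = i; size[i] = 1; }
    }

    public int find(int x) {
        while (parent[x] != x) {
            parent[x] = parent[parent[x]]; // path compression: point at the grandparent
            x = parent[x];
        }
        return x;
    }

    public boolean connected(int a, int b) { return find(a) == find(b); }

    public void union(int a, int b) {
        int ra = find(a), rb = find(b);
        if (ra == rb) return;
        // Weighted: attach the smaller tree under the larger root to keep trees shallow
        if (size[ra] < size[rb]) { parent[ra] = rb; size[rb] += size[ra]; }
        else { parent[rb] = ra; size[ra] += size[rb]; }
    }
}
```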


Analyse!

Algorithm Analysis

Search

Sequential search

Step 1 - We want to search for ‘68’

10,78,23,68,33

Step 2 - Start at the beginning. Is this 68?

10,78,23,68,33

Step 3 - That’s not 68, so we move onto the next. Is this 68?

10,78,23,68,33

Step 4 - That’s not 68, so we move onto the next again. Is this 68?

10,78,23,68,33

Step 5 - Yet again, that’s not 68. Is the next one 68?

10,78,23,68,33

Step 6 - This is 68, so now we’ve found our element.

10,78,23,68,33

Binary search

Step 1 - We need to find the element ‘13’

5,13,29,34,58,60

Step 2 - We make the first and last elements ‘head’ and ‘tail’.

5,13,29,34,58,60

Step 3 - Now we take the midpoint. Is that less than, more than or equal to our desired element?

5,13,29,34,58,60

Step 4 - It’s more than our desired element, so we set that element as our tail pointer, effectively halving our list.

5,13,29,34,58,60

Step 5 - Now we take the midpoint again. Is this less than, more than or equal to our desired element?

5,13,29,34,58,60

Step 6 - This element is our desired element! We have now found it using binary search.

5,13,29,34,58,60
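The walkthrough above, as code (names ours); the test replays the same sorted list and target:

```java
public class BinarySearch {
    // Returns the index of target in a sorted array, or -1 if absent
    public static int search(int[] a, int target) {
        int lo = 0, hi = a.length - 1;      // the "head" and "tail" pointers
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   // midpoint, written to avoid int overflow
            if (a[mid] == target) return mid;
            else if (a[mid] < target) lo = mid + 1; // discard the left half
            else hi = mid - 1;                      // discard the right half
        }
        return -1;
    }
}
```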


Simple Sort

Bubble Sort

Step 1 - We need to sort this list.

7,3,9,5

Step 2 - Can we flip 3 and 7? Yes, so flip them.

3, 7, 9, 5

Step 3 - Can we flip 7 and 9? No, so leave it.

3, 7, 9, 5

Step 4 - Can we flip 9 and 5? Yes, so flip.

3, 7, 5, 9

Step 5 - There was a swap back there, so we do it again. Flip 3 and 7? No.

3, 7, 5, 9

Step 6 - Flip 7 and 5? Yes

3, 5, 7, 9

Step 7 - Flip 7 and 9? No.

3, 5, 7, 9

Step 8 - There was a flip. Do it again. Flip 3 and 5? No

3, 5, 7, 9

Step 9 - Flip 5 and 7? No

3, 5, 7, 9

Step 10 - Flip 7 and 9? No

3, 5, 7, 9

Step 11 - There were no flips back there, so we’re done

3, 5, 7, 9


Insertion Sort

Step 1 - We need to sort this list.

7,3,9,5

Step 2 - We create the sorted list on the far left, with the first element being the only member of that list.

7,3,9,5

Step 3 - Now we move onto 3, which needs to be added to the sorted list. It is less than 7, so it goes before 7.

3,7,9,5

Step 4 - Now we move onto 9. 9 is bigger than 3 and 7, so it is appended to the end.

3,7,9,5

Step 5 - Now we move onto 5. 5 is smaller than 7 and 9, but bigger than 3, so it goes in-between those. The unsorted list is now empty, so the list has now been sorted.

3,5,7,9
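The steps above can be sketched in Java (class name ours): the sorted prefix grows from the left, and each new element is shifted backwards until it meets a smaller or equal one.

```java
public class InsertionSort {
    public static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];     // next element of the unsorted part
            int j = i - 1;
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j]; // shift larger elements one slot right
                j--;
            }
            a[j + 1] = key;      // drop key into its place in the sorted prefix
        }
    }
}
```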


Selection Sort

Step 1 - We need to sort this list.

7,3,9,5

Step 2 - We iterate through the list and find that ‘3’ is the smallest element. We append this to the sorted list.

7,3,9,5

Step 3 - We iterate through the list again and find that ‘5’ is the next smallest element.

3,7,9,5

Step 4 - The next smallest element is ‘7’.

3,5,7,9

Step 5 - The next smallest element is ‘9’.

3,5,7,9

Step 6 - The unsorted list is now empty, so we have now sorted the list.

3,5,7,9

Lower Bound

Sort wisely

Insertion vs Selection vs Bubble Sort

QuickSort vs Merge Sort

In-place and Stable

In-place

Stable

Merge sort

Step 1: Split up the list into singleton groups

6, 3, 2, 9, 5

Step 2: Compare the first two elements, and concatenate them.

3, 6, 2, 9, 5

Step 3: Compare the next two elements, and concatenate them.

3, 6, 2, 9, 5

Step 4: Keep grouping together the groups until you get one big group.

3, 6, 2, 5, 9

Step 5: Keep grouping together the groups until you get one big group.

2, 3, 5, 6, 9



Merge Sort with Insertion Sort

Quick sort

Step 1: Pick a pivot; here we use the median. But how do you get the median if the list isn’t sorted yet? Typically, quicksort approximates it as the median of the left-most, right-most and middle elements of the list: in this case, the median of ‘6’, ‘5’ and ‘2’.

6, 3, 2, 9, 5

Step 2: Move smaller elements to the left and bigger elements to the right.

3, 2, 5, 9, 6

Step 3: Split elements to the left and right of the pivot into sublists and perform the same operation again.

3, 2, 5, 9, 6

Step 4: Starting with the left sublist, pick a pivot.

3, 2, 5, 9, 6

Step 5: Move elements to the right and left of pivot.

2, 3, 5, 9, 6

Step 6: Split into sublists. This is a singleton list, so we can leave it.

2, 3, 5, 9, 6

Step 7: Moving onto the other sublist we left behind, pick a pivot.

2, 3, 5, 9, 6

Step 8: Move elements around pivot

2, 3, 5, 6, 9

Step 9: Split into sublists. This is a singleton, so we can leave it.

2, 3, 5, 6, 9

Step 10: We’ve gone through every sublist, so now, we’re finished.

2, 3, 5, 6, 9

Quicksort with Insertion Sort

Shuffling

Knuth/Fisher-Yates Shuffle
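The Knuth/Fisher-Yates shuffle walks from the back of the array, swapping each position with a uniformly random position at or before it, which makes every permutation equally likely. A sketch (names ours):

```java
import java.util.Random;

public class Shuffle {
    public static void shuffle(int[] a, Random rng) {
        for (int i = a.length - 1; i > 0; i--) {
            int j = rng.nextInt(i + 1); // uniform in [0, i], inclusive
            int t = a[i]; a[i] = a[j]; a[j] = t;
        }
    }
}
```

The test checks the invariant that matters: shuffling permutes the array but never changes its contents.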

Partitioning

Lomuto scheme
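A sketch of the Lomuto scheme (names ours): the pivot is the last element, and the partition returns its final index, with smaller-or-equal elements to its left and larger ones to its right.

```java
public class Lomuto {
    public static int partition(int[] a, int lo, int hi) {
        int pivot = a[hi]; // Lomuto: pivot is the last element
        int i = lo;        // next slot for an element <= pivot
        for (int j = lo; j < hi; j++) {
            if (a[j] <= pivot) {
                int t = a[i]; a[i] = a[j]; a[j] = t;
                i++;
            }
        }
        int t = a[i]; a[i] = a[hi]; a[hi] = t; // put the pivot in its final place
        return i;
    }
}
```

The test uses the quicksort example list 6, 3, 2, 9, 5 with pivot 5, which ends up at index 2.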

Hoare scheme

Multi-pivoting

Dual-pivoting

3-pivot

Radix Sort

Step 1: Look at the least significant digits (units) and add each element of the list to a bucket according to that digit.

List: 856, 334, 629, 257, 035, 211, 560, 146
Buckets: 0: [560]  1: [211]  2: []  3: []  4: [334]  5: [035]  6: [856, 146]  7: [257]  8: []  9: [629]

Step 2: Iterate through the buckets, starting from 0 and going to 9, removing all the elements from each bucket and pushing them onto a new list. It’s not sorted yet, but as you can see, it is now sorted by units.

List: 560, 211, 334, 035, 856, 146, 257, 629
Buckets: all empty

Step 3: Do the same thing as step 1, but for the next most significant digit (tens).

List: 560, 211, 334, 035, 856, 146, 257, 629
Buckets: 0: []  1: [211]  2: [629]  3: [334, 035]  4: [146]  5: [856, 257]  6: [560]  7: []  8: []  9: []

Step 4: Do the same thing as step 2: go through the buckets and push all the elements into a list.

List: 211, 629, 334, 035, 146, 856, 257, 560
Buckets: all empty

Step 5: Do the same thing as steps 1 and 3, but with the next most significant digit (hundreds).

List: 211, 629, 334, 035, 146, 856, 257, 560
Buckets: 0: [035]  1: [146]  2: [211, 257]  3: [334]  4: []  5: [560]  6: [629]  7: []  8: [856]  9: []

Step 6: After you push all the elements out of the buckets, you’ll have a sorted list. That’s because the hundreds digit is the most significant digit of the biggest number in the list.

List: 035, 146, 211, 257, 334, 560, 629, 856

Radix Sort with Counting Sort

Counting Sort
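A sketch combining the two ideas above: LSD radix sort where each digit pass is a counting sort (histogram, prefix sums, then a stable right-to-left placement). The class name is ours; it assumes non-negative ints.

```java
public class RadixSort {
    // LSD radix sort, base 10, with a counting sort on each digit
    public static void sort(int[] a) {
        int max = 0;
        for (int x : a) max = Math.max(max, x);
        int[] out = new int[a.length];
        for (int exp = 1; max / exp > 0; exp *= 10) {
            int[] count = new int[10];
            for (int x : a) count[(x / exp) % 10]++;       // histogram of the current digit
            for (int d = 1; d < 10; d++) count[d] += count[d - 1]; // prefix sums = end positions
            for (int i = a.length - 1; i >= 0; i--)        // right-to-left keeps the pass stable
                out[--count[(a[i] / exp) % 10]] = a[i];
            System.arraycopy(out, 0, a, 0, a.length);      // result of this pass back into a
        }
    }
}
```

The test replays the worked example list (035 written as 35 in Java).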


Think graphically

Graph Theory

Graphs

Connected graphs

Paths, walks and cycles

Trees

Planar graphs

Simple graphs

Terminology & theorems

A graph is G = (V, E), where V is the set of vertices and E = {e₁, …, eₖ, …, eₘ} is the set of edges

Q: Is a square graph (a 4-cycle) bipartite? Yes: alternate the two colours around the cycle.

Vertex degree

Handshaking Lemma: ∑i d(vi) = 2|E|

i.e., the sum of the degrees of the vertices is always twice the number of edges.

Why? Because every edge is counted twice: once for each of its two endpoints.

⇒ the number of odd-degree vertices in a graph is always even

⇒ when k is odd, every k-regular graph must have an even number of vertices

Trees

Applications of Graph Theory

Implementations

Adjacency matrix

Adjacency list

Graph traversal

Breadth First Search (BFS)

Input: A graph G and a starting vertex root of G

Output: Goal state. The parent links trace the shortest path back to root

procedure BFS(G, root) is

    let Q be a queue

    label root as discovered

    Q.enqueue(root)

    while Q is not empty do

        v := Q.dequeue()

        if v is the goal then

            return v

        for all edges from v to w in G.adjacentEdges(v) do

            if w is not labeled as discovered then

                label w as discovered

                Q.enqueue(w)
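The pseudocode above can be sketched in Java over adjacency lists (names ours); this variant returns the visit order instead of stopping at a goal:

```java
import java.util.*;

public class Bfs {
    // adj.get(v) lists the neighbours of vertex v; returns vertices in BFS order from root
    public static List<Integer> bfs(List<List<Integer>> adj, int root) {
        boolean[] discovered = new boolean[adj.size()];
        List<Integer> order = new ArrayList<>();
        Queue<Integer> q = new ArrayDeque<>();
        discovered[root] = true;  // label root as discovered
        q.add(root);
        while (!q.isEmpty()) {
            int v = q.remove();
            order.add(v);
            for (int w : adj.get(v)) {
                if (!discovered[w]) { // label w before enqueueing, so it is queued once
                    discovered[w] = true;
                    q.add(w);
                }
            }
        }
        return order;
    }
}
```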

Depth First Search (DFS)

Input: A graph G and a vertex v of G

Output: All vertices reachable from v labeled as discovered

recursive:

procedure DFS(G, v) is

    label v as discovered

    for all directed edges from v to w that are in G.adjacentEdges(v) do

        if vertex w is not labeled as discovered then

            recursively call DFS(G, w)

iterative:

procedure DFS_iterative(G, v) is

    let S be a stack

    S.push(v)

    while S is not empty do

        v = S.pop()

        if v is not labeled as discovered then

            label v as discovered

            for all edges from v to w in G.adjacentEdges(v) do 

                S.push(w)

Walk enumeration

Number of  △s in an undirected graph

  1. Write down the adjacency matrix A
  2. Calculate A3
  3. The number of triangles in Undirected Graph = trace(A3) / 6

*A triangle is counted once for each of its three vertices, so we divide the trace by 3; and since the graph is undirected, each triangle is also counted in both directions (as 1-2-3-1 and 1-3-2-1), so we divide by 2 as well, giving the factor of 6.

Degree matrix

Def: D = (aᵢⱼ), where aᵢᵢ = dᵢ (the degree of vertex vᵢ) and aᵢⱼ = 0 elsewhere

Laplacian matrix

Def: L = D – A. I.e., the difference between the Degree matrix and the Adjacency matrix

Famous Graph Theory problems

Shortest path problems

Minimum spanning tree problems

Graph partitioning

Graph isomorphism

Vertex cover

Graph colouring

Eulerian graphs

Determination

Fleury’s algorithm

To find an Euler path or an Euler circuit:

  1. Make sure the graph has either 0 or 2 odd vertices.
  2. If there are 0 odd vertices, start anywhere. If there are 2 odd vertices, start at one of them.
  3. Follow edges one at a time. If you have a choice between a bridge and a non-bridge, always choose the non-bridge (bridge - removing a single edge from a connected graph that makes it disconnected).
  4. Stop when you run out of edges.

The Seven Bridges of Königsberg problem

Can you walk through the city crossing each of those bridges once and only once?

Hamiltonian graphs

Learn to traverse graphs

Depth First Search

Breadth First Search


Graph (greedy) algorithms

Minimum Spanning Tree

Prim’s Algorithm

Step 1: Pick an arbitrary node

Step 2: The only arc going from our tree T to an unconnected node is ‘A → B’, so we add that one.

Step 3: The two possible arcs to add are B → D and B → C. Neither arc breaks the conditions for a spanning tree, so we go for the one that has the smallest weight: B → C.

Step 4: The possible arcs are B → D, C → D and C → E. Yet again, none of them break the conditions for a spanning tree, so we go for C → D as it has the smallest weight.

Step 5: Now, the possible arcs are B → D, D → E and C → E. Notice how we cannot pick B → D, because that would break the conditions for a spanning tree (acyclic). That leaves D → E and C → E. These do not break any conditions, so we pick D → E based on weight.

Step 6: All nodes are now in our spanning tree, so we can stop.
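A sketch of Prim's algorithm with a priority queue of arcs crossing from the tree to unconnected nodes (the lazy-deletion style and all names are ours; the notes' example graph is not reproduced here):

```java
import java.util.*;

public class Prim {
    // adj.get(v) holds arcs {to, weight}; returns the total weight of the MST
    // of the component containing start (graph assumed undirected)
    public static int mstWeight(List<List<int[]>> adj, int start) {
        boolean[] inTree = new boolean[adj.size()];
        PriorityQueue<int[]> pq =
            new PriorityQueue<>(Comparator.comparingInt((int[] e) -> e[1]));
        pq.add(new int[]{start, 0});
        int total = 0;
        while (!pq.isEmpty()) {
            int[] e = pq.remove();
            int v = e[0];
            if (inTree[v]) continue; // arc into the tree would form a cycle: skip it
            inTree[v] = true;
            total += e[1];           // cheapest arc crossing into the tree
            for (int[] next : adj.get(v))
                if (!inTree[next[0]]) pq.add(next);
        }
        return total;
    }
}
```

The test uses a triangle with weights 1, 2, 3, whose MST keeps the two cheapest edges.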



Kruskal’s Algorithm

Step 1: The smallest arc that still keeps the conditions of a spanning tree is C → D.

Step 2: The next smallest arc is B → C.

Step 3: We can no longer pick B → D because that would break the conditions for a spanning tree. However, we now have a choice between B → A and D → E. We can pick any, because they’re the same weight. We’ll pick B → A.

Step 4: Now, the smallest arc is D → E. The reason why we didn’t just add B → A and D → E at the same time in Step 3 is because adding B → A could’ve made the arc D → E break a condition for the spanning tree. It didn’t, so we can add it now.

Step 5: Now that all of the nodes have been added, we can stop.


Shortest Path

Dijkstra’s Algorithm

Step 1: We want to find the shortest path from A to F. Each node stores a value and the node before it.

Step 2: We start at A, because A is the start node. A is adjacent to C and B, so we store the weights from A to B and A to C in B and C respectively, and we store the node ‘A’ into ‘B’ and ‘C’.

Now we’re finished with A, and we can ‘out’ it. By ‘out’ing the node ‘A’, we lock it so we cannot change its value.

Step 3: Now, which node on the right has the smallest value and is not out? (It’s B.)

The reason we pick it is that the node with the smallest non-out value is the one whose distance from the start can no longer improve, so it is the next node on our shortest path.

Anyway, now we’ve selected B. B can now be appended to the shortest path, making it ‘AB’. Additionally, we can ‘out’ B.

The adjacent nodes are C and D. To find the new value of C, we add the weight of the edge connecting our selected node and C to our current value. The current value is 3, and the edge connecting B and C has weight 2, so the candidate value for C is 5. C’s value is already 5, so we don’t change it, but if the candidate were smaller, we would. Looking at D, the edge weight to D is 3 and our current value is 3, so the value for D becomes 6.

Step 4: The next node is now C, as it is the smallest non-out node.

C has four adjacent nodes: A, B, D and E. We ignore A and B because they’re both ‘out’.

The new value of D will be 5 + 2 = 7, but the existing value of D is 6, which is smaller, so we don’t change it.

The new value of E will be 5 + 5 = 10, and E is currently infinity, so we change E’s value to 10.

Step 5: The next node is D.

The adjacent nodes of D are B, C, F and E. B and C are out, so we’re only interested in F and E.

The weight from D to F is 6, so the new value for F will be 6 + 6 = 12.

The weight from D to E is 4, so the new value for E would be 6 + 4 = 10. Since 10 is not smaller than E’s current value of 10, we leave it unchanged.

Step 6: The next node is E.

The only adjacent node we’re interested in is F, because the rest are ‘out’.

The weight from E to F is 1, so the new value for F is 10 + 1 which is 11. This is smaller than F’s current value, so we update it.

Step 7: The next node is F.

Since we’ve now reached our goal, we can stop this process and start backtracking.

We start at F, and we can see we went from E to F. Then we go to E, and we can see we went from C to E, and we keep going like that until we reach the start node.

Shortest path from A to F is ACEF with a weight of 11.
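A sketch of Dijkstra's algorithm with a priority queue (names ours). The test rebuilds the example graph with the edge weights implied by the walkthrough (A-B 3, A-C 5, B-C 2, B-D 3, C-D 2, C-E 5, D-E 4, D-F 6, E-F 1, with A…F numbered 0…5), and checks that F ends at 11.

```java
import java.util.*;

public class Dijkstra {
    // adj.get(v) holds arcs {to, weight}; returns the distance array from source
    public static int[] shortestPaths(List<List<int[]>> adj, int source) {
        int n = adj.size();
        int[] dist = new int[n];
        Arrays.fill(dist, Integer.MAX_VALUE); // "infinity" for unreached nodes
        dist[source] = 0;
        boolean[] out = new boolean[n];       // "outed" (locked) nodes
        PriorityQueue<int[]> pq =
            new PriorityQueue<>(Comparator.comparingInt((int[] e) -> e[1]));
        pq.add(new int[]{source, 0});
        while (!pq.isEmpty()) {
            int v = pq.remove()[0];           // smallest non-out value
            if (out[v]) continue;             // stale queue entry
            out[v] = true;
            for (int[] e : adj.get(v)) {
                int w = e[0], newDist = dist[v] + e[1];
                if (newDist < dist[w]) {      // relax: found a shorter route to w
                    dist[w] = newDist;
                    pq.add(new int[]{w, newDist});
                }
            }
        }
        return dist;
    }
}
```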


Bellman-Ford Algorithm

Shortest Path on DAGs


Dynamic programming

Dynamic Programming

Principle of Optimality

Applications

Dijkstra’s Algorithm

Proof that shortest path problems have principle of optimality

Edit Distance

Limitation


Know how to search

Search Trees

Backtracking


Branch and Bound

Search in AI


Settle for good solutions

Heuristic Search

Constructive algorithms

Neighbourhood search

  1. Start with a solution; any solution
  2. Examine the neighbouring solutions
  3. Move to a neighbour if it is better than the one we’re on
  4. Repeat from step 2 until some stopping criterion is met

Simulated Annealing

Big β, low temperature

Small β, high temperature

Evolutionary Algorithms

  1. Initialise your population
  2. Evaluate fitness/cost
  3. Select new population based on fitness/cost
  4. Mutate members of population
  5. Crossover members of population
  6. Keep repeating steps 2 - 5 until satisfied
  7. Return best member of population

Before

After

Single-point crossover

Multi-point crossover

Uniform crossover

Bit simulated crossover

Galinier and Hao’s crossover operator

A worked example on vertices 1-9:

Vertex:    1  2  3  4  5  6  7  8  9
Parent 1:  R  B  B  G  G  G  G  R  R
Parent 2:  G  B  G  G  G  B  B  R  R

As colour classes:

       Parent 1     Parent 2
R      1, 8, 9      8, 9
G      4, 5, 6, 7   1, 3, 4, 5
B      2, 3         2, 6, 7

Step 1: take the largest class of parent 1: G = {4, 5, 6, 7}. The child colours those vertices with its first colour (R), and vertices 4-7 are removed from both parents:

       Parent 1     Parent 2
R      1, 8, 9      8, 9
G      -            1, 3
B      2, 3         2

Vertex:    1  2  3  4  5  6  7  8  9
Child:     .  .  .  R  R  R  R  .  .

Step 2: now take the largest remaining class of parent 2: R = {8, 9}. The child colours vertices 8 and 9 with its second colour (G), and they are removed from both parents:

       Parent 1     Parent 2
R      1            -
G      -            1, 3
B      2, 3         2

Vertex:    1  2  3  4  5  6  7  8  9
Child:     .  .  .  R  R  R  R  G  G

Step 3: back to parent 1, whose largest remaining class is B = {2, 3}; the child colours vertices 2 and 3 with its third colour (B):

       Parent 1     Parent 2
R      1            -
G      -            1
B      -            -

Vertex:    1  2  3  4  5  6  7  8  9
Child:     .  B  B  R  R  R  R  G  G

Step 4: any leftover vertices (here, vertex 1) are assigned a colour at random:

Vertex:    1  2  3  4  5  6  7  8  9
Child:     G  B  B  R  R  R  R  G  G

Tabu search


Use linear programming

What is linear programming

The three conditions

  1. A linear objective function to maximise/minimise
  2. Non-negativity constraints (e.g. x1 ≥ 0)
  3. Additional linear constraints (e.g. 2x + 5y ≤ 56)

Example

Maximise function

Constraint 1

Constraint 2

Constraint 3

Solve the problem

  1. Both x and y must be integers at that point (because x and y can’t be fractions in the solution; if a problem does allow fractions, this can be ignored)
  2. The very next point where condition 1 holds must lie outside the feasible (unshaded) area; in other words, take the feasible point as far to the right as possible

Direction that minimises the cost and the constraints

Normal form


Solving linear programs

Simplex Method

Basic Feasible Solutions (not needed for exam, but interesting)