Lecture 39 - Final Exam Review
Welcome to Final Review Day!
You've learned an enormous amount this semester! Today we'll:
- Name (and sometimes review) ALL key concepts from the entire term
- Practice with quiz questions
- Focus on post-Midterm 2 material (algorithms & data structures)
- Build confidence for the final
Reminders about the final exam
- Tuesday December 17, 12:00-2:00pm
Exam format as it stands (subject to change):
- Fill-in questions: 40 pts (half Rust, half DS&A)
- Code tracing: 10 pts
- Shell/git: 8 pts
- Stack-heap: 10 pts
- Hand-coding: 18 pts (3 short problems)
- Complexity analysis: 16 pts
- Algorithm tracing: 28 pts
Total: 110 points
Tools & Basics (L2-12)
This material was covered on Midterms 1 and 2. We'll do a condensed review.
Shell commands you should know:
`pwd`, `ls`, `ls -la`, `cd`, `mkdir`, `rm`
Git workflow:
`git clone`, `git status`, `git log`, `git add .`, `git commit -m "..."`, `git push`, `git pull`
Cargo commands:
`cargo new`, `cargo run`, `cargo test`, `cargo check`
Rust basics:
- Variables: `let x = 5` (immutable), `let mut x = 5` (mutable)
- Types: `i32`, `f64`, `bool`, `char`, `&str`, `String`
- Function signatures and return types
- Expressions vs statements
- Control flow: `if/else`, `for`, `while`, `loop`
- Enums and pattern matching: `match`, `Option<T>`, `Result<T, E>`
- Error handling with `panic!`, `?`, and `Option`/`Result`
Memory & Ownership (L14-18)
Stack vs Heap:
- Stack: fixed size, fast, local variables
- Heap: dynamic size, slower, for `String`, `Vec`, `Box`
Ownership rules:
- Each value has one owner
- When owner goes out of scope, value is dropped
- Stack types copy, heap types move by default (generally)
Borrowing:
- `&T` - immutable reference (many allowed)
- `&mut T` - mutable reference (only one, no other borrows)
Strings:
- `String` - owned, growable
- `&str` - borrowed slice, e.g. `let y = &text[0..3]`
- UTF-8 encoded (can't index with `text[0]`)
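A small sketch pulling these together - the variable names (`v`, `text`, etc.) are just illustrative:

```rust
fn main() {
    // Borrowing: many shared references are fine, as long as no &mut overlaps.
    let v = vec![10, 20, 30];
    let first = &v[0];        // &i32 - immutable borrow
    println!("first = {}", first);

    let mut count = 0;
    let c = &mut count;       // only one &mut at a time
    *c += 1;

    // String vs &str: owned vs borrowed slice.
    let text = String::from("hello");
    let y: &str = &text[0..3]; // "hel" - byte range, must fall on char boundaries
    println!("{} {}", y, count);
}
```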
Collections & Advanced Rust (L19-26)
HashMap and HashSet:
- HashMap for key-value pairs (keys must implement the `Hash` and `Eq` traits)
- HashSet for unique values
- Hash functions: deterministic, fast, uniform distribution, hard to invert
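A quick usage sketch (the map contents are just illustrative):

```rust
use std::collections::{HashMap, HashSet};

fn main() {
    // HashMap: key-value pairs, average O(1) insert/lookup.
    let mut counts: HashMap<String, i32> = HashMap::new();
    counts.insert(String::from("apple"), 3);
    *counts.entry(String::from("apple")).or_insert(0) += 1; // now 4
    println!("{:?}", counts.get("apple")); // Some(4)

    // HashSet: unique values only.
    let mut seen: HashSet<i32> = HashSet::new();
    seen.insert(7);
    seen.insert(7); // duplicate is ignored
    println!("{}", seen.len()); // 1
}
```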
Structs:
- Define custom types with named fields
- Methods: `&self` (read), `&mut self` (modify), `self` (consume)
Generics & Traits:
- `<T>` for generic types
- Trait bounds: `T: Clone`, `T: PartialOrd`
- Common traits: `Debug`, `Clone`, `Copy`, `PartialEq`, `PartialOrd`, `Ord`
- `#[derive(...)]` auto-generates trait implementations
Lifetimes:
- `'a` syntax for lifetime annotations
- Needed when a function takes multiple reference inputs and returns a reference
- `'static` means "lives for the entire program"
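A short sketch combining structs, methods, derives, and a trait-bounded generic function (the `Point` type and `largest` function are just illustrative examples, not from lecture):

```rust
#[derive(Debug, Clone, PartialEq)]
struct Point {
    x: f64,
    y: f64,
}

impl Point {
    fn magnitude(&self) -> f64 {          // &self: read-only method
        (self.x * self.x + self.y * self.y).sqrt()
    }
    fn scale(&mut self, factor: f64) {    // &mut self: modifies the struct
        self.x *= factor;
        self.y *= factor;
    }
}

// Generic function with a trait bound: works for any T that can be compared.
fn largest<T: PartialOrd>(a: T, b: T) -> T {
    if a > b { a } else { b }
}

fn main() {
    let mut p = Point { x: 3.0, y: 4.0 };
    p.scale(2.0);
    println!("{:?} has magnitude {}", p, p.magnitude()); // Debug printing via {:?}
    println!("{}", largest(3, 7)); // 7
}
```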
Quick Questions: Rust Fundamentals
Question 1
You've made changes to several files and want to commit them. What's the correct sequence of git commands?
- A) `git commit -m "message"` -> `git add .` -> `git push`
- B) `git add .` -> `git commit -m "message"` -> `git push`
- C) `git push` -> `git add .` -> `git commit -m "message"`
- D) `git commit -m "message"` -> `git push` -> `git add .`
Question 2
Will this compile? Why or why not?
```rust
fn main() {
    let mut v = vec![1, 2, 3];
    let r = &v;
    v.push(4);
    println!("{:?}", r);
}
```
Question 3
What trait do you need to derive to print a struct with {:?}?
- A) Display
- B) Debug
- C) Print
- D) Clone
Packages & Testing (L27-28)
Modules & Packages:
- `mod` keyword declares modules
- `pub` makes items public
- `use` brings items into scope
- Cargo workspaces for multi-package projects
Testing:
- `#[test]` marks test functions
- `assert!`, `assert_eq!`, `assert_ne!` for testing
- `cargo test` runs tests
- `#[should_panic]` for tests that should panic
You don't need to know the finicky details of pub etc. in nested structures
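A minimal sketch of how tests might look in a library file (the `double` function is just an example):

```rust
pub fn double(x: i32) -> i32 {
    x * 2
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn doubles_positive_numbers() {
        assert_eq!(double(3), 6);
    }

    #[test]
    #[should_panic]
    fn panics_on_bad_index() {
        let v = vec![1, 2, 3];
        let _ = v[10]; // out-of-bounds indexing panics
    }
}
```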
Iterators & Closures (L29)
Iterators:
- `.iter()` - borrows elements
- `.into_iter()` - takes ownership
- `.iter_mut()` - mutably borrows
- Iterator methods: `map`, `filter`, `collect`, `sum`, `count`, `enumerate`
Closures:
- Anonymous functions: `|x| x + 1`
- Can capture their environment
- Used with iterator methods
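A short sketch of closures and a typical iterator chain (the values are just illustrative):

```rust
fn main() {
    let nums = vec![1, 2, 3, 4, 5];

    // A closure: an anonymous function, often stored in a variable or passed inline.
    let add_one = |x: i32| x + 1;
    println!("{}", add_one(4)); // 5

    // Typical chain: borrow, filter, transform, collect into a new Vec.
    let doubled_evens: Vec<i32> = nums
        .iter()                    // yields &i32
        .filter(|&&x| x % 2 == 0)  // keep the even values (2 and 4)
        .map(|&x| x * 2)           // double each one
        .collect();
    println!("{:?}", doubled_evens); // [4, 8]

    // count is handy for "how many match?" questions
    let n = nums.iter().filter(|&&x| x > 2).count();
    println!("{}", n); // 3
}
```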
Concurrency - not on the exam!
Quick Questions: Advanced Rust
Question 4
What does this iterator chain return?
```rust
vec![1, 2, 3, 4, 5]
    .iter()
    .filter(|&x| x % 2 == 0)
    .map(|x| x * 2)
    .collect::<Vec<_>>()
```
- A) `[2, 6, 10]`
- B) `[4, 8]`
- C) `[2, 4]`
- D) `[4]`
Question 5
Fill in the blanks with the correct Rust keywords (mod, pub, use):
```rust
// Declare a new module called 'utils'
_____ utils;

// Make this function accessible from outside the module
_____ fn helper() { }

// Bring HashMap into scope
_____ std::collections::HashMap;
```
Big O Notation (L31)
Common complexities:
- O(1) - Constant time (array access, hash lookup)
- O(log n) - Logarithmic (binary search, balanced tree operations)
- O(n) - Linear (loop through array once)
- O(n log n) - Linearithmic (merge sort, good general-purpose sorting)
- O(n^2) - Quadratic (nested loops, bubble sort)
- O(2^n) - Exponential (recursive Fibonacci, bad!)
Rules:
- Drop constants: 2n -> O(n)
- Take the worst term: n^2 + n -> O(n^2)
- Analyze the worst case (unless we say otherwise)
- Complexity of nested loops gets multiplied; sequential loops get added
Space complexity:
- How much extra memory does algorithm use?
- Same notation: O(1), O(n), O(log n), etc.
Sorting Algorithms (L32)
| Algorithm | Best | Average | Worst | Space | Stable? |
|---|---|---|---|---|---|
| Selection Sort | O(n^2) | O(n^2) | O(n^2) | O(1) | No |
| Bubble Sort | O(n) | O(n^2) | O(n^2) | O(1) | Yes |
| Insertion Sort | O(n) | O(n^2) | O(n^2) | O(1) | Yes |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes |
| Quick Sort | O(n log n) | O(n log n) | O(n^2) | O(n) | No |
Stability: If two elements are equal, do they stay in original order?
When to use what:
- Small data or nearly sorted: Insertion sort
- Need guaranteed O(n log n): Merge sort
- Average case and in-place: Quick sort
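As a refresher, here is one possible insertion sort sketch (illustrative only; your exam answer can differ). Note that the inner loop does no work on already-sorted input, which is where the O(n) best case comes from:

```rust
// Sorts in place by repeatedly inserting arr[i] into the sorted prefix arr[0..i].
fn insertion_sort(arr: &mut [i32]) {
    for i in 1..arr.len() {
        let mut j = i;
        // Shift the new element left until it reaches its sorted position.
        while j > 0 && arr[j - 1] > arr[j] {
            arr.swap(j - 1, j);
            j -= 1;
        }
    }
}

fn main() {
    let mut v = vec![5, 2, 4, 2, 1];
    insertion_sort(&mut v);
    println!("{:?}", v); // [1, 2, 2, 4, 5]
}
```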
Quick Questions: Big O & Sorting
Question 6
What's the time complexity of merge sort on an array that's already sorted of size n?
- A) O(1)
- B) O(log n)
- C) O(n)
- D) O(n log n)
Question 7
Which sorting algorithm has O(n^2) worst case but O(n log n) average case?
- A) Merge sort
- B) Quick sort
- C) Insertion sort
- D) Bubble sort
Question 7.5
What's the time complexity of this code?
```rust
fn process(data: &Vec<i32>) {
    for i in 0..data.len() {
        for j in i+1..data.len() {
            println!("{} {}", data[i], data[j]);
        }
    }
}
```
- A) O(1)
- B) O(n)
- C) O(n log n)
- D) O(n^2)
Stack, Queue, Deque (L33)
Stack (LIFO - Last In, First Out):
- Operations: `push` (add to top), `pop` (remove from top), `peek` (look at top)
- In Rust: `Vec<T>`
- Use cases: function call stack, undo/redo, DFS, parsing
- All operations: O(1)
Queue (FIFO - First In, First Out):
- Operations: `enqueue` (add to back), `dequeue` (remove from front)
- In Rust: `VecDeque<T>` (circular buffer)
- Use cases: task scheduling, BFS, buffering
- All operations: O(1)
Deque (Double-Ended Queue):
- Can add/remove from both ends
- In Rust: `VecDeque<T>`
- Use cases: sliding window, palindrome checking
Why not Vec for a queue?
- `vec.remove(0)` is O(n) - it must shift all elements!
- `VecDeque` uses a circular buffer for O(1) front operations
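A small queue sketch with `VecDeque` (illustrative):

```rust
use std::collections::VecDeque;

fn main() {
    let mut queue: VecDeque<i32> = VecDeque::new();

    // enqueue: add to the back
    queue.push_back(1);
    queue.push_back(2);
    queue.push_back(3);

    // dequeue: remove from the front in O(1) (Vec::remove(0) would be O(n))
    while let Some(front) = queue.pop_front() {
        println!("processing {}", front); // 1, 2, 3 - FIFO order
    }
}
```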
LinkedList:
- Rarely used in Rust (ownership makes it complex)
- O(1) insert/delete at known position, O(n) random access
Quick Questions: Linear Structures
Question 8
Which data structure should you use for BFS (breadth-first search)?
- A) Stack (Vec)
- B) Queue (VecDeque)
- C) HashMap
- D) LinkedList
Question 9
Why is VecDeque better than Vec for implementing a queue?
- A) It uses less memory
- B) It can remove from front in O(1) instead of O(n)
- C) It's faster to create
- D) It can store more elements
Priority Queues & Heaps (L34)
Priority Queue: Get element with highest (or lowest) priority
- Not FIFO! Order by priority, not insertion time
Binary Heap: Complete binary tree with heap property
- Max-heap: Parent >= both children (everywhere)
- Complete: All levels filled except possibly last (fills left-to-right)
Array representation:
- Store level-by-level in an array
- Parent of `i`: `(i-1)/2`
- Left child of `i`: `2*i + 1`
- Right child of `i`: `2*i + 2`
(don't memorize these - just remember they're in "reading order")
Operations:
- `push` (insert): add to end, bubble up - O(log n)
- `pop` (extract max/min): remove root, replace with last element, bubble down - O(log n)
- `peek`: look at root - O(1)
- Build a heap from an array: O(n) using a special "heapify" algorithm (you don't need to know how it works)
Heap Sort:
- Build max-heap: O(n)
- Repeatedly extract max: O(n log n)
- Total: O(n log n) guaranteed, O(1) space
In Rust: BinaryHeap<T> (max-heap by default)
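A quick `BinaryHeap` sketch, including the common `Reverse` trick for a min-heap (values are illustrative):

```rust
use std::cmp::Reverse;
use std::collections::BinaryHeap;

fn main() {
    // Max-heap by default: pop() returns the largest element.
    let mut heap = BinaryHeap::new();
    heap.push(10);
    heap.push(42);
    heap.push(25);
    println!("{:?}", heap.peek()); // Some(42) - O(1)
    println!("{:?}", heap.pop());  // Some(42) - O(log n)

    // Min-heap trick: wrap values in Reverse so the smallest comes out first.
    let mut min_heap = BinaryHeap::new();
    min_heap.push(Reverse(10));
    min_heap.push(Reverse(42));
    println!("{:?}", min_heap.pop()); // Some(Reverse(10))
}
```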
Quick Questions: Heaps
Question 10
In a max-heap array [42, 30, 25, 10, 20, 15], what are the children of element at index 1 (value 30)?
- A) 25 and 10
- B) 10 and 20
- C) 42 and 25
- D) 20 and 15
Question 11
Which operation is a binary heap optimized for?
- A) Finding any element by value
- B) Getting the max/min element
- C) Sorting all elements
- D) Finding the median
Binary Search Trees (L35)
Binary Search Tree (BST): Binary tree where:
- All values in left subtree < node value
- All values in right subtree > node value
This enables binary search!
Operations (balanced BST):
- Search: Compare and go left/right - O(log n)
- Insert: Search for position, add - O(log n)
- Delete: Three cases:
- No children: just remove
- One child: replace with child
- Two children: replace with in-order successor (smallest in right subtree)
- Time: O(log n)
- Find min/max: Go all the way left/right - O(log n)
BST vs Heap representation:
- Heap: Complete tree, use array, index arithmetic
- BST: NOT complete (has gaps), need pointers, recursive structure
Balance matters!
- Balanced: Height = O(log n), operations are O(log n)
- Degenerate: Height = O(n), operations are O(n)
- Real implementations use more complex, self-balancing trees
Rust's BTreeMap and BTreeSet: Guaranteed O(log n) operations
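A small `BTreeMap` sketch (illustrative). Note that iteration visits keys in sorted order, like an in-order traversal of a BST:

```rust
use std::collections::BTreeMap;

fn main() {
    let mut scores = BTreeMap::new(); // keys are kept in sorted order
    scores.insert("carol", 91);
    scores.insert("alice", 75);
    scores.insert("bob", 82);

    println!("{:?}", scores.get("alice")); // Some(75) - O(log n) lookup

    // Iteration visits keys in sorted order.
    for (name, score) in &scores {
        println!("{}: {}", name, score); // alice, bob, carol
    }
}
```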
BST vs Other Structures
| Operation | Sorted Array | BST (balanced) | Binary Heap |
|---|---|---|---|
| Search for value | O(log n) | O(log n) | O(n) |
| Insert | O(n) | O(log n) | O(log n) |
| Delete | O(n) | O(log n) | O(log n) |
| Find min/max | O(1) | O(log n) | O(1) |
| Get all sorted | O(1) | O(n) | O(n log n) |
(I really forgot to include these after some point...) Don't memorize this whole table! In each case, just think through what's going on rather than memorizing or guessing.
Quick Questions: BST
Question 12
In this BST, if 3 gets deleted, what gets put in its place?
```
    8
   / \
  3   10
 / \    \
1   6    14
   / \
  4   7
```
- A) 1
- B) 4
- C) 6
- D) 8
Question 13
What happens to BST operations if the tree becomes degenerate?
- A) They become O(1)
- B) They stay O(log n)
- C) They become O(n)
- D) They become O(n^2)
Graph Basics & Traversal (L36)
Graph: Vertices (nodes) connected by edges
Types:
- Directed: Edges have direction (A -> B)
- Undirected: Edges are bidirectional (A <-> B)
- Weighted: Edges have costs/distances
- Unweighted: All edges equal
Representations:
- Adjacency matrix: 2D array, `matrix[i][j]` = edge from i to j
  - Space: O(V^2), good for dense graphs
- Adjacency list: each vertex has a list of neighbors
  - Space: O(V + E), good for sparse graphs (most real-world graphs)
BFS - Breadth-First Search
Uses a Queue (FIFO)
Algorithm:
- Start at source, mark visited, add to queue
- While queue not empty:
  - Dequeue vertex
  - For each unvisited neighbor:
    - Mark visited, add to queue
Properties:
- Explores level by level
- Finds shortest path in unweighted graphs
- Time: O(V + E) (visit each vertex and edge once)
- Space: O(V) (queue and visited set)
Key use cases:
- Shortest path in unweighted graph
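One possible BFS sketch on an adjacency list (the graph and the `bfs` helper are illustrative, not the exact code from lecture):

```rust
use std::collections::VecDeque;

// BFS over an adjacency list (vertices numbered 0..n).
// Returns the order in which vertices are visited starting from `start`.
fn bfs(adj: &[Vec<usize>], start: usize) -> Vec<usize> {
    let mut visited = vec![false; adj.len()];
    let mut order = Vec::new();
    let mut queue = VecDeque::new();

    visited[start] = true;
    queue.push_back(start);

    while let Some(u) = queue.pop_front() {  // dequeue the next vertex
        order.push(u);
        for &v in &adj[u] {
            if !visited[v] {
                visited[v] = true;           // mark when enqueued, not when dequeued
                queue.push_back(v);
            }
        }
    }
    order
}

fn main() {
    // Undirected edges 0-1, 0-2, 1-3 (stored in both directions)
    let adj = vec![vec![1, 2], vec![0, 3], vec![0], vec![1]];
    println!("{:?}", bfs(&adj, 0)); // [0, 1, 2, 3] - level by level
}
```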
DFS - Depth-First Search
Uses a Stack (LIFO) - can be recursive or explicit stack
Algorithm:
- Start at source, mark visited
- For each unvisited neighbor:
  - Recursively DFS from neighbor
- (Or use an explicit stack: push start, while stack not empty, pop and explore)
Properties:
- Explores as deep as possible before backtracking
- Does NOT find shortest paths
- Time: O(V + E)
- Space: O(V) (recursion stack or explicit stack)
Key use cases:
- Topological sort
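And a matching recursive DFS sketch (again illustrative, using the same small graph):

```rust
// Recursive DFS over an adjacency list (vertices numbered 0..n).
fn dfs(adj: &[Vec<usize>], u: usize, visited: &mut Vec<bool>, order: &mut Vec<usize>) {
    visited[u] = true;
    order.push(u);
    for &v in &adj[u] {
        if !visited[v] {
            dfs(adj, v, visited, order); // go as deep as possible before backtracking
        }
    }
}

fn main() {
    let adj = vec![vec![1, 2], vec![0, 3], vec![0], vec![1]];
    let mut visited = vec![false; adj.len()];
    let mut order = Vec::new();
    dfs(&adj, 0, &mut visited, &mut order);
    println!("{:?}", order); // [0, 1, 3, 2] - deep before wide
}
```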
Quick Questions: Graphs
Question 14
What data structure does BFS use?
- A) Stack
- B) Queue
- C) Heap
- D) BST
Question 15
If you need to find the shortest path in an unweighted graph, which algorithm should you use?
- A) DFS
- B) BFS
- C) Dijkstra's
- D) Prim's
DAGs and Topological Sort (L37)
DAG (Directed Acyclic Graph): Directed graph with NO cycles
Examples:
- Course prerequisites
- Task dependencies
- Spreadsheet cell dependencies
Topological Sort: Linear ordering where all edges go left to right
- Only possible on DAGs!
- Multiple valid orderings may exist
Algorithm (DFS-based):
- Run DFS from all unvisited vertices
- Track finish times
- Reverse the finish order
- Time: O(V + E)
Use cases: Scheduling tasks with dependencies
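One way the DFS-based version might look in code (an illustrative sketch, not necessarily the lecture's exact version):

```rust
// DFS-based topological sort for a DAG given as an adjacency list.
// A vertex finishes after all of its descendants; reversing finish order gives the sort.
fn topological_sort(adj: &[Vec<usize>]) -> Vec<usize> {
    fn dfs(u: usize, adj: &[Vec<usize>], visited: &mut Vec<bool>, finished: &mut Vec<usize>) {
        visited[u] = true;
        for &v in &adj[u] {
            if !visited[v] {
                dfs(v, adj, visited, finished);
            }
        }
        finished.push(u); // u is finished once all outgoing edges are explored
    }

    let n = adj.len();
    let mut visited = vec![false; n];
    let mut finished = Vec::new();
    for u in 0..n {
        if !visited[u] {
            dfs(u, adj, &mut visited, &mut finished);
        }
    }
    finished.reverse(); // reverse finish order = topological order
    finished
}

fn main() {
    // Edges: 0 -> 1, 0 -> 2, 1 -> 3, 2 -> 3 (a small DAG)
    let adj = vec![vec![1, 2], vec![3], vec![3], vec![]];
    println!("{:?}", topological_sort(&adj)); // [0, 2, 1, 3] (one valid ordering)
}
```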
Minimum Spanning Tree (MST)
Goal: Connect all vertices with minimum total edge weight
- Input: Undirected, weighted, connected graph
- Output: Tree (V-1 edges) connecting all V vertices with minimum sum of weights
Kruskal's Algorithm
Greedy approach: Sort edges, add cheapest that doesn't create cycle
Algorithm:
- Sort all edges by weight (increasing)
- For each edge (u, v):
- If adding it doesn't create cycle: add to MST
- Use Union-Find to detect cycles
- Stop when have V-1 edges
Time complexity: O(E log E) (dominated by sorting)
Prim's Algorithm
Greedy approach: Grow MST from starting vertex
Algorithm:
- Start from any vertex, add to MST
- Repeat until all vertices in MST:
- Find cheapest edge connecting MST to non-MST vertex
- Add that edge and vertex to MST
- Use priority queue (min-heap)
Time complexity: O(E log V) with binary heap
Quick Questions: Topological Sort and MST
Question 16
Which graph property is required for topological sort to exist?
- A) Connected
- B) Weighted
- C) Undirected
- D) Acyclic
Question 17
What's the output of an MST algorithm?
- A) Shortest path from source to all vertices
- B) A subgraph connecting all V vertices with minimum weight
- C) Topological ordering of vertices
- D) All cycles in the graph
Shortest Paths / Dijkstra's (L38)
Goal: Find shortest path from source to all vertices in weighted graph with non-negative edges
Greedy approach: Always process closest unvisited vertex
Algorithm:
- Initialize distances: source = 0, all others = infinity
- Use a min-heap (priority queue) of (distance, vertex)
- While heap not empty:
  - Extract vertex u with minimum distance
  - For each neighbor v:
    - If `dist[u] + weight(u,v) < dist[v]`:
      - Update `dist[v]`
      - Add v to heap with new distance
      - Track parent for path reconstruction
Time complexity: O((V + E) log V) with binary heap
Key insight: Once a vertex is processed, we've found its shortest path (greedy choice is safe)
Limitations:
- Cannot handle negative edge weights! (Bellman-Ford can)
- Doesn't detect negative cycles
Path reconstruction:
- Track parent pointers while running
- Follow parents backward from destination to source
- Reverse to get forward path
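A possible Dijkstra sketch using `BinaryHeap` with `Reverse` as a min-heap (illustrative; it skips parent tracking for brevity):

```rust
use std::cmp::Reverse;
use std::collections::BinaryHeap;

// Dijkstra on an adjacency list of (neighbor, weight) pairs with non-negative weights.
// Returns the shortest distance from `start` to every vertex (usize::MAX = unreachable).
fn dijkstra(adj: &[Vec<(usize, usize)>], start: usize) -> Vec<usize> {
    let mut dist = vec![usize::MAX; adj.len()];
    let mut heap = BinaryHeap::new(); // Reverse(...) turns the max-heap into a min-heap

    dist[start] = 0;
    heap.push(Reverse((0, start)));

    while let Some(Reverse((d, u))) = heap.pop() {
        if d > dist[u] {
            continue; // stale entry: we already found a shorter path to u
        }
        for &(v, w) in &adj[u] {
            let new_dist = d + w;
            if new_dist < dist[v] {
                dist[v] = new_dist;                // relax the edge
                heap.push(Reverse((new_dist, v))); // re-add v with its better distance
            }
        }
    }
    dist
}

fn main() {
    // 0 -> 1 (4), 0 -> 2 (1), 2 -> 1 (2), 1 -> 3 (1)
    let adj = vec![vec![(1, 4), (2, 1)], vec![(3, 1)], vec![(1, 2)], vec![]];
    println!("{:?}", dijkstra(&adj, 0)); // [0, 3, 1, 4]
}
```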
Dijkstra vs BFS vs DFS
- Unweighted shortest path: BFS
- Weighted shortest path (non-negative): Dijkstra
- Negative weights: Bellman-Ford (not covered, but you should know Dijkstra can't handle it)
- Topological sort: DFS
- Exploring/iterating over the graph: DFS or BFS
Quick Questions: Shortest Paths
Question 18
What's the key requirement for Dijkstra's algorithm to work correctly?
- A) Graph must be directed
- B) Graph must be connected
- C) All edge weights must be non-negative
- D) Graph must be a DAG
Question 19
What data structure does Dijkstra's algorithm use to efficiently get the next closest vertex?
- A) Stack
- B) Queue
- C) Min-heap (priority queue)
- D) BST
Question 20
If you need to find the shortest path in an unweighted graph, which is most efficient?
- A) BFS
- B) DFS
- C) Dijkstra's
- D) Kruskal's
Summary Tables
(Includes amortized values where applicable)
| Structure | Access | Insert | Delete | Use Case |
|---|---|---|---|---|
| Vec | O(1) | O(1) back | O(1) back | Stack, random access |
| VecDeque | O(1) | O(1) both ends | O(1) both ends | Queue, deque |
| HashMap | O(1) | O(1) | O(1) | Key-value lookup |
| BinaryHeap | O(1) peek | O(log n) | O(log n) | Priority queue |
| BTreeMap | O(log n) | O(log n) | O(log n) | Sorted key-value |
| Algorithm | Type | Time | Data Structure | Use Case |
|---|---|---|---|---|
| Merge Sort | Sorting | O(n log n) | - | General-purpose, stable |
| Quick Sort | Sorting | O(n log n) avg | - | In-place, fast average |
| Heap Sort | Sorting | O(n log n) | Max heap / priority queue | Guaranteed, in-place |
| BFS | Graph traversal | O(V+E) | Queue | Shortest path (unweighted) |
| DFS | Graph traversal | O(V+E) | Stack (or recursion) | Exploration, topological sort |
| Topological Sort | Graph ordering | O(V+E) | Stack (via DFS) | DAG task scheduling |
| Kruskal's MST | Graph | O(E log E) | Union-Find | Minimum spanning tree |
| Prim's MST | Graph | O(E log V) | Min heap / priority queue | Minimum spanning tree |
| Dijkstra's | Shortest path | O(E log V) | Min heap / priority queue | Weighted shortest path |
Don't freak out and try to memorize it! See how many you can recall by reasoning through it.
Note - you are fine if you say O(E) instead of O(V+E) since E dominates V generally. Similarly for O(E log V) vs O((E+V) log V) for Dijkstra's... it's the rough scaling that matters here.
Tips for Hand-Coding Problems
Before you start:
- Read the problem carefully - what is the input type? What should be returned?
- Identify any required methods or constraints (e.g., "use `.filter()`, `.map()`, and `.collect()`")
- Consider edge cases (empty input, single element, etc.)
While coding:
- Write clean, readable code - you want partial credit even if it's not perfect
- Use descriptive variable names when possible
- Remember Rust syntax details: `&` for references, `mut` for mutability, type annotations
- Don't panic if you forget exact syntax - show your logic clearly
Common patterns to remember:
- Iterator chain: `.iter()` -> `.filter()`/`.map()` -> `.collect()`
- Finding min/max: iterate and track the current min/max, or use `.min()` and `.max()` with an iterator
- Building new collections: create empty, then push/insert in a loop
Hand-Coding practice problem ideas
Basic:
- Given a vec, return a new vec with every other element of the original vec, starting with the second element. If the vec has fewer than two elements, return None.
- Given two integers a and b, divide a by b but return an error if b is zero.
Closures and iterators:
- Given a vector of integers, count the number of times `5` occurs.
- Given a vector of strings, make a vector of the lengths of those strings.
Tests
- Given solutions to one of the two basic problems, write two tests for that function, one that tests the "happy path" and one that tests an edge case
Tips for Stack-Heap Diagrams
What to include:
- Stack frames: one for `main`, one for each function call
- Variables: show name, type, and value/pointer for each variable
- Heap data: Separate heap-allocated data (String, Vec, Box, etc.) to the right
Practice Stack-Heap Diagram
```rust
fn sum_first_two(dat: &Vec<i32>) -> i32 {
    let first_two = &dat[0..2];
    let sum = first_two.iter().sum();
    // DRAW HERE
    sum
}

fn main() {
    let dat = vec![1,2,3,4];
    let result = sum_first_two(&dat);
}
```
Final tips
Sources for practice:
- Review the confidence quiz (last lecture and online) and the questions from this lecture
- Redo hand-coding and stack-heap problems from previous exams
- Have AI generate random graphs to practice graph algorithms on (though it may or may not be accurate in evaluating your answer)
- The activity from the iterators and closures lecture is a good source for practicing hand-coding (try Rust Playground)
- "Rubber duck" it - can you explain how these algorithms work to soemone else?
Activity L39: Ask and Answer II
Phase 1: Question Writing
- Tear off the last page of your notes from today
- Pick a codename (favorite Pokémon, secret agent name, whatever) - remember it!
Write one or two of:
- A concept you don't fully understand ("I'm confused about...")
- A study strategy question ("What's the best way to review...")
- A practice test question
- Anything else you'd like to ask your peers ahead of the final
Phase 2: Round Robin Answering
- Pass papers around a few times
- Read the question, write a helpful response
- When you're done, raise your paper up and find someone to swap with
You can answer questions, explain concepts, give tips / encouragement, draw diagrams, wish each other luck
Phase 3: Return & Review
- Submit on Gradescope the codename you chose for yourself
- Return the papers at the end of class
- I'll scan and post all papers - you can see the responses you got and also all others