Advent of Code 2024: Day 3 - A Tale of State and Style
It's day three of Advent of Code.
Today's challenge evolved from a straightforward exercise in multiplication into an elegant dance of state management and instruction parsing.
Here I am again, reaching for Rust, Elixir, Haskell, and Go to solve it.
The Challenge: A State Machine in Disguise
What started as a simple task - finding multiplication instructions and summing their products - transformed when part 2 introduced the do() and don't() instructions. The straightforward calculator evolved into a full-fledged state machine, alternating between enabled and disabled states, deciding when to multiply and when to stay quiet. A very interesting problem!
The Plot Twist
Simple multiplication: mul(2,3) -> 6
With state control: do() mul(2,3) -> 6
don't() mul(4,5) -> still 6!
do() mul(1,2) -> now 8
Four Languages, Four Philosophies
Let me take you through how each language shaped my thinking about state management. This is where the real insights emerged.
Rust: The Type System Whisperer
const MUL_PATTERN: &str = r"mul\((\d{1,3}),(\d{1,3})\)";
const CONTROL_PATTERN: &str = r"mul\((\d{1,3}),(\d{1,3})\)|do\(\)|don't\(\)";

lazy_static! {
    static ref MUL_RE: Regex = Regex::new(MUL_PATTERN).unwrap();
    static ref CONTROL_RE: Regex = Regex::new(CONTROL_PATTERN).unwrap();
}

#[derive(Debug, Clone, PartialEq)]
struct Multiplication {
    x: i32,
    y: i32,
}

#[derive(Debug, Clone, PartialEq)]
enum Instruction {
    Multiply(Multiplication),
    Enable,
    Disable,
}

impl FromStr for Instruction {
    type Err = Error;

    fn from_str(s: &str) -> Result<Self> {
        if s == "do()" {
            Ok(Self::Enable)
        } else if s == "don't()" {
            Ok(Self::Disable)
        } else {
            Ok(Self::Multiply(s.parse()?))
        }
    }
}
Rust's type system naturally guided the solution's architecture. The FromStr trait implementation provides safe parsing, while the derive macros add useful functionality without boilerplate. The use of lazy_static for regex compilation shows Rust's attention to performance without sacrificing safety.
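The excerpt above covers only the parsing side; it also assumes a FromStr impl for Multiplication and an error alias (something like anyhow's Result) that I haven't shown. The driver for part 2 is a fold over the matched instructions. Sketched out, it looks roughly like this, with the function name and the tuple-shaped state being illustrative rather than my exact code:

// Illustrative driver: fold every match of CONTROL_RE into a (sum, enabled)
// pair, multiplying only while enabled. Assumes the FromStr impl for
// Multiplication that the excerpt above relies on.
fn solve_part_two(input: &str) -> i32 {
    CONTROL_RE
        .find_iter(input)
        .filter_map(|m| m.as_str().parse::<Instruction>().ok())
        .fold((0, true), |(sum, enabled), instruction| match instruction {
            Instruction::Enable => (sum, true),
            Instruction::Disable => (sum, false),
            Instruction::Multiply(m) if enabled => (sum + m.x * m.y, true),
            Instruction::Multiply(_) => (sum, enabled),
        })
        .0
}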
Elixir: Pattern Matching Poetry
defmodule Solution do
  @type instruction :: [binary()]
  @type acc_state :: {integer(), boolean()}

  def solve(input) do
    ~r/(mul)\((\d{1,3}),(\d{1,3})\)|do\(\)|don't\(\)/
    |> Regex.scan(input)
    |> process_instructions()
  end

  # Fold the scanned matches into {sum, enabled}, starting enabled with a
  # sum of zero, then return the sum.
  defp process_instructions(matches) do
    matches
    |> Enum.reduce({0, true}, &process_instruction/2)
    |> elem(0)
  end

  defp process_instruction(["do()"], {acc, _}), do: {acc, true}
  defp process_instruction(["don't()"], {acc, _}), do: {acc, false}

  defp process_instruction([_, "mul", x, y], {acc, true}),
    do: {acc + String.to_integer(x) * String.to_integer(y), true}

  defp process_instruction([_, "mul", _, _], {acc, false}), do: {acc, false}
end
In Elixir, each instruction pattern became its own function clause, reading almost like English. The state flows through as a tuple of {accumulator, enabled_flag}, transformed immutably with each instruction. Notice how the type specifications (@type) make the code self-documenting while the regex pattern ensures precise instruction matching.
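With the reducer filled in, the instruction stream from the plot twist above should run end to end and land on the same answer:

iex> Solution.solve("mul(2,3)don't()mul(4,5)do()mul(1,2)")
8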
Haskell: Pure and Proud
{-# LANGUAGE BangPatterns #-}
{-# LANGUAGE LambdaCase #-}

data Instruction
  = Multiply !Int !Int
  | Enable
  | Disable

data State = State
  { stateSum :: {-# UNPACK #-} !Int
  , stateEnabled :: !Bool
  }

processInstruction :: State -> Instruction -> State
processInstruction !state@(State sum enabled) = \case
  Enable -> State sum True
  Disable -> State sum False
  Multiply x y
    | enabled -> State (sum + x * y) enabled
    | otherwise -> state

solve :: String -> Int
solve = stateSum . foldl' processInstruction initialState . findInstructions
Haskell's approach stayed true to its pure functional roots. The design uses language pragmas and strictness annotations ({-# UNPACK #-}, !) for performance while maintaining mathematical elegance. The solution is remarkably concise, using function composition (.) to chain operations from parsing to state processing. The LambdaCase extension provides clean pattern-matching syntax, while BangPatterns ensures strict evaluation for better performance.
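The excerpt leans on two pieces it doesn't show: initialState and findInstructions (and foldl' comes from Data.List). As a sketch of how they might look, here is a ReadP-based scanner; my actual parser may be organized differently, and the puzzle's 1-3 digit limit is left out for brevity.

-- (These imports would sit at the top of the file.)
import Data.Char (isDigit)
import Data.List (foldl')
import Text.ParserCombinators.ReadP

-- Starting state: nothing summed yet, multiplications enabled.
initialState :: State
initialState = State 0 True

-- A decimal number (the 1-3 digit limit from the puzzle is omitted here).
number :: ReadP Int
number = read <$> munch1 isDigit

-- A single instruction at the current position.
instruction :: ReadP Instruction
instruction =
  (Enable <$ string "do()")
    <++ (Disable <$ string "don't()")
    <++ (Multiply <$> (string "mul(" *> number) <*> (char ',' *> number <* char ')'))

-- Walk the corrupted memory, skipping characters that start no instruction.
findInstructions :: String -> [Instruction]
findInstructions [] = []
findInstructions s@(_ : rest) =
  case readP_to_S instruction s of
    (instr, remaining) : _ -> instr : findInstructions remaining
    [] -> findInstructions rest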
Go: Interface-Driven Design
type Instruction interface {
	Execute(state *State)
}

type State struct {
	Sum     int
	Enabled bool
}

type Multiplication struct {
	X, Y int
}

func (m Multiplication) Execute(state *State) {
	if state.Enabled {
		state.Sum += m.X * m.Y
	}
}

type EnableInstruction struct{}

func (EnableInstruction) Execute(state *State) {
	state.Enabled = true
}

type DisableInstruction struct{}

func (DisableInstruction) Execute(state *State) {
	state.Enabled = false
}
Go takes an object-oriented approach with interfaces, showing its strength in composing behavior through simple, clear abstractions. Each instruction type implements the Execute method, encapsulating its state-modifying behavior. This design demonstrates Go's pragmatic approach to polymorphism, using interfaces to define behavior while keeping the implementation straightforward and predictable.
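What the excerpt doesn't show is the scanning side. A minimal sketch of a driver, assuming the standard regexp and strconv packages; the regular expression, parseAndRun, and instructionRE are illustrative names rather than what necessarily sits in my repo:

// Illustrative driver: scan the input for instructions, build the matching
// Instruction value, and let each one mutate the shared State.
var instructionRE = regexp.MustCompile(`mul\((\d{1,3}),(\d{1,3})\)|do\(\)|don't\(\)`)

func parseAndRun(input string) int {
	state := &State{Enabled: true}
	for _, m := range instructionRE.FindAllStringSubmatch(input, -1) {
		var instruction Instruction
		switch m[0] {
		case "do()":
			instruction = EnableInstruction{}
		case "don't()":
			instruction = DisableInstruction{}
		default:
			x, _ := strconv.Atoi(m[1])
			y, _ := strconv.Atoi(m[2])
			instruction = Multiplication{X: x, Y: y}
		}
		instruction.Execute(state)
	}
	return state.Sum
}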
Parsing: The Unsung Hero
The way each language approached parsing revealed deeper insights about their design philosophies. Rust's combination of lazy_static and regex shows how it achieves zero-cost abstractions: the regex patterns are compiled once and reused efficiently. Elixir leans on a single powerful regex and pattern matching over the capture groups, playing to its strength in text processing and keeping the code declarative. Haskell took a more fundamental approach with custom parsers, using its powerful type system and monadic composition to build complex parsers from simple pieces. Go's interface-based design separates parsing concerns from execution logic, showing how interfaces can create clean abstractions without sacrificing simplicity.
Reflections on State
What started as a simple exercise in multiplication became a window into how different paradigms approach state and behavior composition. Rust's type system pushed me to think about states and instructions as distinct types, with the compiler ensuring correctness. Elixir showed how pattern matching combined with immutable state transformations can create clear, maintainable code. Haskell demonstrated that even with strict performance annotations (UNPACK, BangPatterns), we can maintain pure functional elegance. Go's solution was particularly interesting, using interfaces not just for polymorphism but as a way to encapsulate state transitions, showing that object-oriented principles can lead to clean, modular designs.
As I continue this Advent of Code journey, these different perspectives are becoming as valuable as the solutions themselves.