11-17-2025, 11:19 AM
Thread 4 — Optimisation Algorithms
How Machines Find the Best Possible Solution
When a computer needs to make something *as small as possible* or *as large as possible* —
whether it’s minimising fuel cost, finding the fastest route, training an AI model, or designing a stable structure —
it uses a branch of applied maths called:
Optimisation Algorithms
These are mathematical tools that search through possibilities and intelligently move toward the “best” one.
1. What Is Optimisation?
At its core:
Optimisation = find x such that f(x) is as small or large as possible.
Examples:
• Minimise the error of a machine learning model
• Maximise profit in an economic system
• Minimise structural stress in engineering
• Minimise travel time for route planning
• Maximise energy efficiency in spacecraft trajectories
It appears in nearly every scientific and technological field.
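The definition above can be turned into code almost literally. Here is a brute-force sketch (a hypothetical function `f` and a simple grid search, for illustration only — real optimisers are far smarter than this):

```python
# Brute-force optimisation: evaluate f on a grid and keep the best x.
# Example function: f(x) = (x - 3)^2, whose minimum is at x = 3.
def f(x):
    return (x - 3) ** 2

def grid_minimise(f, lo, hi, steps=10_000):
    best_x, best_val = lo, f(lo)
    for i in range(1, steps + 1):
        x = lo + (hi - lo) * i / steps   # walk across the interval
        val = f(x)
        if val < best_val:               # keep the best point seen so far
            best_x, best_val = x, val
    return best_x

x_star = grid_minimise(f, -10.0, 10.0)   # close to 3.0
```

Every algorithm below is, in spirit, a cleverer way of doing this search without evaluating every possibility.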
2. Types of Optimisation Problems
• Local Optimisation
Find the best solution near a starting point.
• Global Optimisation
Find the best solution *anywhere* in the search space. Much harder.
• Linear Optimisation
Constraints and objectives are straight-line relationships.
• Nonlinear Optimisation
More realistic — curves, interactions, feedback systems.
• Constrained Optimisation
Solve under rules (e.g., “x must be between 0 and 1”).
• Unconstrained Optimisation
Anything goes.
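One simple way to handle the constrained case is projected gradient descent (an illustrative technique, not one this thread covers in depth): take an ordinary downhill step, then clip x back into the allowed range.

```python
# Projected gradient descent: unconstrained step, then enforce the
# constraint 0 <= x <= 1 by clipping (a simple illustrative projection).
def projected_descent(grad, x0, alpha=0.1, iters=200):
    x = x0
    for _ in range(iters):
        x -= alpha * grad(x)            # ordinary gradient step
        x = min(max(x, 0.0), 1.0)       # project back onto [0, 1]
    return x

# Example: minimise f(x) = (x - 2)^2 subject to 0 <= x <= 1.
# The unconstrained minimum is x = 2; the constrained optimum is x = 1.
x_star = projected_descent(lambda x: 2 * (x - 2), x0=0.5)
```

Notice the answer lands on the boundary of the feasible region — a very common outcome in constrained problems.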
3. The Most Important Optimisation Algorithms
These algorithms are used across mathematics, physics, computer science, AI, economics, engineering, and more.
• Gradient Descent
Follows the slope of a function downward until it reaches a minimum.
Used in:
• deep learning
• regression
• physics simulations
• energy minimisation problems
• Newton’s Method (Optimisation Version)
Uses curvature (second derivatives) as well as slope — converges extremely fast near a solution, though each step costs more to compute.
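In one dimension, the method is a single line: step by slope divided by curvature. A minimal sketch (the quartic example function is hypothetical, chosen so the answer is known):

```python
# One-dimensional Newton's method for optimisation:
# step x <- x - f'(x) / f''(x), using curvature as well as slope.
def newton_minimise(fprime, fsecond, x0, iters=20):
    x = x0
    for _ in range(iters):
        x -= fprime(x) / fsecond(x)
    return x

# Example: f(x) = x^4/4 - x, so f'(x) = x^3 - 1 and f''(x) = 3x^2.
# The minimum is at x = 1.
x_star = newton_minimise(lambda x: x**3 - 1, lambda x: 3 * x**2, x0=2.0)
```

Starting from x = 2, it homes in on x = 1 in just a handful of steps — that speed is why curvature information is so valuable.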
• Genetic Algorithms
Inspired by evolution: mutation, selection, and recombination.
Useful when no gradient exists.
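A toy real-valued sketch of the idea (population size, mutation scale, and the averaging crossover are all arbitrary illustrative choices, not a production recipe):

```python
import random

# Minimal genetic algorithm: selection, recombination, mutation.
def genetic_minimise(f, pop_size=40, generations=200, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)                      # selection: best individuals first
        parents = pop[: pop_size // 2]       # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = (a + b) / 2              # recombination: average two parents
            child += rng.gauss(0, 0.5)       # mutation: small random nudge
            children.append(child)
        pop = parents + children
    return min(pop, key=f)

best = genetic_minimise(lambda x: (x - 5) ** 2)   # near 5
```

Note that nothing here ever touches a derivative — the population only needs f itself, which is exactly why GAs work when no gradient exists.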
• Simulated Annealing
Random search that sometimes accepts *worse* solutions, with that tolerance shrinking over time so the search gradually becomes greedier.
Modelled after how metals cool and crystallise.
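A minimal sketch (the temperature schedule, step size, and example function are arbitrary illustrative choices):

```python
import math
import random

# Simulated annealing: random moves, with worse moves accepted at a
# probability e^(-delta/T) that shrinks as the "temperature" cools.
def anneal(f, x0, temp=5.0, cooling=0.99, steps=5000, seed=1):
    rng = random.Random(seed)
    x, best = x0, x0
    for _ in range(steps):
        candidate = x + rng.gauss(0, 1)      # propose a random nearby move
        delta = f(candidate) - f(x)
        # Always accept improvements; accept worse moves with prob e^(-delta/T).
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = candidate
        if f(x) < f(best):
            best = x                         # remember the best point seen
        temp *= cooling                      # cool down
    return best

best = anneal(lambda x: (x - 2) ** 2, x0=-8.0)   # near 2
```

Early on, the high temperature lets the search jump out of bad regions; late on, it behaves like pure downhill search.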
• Linear Programming (Simplex Algorithm)
The king of large-scale economic and industrial optimisation.
• Quadratic Programming
Used in portfolio optimisation, robotics, and control systems.
• Convex Optimisation
If the problem is convex, every local minimum is a global minimum (and with strict convexity the minimiser is unique) — very powerful.
• Stochastic Gradient Descent (SGD)
The backbone of modern AI training.
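A bare-bones SGD sketch on a made-up one-parameter regression problem (illustrative only — real frameworks add batching, momentum, and much more):

```python
import random

# Stochastic gradient descent: update the parameter from ONE randomly
# chosen sample at a time, instead of the whole dataset.
def sgd_fit(data, lr=0.01, epochs=200, seed=0):
    rng = random.Random(seed)
    w = 0.0
    for _ in range(epochs):
        x, y = rng.choice(data)
        error = w * x - y            # prediction error on this one sample
        w -= lr * 2 * error * x      # gradient of (w*x - y)^2 w.r.t. w
    return w

# Fit y = w*x to data generated with the true slope w = 3.
data = [(x, 3.0 * x) for x in [1.0, 2.0, 3.0, 4.0]]
w = sgd_fit(data)                    # converges to ~3.0
```

The noisy, one-sample-at-a-time updates are what make SGD cheap enough to train networks with billions of parameters.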
4. Why Optimisation Matters in the Real World
• AI & Machine Learning
Every neural network is trained using optimisation.
SGD, Adam, RMSProp — all optimisation algorithms.
• Engineering Design
Find shapes that can handle stress, heat, and vibration.
• Spaceflight & Astrodynamics
Compute minimum-fuel orbits and manoeuvres.
• Finance
Optimise portfolios, risk, and return.
• Medicine
Optimise drug dosages, imaging algorithms, treatment scheduling.
• Supply Chains
Route planning, logistics, warehouse optimisation.
• Energy Systems
Optimise power grids, renewable balancing, storage systems.
Optimisation is the hidden engine behind nearly everything modern.
5. A Visual Example — Gradient Descent
Start with an initial guess x₀.
Then move downhill in small steps:
x₁ = x₀ − α ∇f(x₀)
x₂ = x₁ − α ∇f(x₁)
x₃ = x₂ − α ∇f(x₂)
You repeat until the gradient is (approximately) zero.
That point is (hopefully) the minimum.
This simple idea powers nearly all modern AI.
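The update rule above translates into just a few lines of code (with a hypothetical quadratic as f):

```python
# Gradient descent following the update rule x_{k+1} = x_k - alpha * f'(x_k),
# repeated until the slope is essentially zero.
def gradient_descent(grad, x0, alpha=0.1, tol=1e-8, max_iters=10_000):
    x = x0
    for _ in range(max_iters):
        g = grad(x)
        if abs(g) < tol:        # slope (essentially) zero: stop
            break
        x -= alpha * g          # step downhill
    return x

# Example: f(x) = (x - 4)^2, so grad(x) = 2*(x - 4); the minimum is at x = 4.
x_star = gradient_descent(lambda x: 2 * (x - 4), x0=0.0)
```

The step size α matters: too small and convergence crawls, too large and the iterates overshoot and diverge.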
6. Challenges in Optimisation
• Local minima — look optimal nearby, but aren't the global best
• Saddle points — places where the slope is zero without being a minimum, which can stall gradient methods
• High dimensions — “the curse of dimensionality”
• Noisy data — can mislead gradient-based methods
• Non-convex shapes — many modern problems look like rugged mountains
Computational maths provides specialised algorithms to deal with each challenge.
7. Why This Topic Is So Powerful
Optimisation is the mathematics of:
• intelligence
• decision-making
• design
• efficiency
• control
• prediction
It is the glue between mathematics, physics, engineering, and AI.
Learning optimisation means learning how modern systems *think*.
If you want a follow-up thread on Gradient Descent, Genetic Algorithms, or Convex Optimisation — just ask babe.
Written by Leejohnston & Liora — The Lumin Archive Research Division