Numeric Optimization in Golang With Nelder-Mead and Gonum

Numeric optimization is a set of techniques for fitting a function to a dataset by iteratively refining estimates. It's one way to do things like curve fitting and parameter estimation. We use it in some time series forecasting techniques we implemented recently, where the forecasting models depend on parameters such as seasonality and trend that we can't determine in advance. Numeric optimization selects the best parameters for the specific dataset we're trying to forecast.

The gonum package is a great set of numeric libraries for Go, and it includes a numerical optimization package, optimize. In this post I'll show how to fit a line to 2D points with the optimize package. This is about the simplest example of numerical optimization you can do with the package, but it's a good illustration of how to set up a problem, and it should help you get started on more complicated things.

To illustrate the technique, I used R to generate a sample dataset: some points scattered around a sloped line, made by adding randomness to the line's y values. The magic slope and intercept values to keep in mind are 2.38 and 8.29. We'll try to fit a line through the points and see if we recover those parameters. Here's how we generated the dataset:
x <- seq(1, 20, 1)        # 20 evenly spaced x values
noise <- rnorm(20, 0, 5)  # Gaussian noise: mean 0, standard deviation 5
y <- 2.38 * x + 8.29 + noise
plot(x, y)
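If you'd rather do everything in Go, here's a rough equivalent of that R snippet. It's a minimal sketch of my own using the standard library's math/rand; rand.NormFloat64 draws from a standard normal distribution, which we scale to a standard deviation of 5:

    package main

    import (
        "fmt"
        "math/rand"
    )

    func main() {
        // Mirror the R snippet: x from 1 to 20, y = 2.38*x + 8.29 plus
        // Gaussian noise with mean 0 and standard deviation 5.
        for x := 1.0; x <= 20; x++ {
            noise := rand.NormFloat64() * 5
            y := 2.38*x + 8.29 + noise
            fmt.Printf("{%g, %f},\n", x, y) // printed in the [][2]float64 literal format used below
        }
    }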
Here's what a plot of the generated data looks like:
[Plot of the generated points]

Now let's see how to fit a line through these points with the Nelder-Mead optimization algorithm. Nelder-Mead works by repeatedly reshaping and shrinking a simplex (in 2D, a triangle whose corners are candidate parameter values) until it converges on a local minimum or maximum (hence the "optimization" terminology). Here's what that looks like as an animation in 2D:

[Animation of Nelder-Mead converging on the Himmelblau function]
Source: https://commons.wikimedia.org/wiki/File:Nelder-Mead_Himmelblau.gif

Nelder-Mead needs a function that accepts the parameter values and returns a score; the algorithm works by driving that score as low as it can (that's what "optimization" really means here). The choice of function and score is up to you, but the obvious approach is to have the function evaluate the model you're trying to fit and score it on the error, so that minimizing the score minimizes the error. You could do something simple or something fancy; I decided to let the score be the sum of the residuals: how far each point is from the line generated from the parameters. The smaller the score, the closer the line is to all of the points. (Familiar linear regression, by the way, minimizes the sum of the squares of the residuals.) Here's the score function:

func(x []float64) float64 {
    sumOfResiduals := 0.0
    m := x[0] // slope
    b := x[1] // intercept
    for _, p := range points {
        actualY := p[1]
        testY := m*p[0] + b
        sumOfResiduals += math.Abs(testY - actualY)
    }
    return sumOfResiduals
}
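The sum of absolute residuals is just one reasonable score. If you wanted the fit to behave like the ordinary least squares regression we compare against below, you could square each residual instead. Here's a minimal sketch of that variant (same shape as the score function above; the variable names are mine):

    func(x []float64) float64 {
        sumOfSquares := 0.0
        m := x[0] // slope
        b := x[1] // intercept
        for _, p := range points {
            residual := m*p[0] + b - p[1]
            sumOfSquares += residual * residual // square instead of math.Abs
        }
        return sumOfSquares
    }

Squaring penalizes big misses much more than small ones, which is part of why least squares fits are more sensitive to outliers than an absolute-value score.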
And here's the full program, including the test dataset with random noise:

package main

import (
    "log"
    "math"

    "gonum.org/v1/gonum/optimize"
)
func main() {
    points := [][2]float64{
        {1, 9.801428},
        {2, 17.762811},
        {3, 20.222147},
        {4, 18.435252},
        {5, 12.570380},
        {6, 20.979064},
        {7, 24.313054},
        {8, 21.307317},
        {9, 26.555673},
        {10, 27.772882},
        {11, 41.202046},
        {12, 44.854088},
        {13, 40.916411},
        {14, 49.013679},
        {15, 37.969996},
        {16, 49.735623},
        {17, 48.259766},
        {18, 50.009173},
        {19, 61.297761},
        {20, 58.333159},
    }
    problem := optimize.Problem{
        Func: func(x []float64) float64 {
            sumOfResiduals := 0.0
            m := x[0] // slope
            b := x[1] // intercept
            for _, p := range points {
                actualY := p[1]
                testY := m*p[0] + b
                sumOfResiduals += math.Abs(testY - actualY)
            }
            return sumOfResiduals
        },
    }
    // Run Nelder-Mead from the starting point {1, 1}. (In newer gonum
    // releases, optimize.Minimize replaces optimize.Local.)
    result, err := optimize.Local(problem, []float64{1, 1}, nil, &optimize.NelderMead{})
    if err != nil {
        log.Fatal(err)
    }
    log.Println("result:", result.X)
}

When you run that, you'll get the following:

result: [2.5546852775670477 7.241390060062128]

Those two values are the optimized slope and intercept parameters.
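If you want more than the solution vector, the returned Result carries extra detail. As I read the gonum API, Result embeds a Location (the solution and its objective value) and a Stats struct, plus a Status; a quick sketch of pulling those out (the field names are my reading of the gonum docs):

    log.Println("solution:", result.X)                  // optimized slope and intercept
    log.Println("objective:", result.F)                 // sum of residuals at the solution
    log.Println("evaluations:", result.FuncEvaluations) // times our score function was called
    log.Println("status:", result.Status)               // why the optimizer stopped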
If you plot the points and a line with the optimized parameters, you get this:

[Plot of the points with the fitted line]

Pretty good fit! Let's see how that compares to ordinary least squares regression, as computed by R.
> lm(y ~ x)

Call:
lm(formula = y ~ x)

Coefficients:
(Intercept)            x
      8.402        2.491
[Plot adding the least squares line in red]
The least squares result is in red. Pretty close too! Finally, let me add the true line in blue, using the slope and intercept values I used to generate the test dataset, 2.38 and 8.29.

[Plot of the points with the Nelder-Mead fit, the least squares fit in red, and the true line in blue]
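Incidentally, you don't need R to get the least squares line; for simple linear regression there's a closed form: slope = sum((x - meanX) * (y - meanY)) / sum((x - meanX)^2), and intercept = meanY - slope * meanX. Here's a small Go sketch of that formula (a helper of my own, not part of the program above):

    // leastSquares returns the closed-form simple linear regression
    // slope and intercept for the same points slice used above.
    func leastSquares(points [][2]float64) (m, b float64) {
        n := float64(len(points))
        var sumX, sumY float64
        for _, p := range points {
            sumX += p[0]
            sumY += p[1]
        }
        meanX, meanY := sumX/n, sumY/n
        var num, den float64
        for _, p := range points {
            num += (p[0] - meanX) * (p[1] - meanY)
            den += (p[0] - meanX) * (p[0] - meanX)
        }
        m = num / den
        return m, meanY - m*meanX
    }

Run on the dataset above, it should reproduce R's coefficients (2.491 and 8.402) up to rounding.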
Note that in my example, Nelder-Mead is mentioned in only one place. That's because the optimize package is abstracted enough that I can easily swap out the actual algorithm.
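Swapping methods should be a one-line change at the call site. A hedged sketch, swapping in gonum's BFGS implementation (gradient-based methods like BFGS want gradient information; as I understand the gonum docs, newer releases can estimate a missing gradient with finite differences, while older ones may require you to set Problem.Grad yourself):

    // Same problem and starting point; only the method changes.
    result, err := optimize.Local(problem, []float64{1, 1}, nil, &optimize.BFGS{})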
Overall, I think the optimize package is very well organized and easy to work with, and I look forward to trying out more of the gonum collection!