
Worked examples

Three self-contained programs live under examples/; each runs with go run ./examples/<name>. Read them top to bottom and you’ve seen almost every feature of the library used in anger.

1. Shift scheduling

Cover seven days of nurse demand at minimum total wage cost, choosing how many full-time and part-time nurses to staff on each day. Full shifts are 8 hours at $280; part shifts are 4 hours at $160. Saturday and Sunday have a labour-rules cap on part-time staffing.
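A quick per-hour comparison of those wages makes the solver’s preferences predictable. A stdlib-only sketch (the dollar figures are the ones above):

```go
package main

import "fmt"

func main() {
	// Per-nurse-hour cost of each shift type, from the wages above.
	fullPerHour := 280.0 / 8 // $35 per nurse-hour
	partPerHour := 160.0 / 4 // $40 per nurse-hour
	fmt.Printf("full $%.0f/h, part $%.0f/h\n", fullPerHour, partPerHour)
}
```

Full shifts deliver the cheaper nurse-hour, so expect the optimum to lean on them and reach for part shifts only where a cap or the demand pattern makes it worthwhile.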

The shape of the model is “two variables per day, one coverage constraint per day, two cap constraints” — 14 variables, 9 constraints. Run it:

go run ./examples/scheduling

And the core of main.go:

prob := grove.NewProblem("nurse_schedule", grove.Minimize)

full := make([]*grove.Var, len(days))
part := make([]*grove.Var, len(days))
for i, d := range days {
    full[i] = prob.NewVar("full_"+d, grove.Continuous, grove.Bounds(0, grove.Inf))
    part[i] = prob.NewVar("part_"+d, grove.Continuous, grove.Bounds(0, grove.Inf))
}

obj := grove.Expr{}
for i := range days {
    obj[full[i]] = fullCost  // $280 per full shift
    obj[part[i]] = partCost  // $160 per part shift
}
prob.SetObjective(obj)

for i, d := range days {
    prob.AddConstraint("cover_"+d,
        grove.Expr{full[i]: fullHours, part[i]: partHours},
        grove.GTE, demand[i])
}
prob.AddConstraint("part_cap_sat", grove.Expr{part[5]: 1}, grove.LTE, 4)
prob.AddConstraint("part_cap_sun", grove.Expr{part[6]: 1}, grove.LTE, 4)

The output is the total cost, the staffing schedule, and a shadow price per day; those duals tell you what another nurse-hour of demand would cost, which is the number to quote to your capacity planner.

Things to try: change part_cap_sat to grove.LTE, 0 and watch the weekend cost jump; set a demand value high enough that Infeasible fires; turn on prob.Verbose = true and watch the simplex pivot through Phase II.

2. VM resource allocation

The LP relaxation of a multi-dimensional knapsack. Given a fixed CPU/RAM/disk budget and a set of workloads with per-workload resource demands and per-workload business value, choose a fraction of each workload to schedule to maximise total value.

go run ./examples/allocation

The model is small and mechanical — one variable per workload, one constraint per resource dimension:

prob := grove.NewProblem("vm_allocation", grove.Maximize)

frac := make([]*grove.Var, len(workloads))
for i, w := range workloads {
    frac[i] = prob.NewVar(w.name, grove.Continuous, grove.Bounds(0, 1))
}

obj := grove.Expr{}
for i, w := range workloads {
    obj[frac[i]] = w.value
}
prob.SetObjective(obj)

cpuExpr, ramExpr, diskExpr := grove.Expr{}, grove.Expr{}, grove.Expr{}
for i, w := range workloads {
    cpuExpr[frac[i]]  = w.cpu
    ramExpr[frac[i]]  = w.ram
    diskExpr[frac[i]] = w.disk
}
prob.AddConstraint("cpu",  cpuExpr,  grove.LTE, cluster.cpu)
prob.AddConstraint("ram",  ramExpr,  grove.LTE, cluster.ram)
prob.AddConstraint("disk", diskExpr, grove.LTE, cluster.disk)

The real point of this model is the sensitivity output: the dual on the binding resource is exactly what an extra unit of that resource is worth in dollars. If CPU is binding and its dual is $4.50 per vCPU, then buying vCPU at any price below $4.50 is a good deal; above that, your next dollar is better spent on RAM or disk.
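That purchase rule is mechanical enough to sketch. A stdlib-only illustration (all the duals and prices here are invented, not solver output; in practice the duals would come from the sensitivity report):

```go
package main

import "fmt"

// bestBuy returns the resource whose dual exceeds its market price by the
// largest margin — i.e. the capacity purchase with the best marginal payoff.
// An empty name means no resource is worth buying at current prices.
func bestBuy(duals, prices map[string]float64) (string, float64) {
	best, margin := "", 0.0
	for r, d := range duals {
		if m := d - prices[r]; m > margin {
			best, margin = r, m
		}
	}
	return best, margin
}

func main() {
	duals := map[string]float64{"cpu": 4.50, "ram": 0.80, "disk": 0.00}
	prices := map[string]float64{"cpu": 3.00, "ram": 1.20, "disk": 0.05}
	r, m := bestBuy(duals, prices)
	fmt.Printf("%s (+$%.2f per unit)\n", r, m)
}
```

Here only CPU’s dual beats its price, so the sketch picks cpu with a $1.50 margin per unit; re-solve after each purchase, since duals move as constraints stop binding.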

Because the variables are Continuous, the model permits fractional workloads. In reality you want whole ones, which is the MIP version — that lands in v0.4 (see Integer variables). Until then the LP is a useful upper bound and priority signal.
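One cheap way to use the LP as a bound in the meantime, sketched with made-up numbers: round every fractional workload down (feasible here, because each constraint is a ≤ with non-negative coefficients) and compare against the LP value. The true integer optimum is sandwiched between the two:

```go
package main

import (
	"fmt"
	"math"
)

func main() {
	// Hypothetical LP solution: scheduled fraction and value of each workload.
	frac := []float64{1.0, 0.6, 1.0, 0.25}
	value := []float64{90, 40, 70, 20}

	lpVal, intVal := 0.0, 0.0
	for i, f := range frac {
		lpVal += f * value[i]              // LP optimum: an upper bound
		intVal += math.Floor(f) * value[i] // keep only fully scheduled workloads
	}
	fmt.Printf("LP bound %.0f, rounded-down feasible value %.0f, gap ≤ %.0f\n",
		lpVal, intVal, lpVal-intVal)
}
```

With these numbers the bound is 189 against a feasible 160, so whatever the MIP would find, you already know it within that gap.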

3. Stigler diet

The original LP. In 1945 the economist George Stigler asked “what is the cheapest combination of foods that meets a human’s daily nutritional needs?” and worked out the answer by hand. His paper is widely credited as one of the LPs that motivated Dantzig to invent the simplex method.

go run ./examples/diet

grove’s reproduction uses ten modern foods, five nutrient requirements (minimum calories, protein, fat, carbs; maximum sodium), and a cost objective. Each variable is grams of a given food, bounded to a realistic daily range.

prob := grove.NewProblem("diet", grove.Minimize)

x := make([]*grove.Var, len(foods))
for i, f := range foods {
    x[i] = prob.NewVar(f.name, grove.Continuous, grove.Bounds(0, 2000))
}

obj := grove.Expr{}
for i, f := range foods {
    obj[x[i]] = f.costPerKG / 1000.0 // per gram
}
prob.SetObjective(obj)

prob.AddConstraint("calories_min", caloriesExpr, grove.GTE, 2000)
prob.AddConstraint("protein_min",  proteinExpr,  grove.GTE, 50)
prob.AddConstraint("fat_min",      fatExpr,      grove.GTE, 30)
prob.AddConstraint("carbs_min",    carbsExpr,    grove.GTE, 200)
prob.AddConstraint("sodium_max",   sodiumExpr,   grove.LTE, 2300)

res, err := prob.Solve()
if err != nil {
    log.Fatal(err)
}
fmt.Println(grove.SensitivityReport(prob, res))

Running it faithfully reproduces the shape of Stigler’s historic result (modulo modern food prices): one or two ingredients do almost all the work, and the rest of the basket is pushed to its zero bound. The report at the end names the binding nutritional constraints and gives the shadow price of each: how much cheaper the diet would get per extra allowed gram of sodium, or per relaxed calorie floor.
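The first-order reading of any such shadow price is the same arithmetic: Δcost ≈ dual × Δrhs, valid only within the ranges the report prints. A made-up illustration, assuming duals are signed so that this product gives the cost change directly (the dual value here is invented, not the example’s actual output):

```go
package main

import "fmt"

func main() {
	// Invented dual on sodium_max: relaxing a binding ≤ constraint in a
	// minimisation weakly lowers cost, so the dual is negative under this
	// sign convention.
	sodiumDual := -0.0008 // $/day per extra mg of sodium allowed
	deltaRHS := 100.0     // raise the cap from 2300 to 2400 mg

	fmt.Printf("estimated cost change: %+.2f $/day\n", sodiumDual*deltaRHS)
}
```

A negative estimate means the diet gets cheaper; past the report’s allowable range the basis changes and the dual, and this estimate, stop being valid.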

Reading the output of any of them

All three examples end the same way: solve, print res.Status and res.Objective, then dump grove.SensitivityReport. That report tells you which constraints are binding, what one more unit of each would buy you, and which variables have non-zero reduced cost. It’s the single most useful thing to print when a model ships — more than the primal values themselves.

The full programs live in the repo: