Optimizing for happiness

Let $f(x)$ be a convex mood function, with happiness at its minimum. Given mood swings, $f(x)$ is not necessarily differentiable, and although it may be steep in places, we assume it is continuous.

Then, we wish to solve the minimization problem
\[
\min_x f(x)
\]

We propose to use a subgradient method. The intuition is that you may pick a direction for your mood to change. Any subgradient at your current point $x$ points uphill along the curve (less happy), so we take a step in the opposite direction. There are a variety of ways to select a step size, e.g. exact line search or backtracking. Without loss of generality and for simplicity, we use a constant step size.

Bottom line: If you just smile a little more, you’ll end up happier.

Subgradient method for convergence on happiness

given a starting point $x \in \text{dom } f$
given an error parameter $\epsilon \ll 1$
$\text{stop} = \text{false}$
while not $\text{stop}$ do

  • pick any $g \in \partial f(x)$ and set $\Delta x \leftarrow -g$ % Select a subgradient
  • $t \leftarrow \epsilon$ % Constant step size
  • if $f(x + t\Delta x) \geq f(x)$ and $|t \Delta x| \leq \epsilon$ then $\text{stop} = \text{true}$ % No happier point within reach
  • $x \leftarrow \arg\min_{y \in \{x,\, x + t\Delta x\}} f(y)$ % Update: keep the happier point

end while
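For the computationally inclined, here is a minimal Python sketch of the loop above. The mood function $f(x) = |x - 3|$, its subgradient, and the starting point are all made up for illustration; happiness is assumed to sit at $x = 3$.

# Sketch of the subgradient method above for a hypothetical
# mood function f(x) = |x - 3|, whose happiest point is x = 3.

def f(x):
    return abs(x - 3)              # convex, non-differentiable at x = 3

def subgradient(x):
    # Any element of the subdifferential of f at x.
    if x > 3:
        return 1.0
    if x < 3:
        return -1.0
    return 0.0                     # at x = 3, anything in [-1, 1] is valid

def optimize_happiness(x, eps=1e-3, max_iters=100_000):
    t = eps                        # constant step size
    for _ in range(max_iters):
        dx = -subgradient(x)       # step opposite to a subgradient
        x_new = x + t * dx
        if f(x_new) >= f(x) and abs(t * dx) <= eps:
            break                  # no happier point within reach
        if f(x_new) < f(x):
            x = x_new              # keep the happier point
    return x, f(x)

print(optimize_happiness(x=10.0))  # ends within eps of (3, 0)

Note that with a constant step size, the subgradient method is only guaranteed to reach a small neighborhood of the optimum, which fits the bottom line: a little more smiling makes you happier, if not perfectly happy.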

(Thanks, Pranjal, for the inspiration.)
