Just like Rome with its seven famous hills, my hometown Turku also has seven (maybe not so famous) hills. My friend and I are planning a walk that visits every hill. As we want to avoid unnecessary walking, we are faced with the classical travelling salesman problem: given the locations of the hills, find the shortest route that visits each hill exactly once and returns to the starting hill.

As there are only seven points to visit and the cost function (distance) is simple, this is an easy thing to do just…
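With only seven hills, fixing the start and trying all 6! = 720 orderings of the rest is instant. The sketch below is my own illustration, not code from the post, and the hill names and coordinates are made up:

```python
import itertools
import math

# Hypothetical (x, y) coordinates for the seven hills, in kilometres.
# These are placeholder values, not the real Turku locations.
HILLS = {
    "A": (0.0, 0.0), "B": (1.2, 0.5), "C": (2.0, 2.1), "D": (0.5, 2.5),
    "E": (3.1, 0.8), "F": (1.8, 3.0), "G": (2.7, 1.9),
}

def tour_length(order):
    """Total length of the closed tour visiting hills in the given order."""
    pts = [HILLS[name] for name in order]
    return sum(math.dist(pts[i], pts[(i + 1) % len(pts)])
               for i in range(len(pts)))

def shortest_tour(start="A"):
    """Brute force: fix the start hill, try every permutation of the rest."""
    others = [h for h in HILLS if h != start]
    best = min(itertools.permutations(others),
               key=lambda p: tour_length((start,) + p))
    return (start,) + best, tour_length((start,) + best)

order, length = shortest_tour()
print(order, round(length, 2))
```

Because the tour is closed, fixing the start hill loses nothing, and each tour is also counted once in each direction, so even fewer orderings are truly distinct.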

Python implementation of the powerful Gauss-Newton optimization method. All the code can be found in my GitHub repo.

Here I’m going to present an implementation of the gradient-based Gauss-Newton optimization algorithm. The implementation is done from scratch, using only NumPy as a dependency.

I will not go deep into the theory or mathematics; for that, there are plenty of good sources available, e.g. Wikipedia. Instead, I will try to provide a more intuitive explanation based on a geometric interpretation of the method.

You might already be familiar with the best-known gradient-based optimization method, gradient descent. Gradient descent calculates the derivative (or…
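To give a flavour of the method the post implements: Gauss-Newton fits a nonlinear model by repeatedly linearizing the residuals and solving a least squares step. A minimal one-parameter sketch (my own toy example, fitting y = exp(a·x); with a single parameter the matrix algebra collapses to scalars, so no NumPy is needed here):

```python
import math

def gauss_newton_1d(xs, ys, a0=0.0, iterations=20):
    """Gauss-Newton for the one-parameter model y = exp(a * x)."""
    a = a0
    for _ in range(iterations):
        # Residuals r_i = y_i - f(x_i; a) and Jacobian entries dr_i/da.
        r = [y - math.exp(a * x) for x, y in zip(xs, ys)]
        j = [-x * math.exp(a * x) for x in xs]
        jtj = sum(jj * jj for jj in j)               # J^T J (a scalar here)
        jtr = sum(jj * rr for jj, rr in zip(j, r))   # J^T r (a scalar here)
        a -= jtr / jtj                               # Gauss-Newton step
    return a

# Noise-free synthetic data generated with true parameter a = 0.3.
xs = [0.5 * i for i in range(10)]
ys = [math.exp(0.3 * x) for x in xs]
a_fit = gauss_newton_1d(xs, ys)
print(round(a_fit, 6))  # recovers 0.3
```

With more parameters, the scalar update becomes solving the linear system (JᵀJ)Δ = −Jᵀr, which is where NumPy comes in.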

This blog post describes algorithm(s) to handle shifting peaks in signal separation and curve fitting problems.

All the related code can be found in my GitHub repo.

The basic signal separation problem here can be formulated as follows: given the pure component signals and a mixture signal that is a linear combination of them, one needs to solve for the contributions of the individual components. This problem appears in many different signal processing applications.

In many cases, the problem is easy to solve using the good old classical least squares method. The method works pretty well in many…
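As a toy illustration of the least squares fit (my own sketch with made-up two-component data, not code from the post), the contributions solve the normal equations SᵀS c = Sᵀm, which for two components is a 2×2 system we can solve by hand:

```python
def separate_two(s1, s2, mixture):
    """Least squares contributions c1, c2 with mixture ~ c1*s1 + c2*s2,
    via the 2x2 normal equations S^T S c = S^T m."""
    a11 = sum(a * a for a in s1)
    a22 = sum(b * b for b in s2)
    a12 = sum(a * b for a, b in zip(s1, s2))
    b1 = sum(a * m for a, m in zip(s1, mixture))
    b2 = sum(b * m for b, m in zip(s2, mixture))
    det = a11 * a22 - a12 * a12          # assumes s1, s2 not collinear
    c1 = (b1 * a22 - b2 * a12) / det
    c2 = (a11 * b2 - a12 * b1) / det
    return c1, c2

# Two pure component "peaks" and a mixture with known contributions 2.0 and 0.5.
s1 = [0.0, 1.0, 4.0, 1.0, 0.0, 0.0]
s2 = [0.0, 0.0, 1.0, 3.0, 1.0, 0.0]
mix = [2.0 * a + 0.5 * b for a, b in zip(s1, s2)]
c1, c2 = separate_two(s1, s2, mix)
print(c1, c2)  # → 2.0 0.5
```

This exactness relies on the peaks staying put; once the component peaks shift between measurements, the plain least squares fit degrades, which is the problem the post's algorithms address.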