The total variation of a function measures the cumulative change of its values over the domain. As an image prior it controls both the size of jumps and the geometry of boundaries: minimizing the total variation of a modified image keeps it close to the original while removing unwanted detail, smoothing away noise in flat regions even at low signal-to-noise ratios, whilst preserving geometric features such as edges.
If $u$ is defined on an interval $[a,b]$, its total variation is the supremum of $\sum_i |u(x_{i+1}) - u(x_i)|$ over all partitions $a = x_0 < x_1 < \dots < x_m = b$; in particular, if $u$ is monotonic in $[a,b]$, the sum telescopes and $TV(u) = |u(b) - u(a)|$. For $n > 1$ variables, the total variation of $u$ on an open set $\Omega \subset \mathbb{R}^n$ is

$$TV(u) = \sup \left\{ \int_\Omega u \, \operatorname{div} \varphi \, dx \;:\; \varphi \in C_c^1(\Omega, \mathbb{R}^n), \; \|\varphi\|_\infty \leq 1 \right\}.$$
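The one-dimensional case is easy to check numerically. The sketch below (the function name `total_variation_1d` and the sample signals are illustrative choices, not from the text) computes the discrete total variation as the sum of absolute successive differences:

```python
import numpy as np

def total_variation_1d(u):
    """Discrete total variation: sum of absolute successive differences."""
    u = np.asarray(u, dtype=float)
    return np.sum(np.abs(np.diff(u)))

x = np.linspace(0.0, 1.0, 101)

# Monotonic signal: the sum of |differences| telescopes to |u(b) - u(a)|.
u_mono = x**2
# Oscillating signal: nearly the same endpoints, much larger total variation.
u_osc = np.sin(4 * np.pi * x)
```

For `u_mono` the discrete TV equals $|u(1) - u(0)| = 1$, while for `u_osc` it is close to 8 (two full periods), even though the endpoint values nearly coincide.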
Total variation can be used as the regularization term in the following variational model proposed by @rudin1992nonlinear:

$$\min_u \int_\Omega |\nabla u| \, dx + \lambda \int_\Omega (u - f)^2 \, dx,$$
where the first term is the regularization and the second term (data fidelity) is weighted by a regularization parameter $\lambda$ that controls how closely the resulting function $u$ must match the input in the optimization process: the larger $\lambda$, the less detail is removed. The data-fidelity term measures the distance between $u$ and the observed image $f$ using some norm, typically $L^1$ or $L^2$. By differentiating this functional with respect to $u$, we can derive a corresponding Euler-Lagrange equation that can be numerically integrated using the original signal $f$ as the initial condition. Depending on whether the $L^1$ or the $L^2$ norm is used, the problem consists of minimizing one of the following energy functionals:

$$E_1(u) = \int_\Omega |\nabla u| \, dx + \lambda \int_\Omega |Ku - f| \, dx,$$

$$E_2(u) = \int_\Omega |\nabla u| \, dx + \lambda \int_\Omega (Ku - f)^2 \, dx,$$
where the image $f$ is modeled as the sought image $u$ convolved with a kernel $K$ representing the point spread function of the optical system, plus additive white noise $n$, that is, $f = Ku + n$.
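The Euler-Lagrange integration described above can be sketched for the denoising case. In the code below (a minimal illustration, not a production solver; the function name, the choice $K = \mathrm{identity}$, the smoothing parameter `eps`, and all default values are assumptions) the flow $\partial u / \partial t = \operatorname{div}(\nabla u / \sqrt{|\nabla u|^2 + \varepsilon^2}) - 2\lambda(u - f)$ is integrated explicitly, starting from $u = f$:

```python
import numpy as np

def tv_l2_gradient_flow(f, lam=1.0, dt=0.015, eps=0.1, n_iter=800):
    """Explicit gradient descent on the TV-L2 energy with K = identity
    (pure denoising). eps smooths the non-differentiable TV term,
    lam weights the data-fidelity term, and u = f is the initial condition.
    """
    u = f.astype(float).copy()
    for _ in range(n_iter):
        # Forward differences with replicated (Neumann) boundary.
        ux = np.diff(u, axis=1, append=u[:, -1:])
        uy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(ux**2 + uy**2 + eps**2)      # smoothed gradient magnitude
        px, py = ux / mag, uy / mag
        # Divergence via backward differences of the normalized gradient field.
        div = np.diff(px, axis=1, prepend=0.0) + np.diff(py, axis=0, prepend=0.0)
        u = u + dt * (div - 2.0 * lam * (u - f))
    return u
```

Applied to a noisy step image, this flow smooths the flat regions while largely keeping the edge, which is the qualitative behavior the model is designed for.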
For $E_2$ the model is convex with a unique global minimizer, but the same is not true for $E_1$: it remains convex, yet not strictly, so its minimizers need not be unique. However, the TV-L1 model introduces some surprising and very useful features, such as:
In the discrete version of this method, the total variation based on the $L^2$ norm of the discrete gradient is isotropic but not differentiable at zero:

$$TV_{\mathrm{iso}}(u) = \sum_{i,j} \sqrt{(u_{i+1,j} - u_{i,j})^2 + (u_{i,j+1} - u_{i,j})^2},$$
or the anisotropic version based on the $L^1$ norm, which may sometimes be easier to minimize:

$$TV_{\mathrm{aniso}}(u) = \sum_{i,j} |u_{i+1,j} - u_{i,j}| + |u_{i,j+1} - u_{i,j}|.$$
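The two discrete forms are easy to compare numerically. In this sketch (forward differences with replicated boundaries; the function names are illustrative) the anisotropic value is never smaller than the isotropic one, since $|a| + |b| \geq \sqrt{a^2 + b^2}$, with equality when only one of the two differences is nonzero at each pixel:

```python
import numpy as np

def tv_isotropic(u):
    """Isotropic discrete TV: Euclidean norm of the forward-difference gradient."""
    ux = np.diff(u, axis=1, append=u[:, -1:])   # zero difference at last column
    uy = np.diff(u, axis=0, append=u[-1:, :])   # zero difference at last row
    return np.sum(np.sqrt(ux**2 + uy**2))

def tv_anisotropic(u):
    """Anisotropic discrete TV: L1 norm, separable in the two directions."""
    ux = np.diff(u, axis=1, append=u[:, -1:])
    uy = np.diff(u, axis=0, append=u[-1:, :])
    return np.sum(np.abs(ux) + np.abs(uy))

# On an axis-aligned edge both versions agree...
step = np.zeros((4, 4))
step[:, 2:] = 1.0
# ...but where horizontal and vertical differences overlap (e.g. an isolated
# pixel), the anisotropic value is strictly larger.
dot = np.zeros((3, 3))
dot[1, 1] = 1.0
```

For `step` both functions return 4; for `dot` the isotropic TV is $2 + \sqrt{2}$ while the anisotropic TV is 4.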
The same minimization strategy can be used to solve several kinds of problems, such as volumetric surface adjustment, disparity refinement, image denoising and image restoration.
There are several solvers for the total variation problem:
a) Euler-Lagrange methods, where an augmented Lagrangian method handles the constraints and an alternating direction method iteratively solves the resulting subproblems; b) methods based on the Laplace-Beltrami differential operator, defined as the divergence of the gradient (like the Laplacian): a linear operator taking functions to functions, which can be extended to tensors as the divergence of the covariant derivative.
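The augmented-Lagrangian/ADMM strategy in a) can be illustrated on the 1-D TV-L2 problem $\min_u \; \lambda \|Du\|_1 + \tfrac{1}{2}\|u - f\|^2$, with $D$ the forward-difference matrix. This is a minimal dense-matrix sketch under assumed choices (the splitting $z = Du$, the penalty $\rho$, and the function names are illustrative): the $u$-subproblem is a linear solve, the $z$-subproblem is soft thresholding, and the scaled dual variable $w$ is updated by the constraint residual.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * |.|: shrink each entry toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def tv_denoise_admm(f, lam=0.5, rho=1.0, n_iter=300):
    """ADMM for min_u lam * ||D u||_1 + 0.5 * ||u - f||^2, splitting z = D u."""
    n = len(f)
    D = np.diff(np.eye(n), axis=0)          # forward differences, (n-1) x n
    A = np.eye(n) + rho * D.T @ D           # fixed u-update system matrix
    z = np.zeros(n - 1)
    w = np.zeros(n - 1)                     # scaled dual variable
    u = np.asarray(f, dtype=float).copy()
    for _ in range(n_iter):
        # u-update: quadratic subproblem from the augmented Lagrangian.
        u = np.linalg.solve(A, f + rho * D.T @ (z - w))
        # z-update: soft thresholding (prox of the L1 term).
        z = soft_threshold(D @ u + w, lam / rho)
        # Dual ascent on the constraint D u = z.
        w = w + D @ u - z
    return u
```

On a noisy piecewise-constant signal this produces a nearly piecewise-constant estimate: the L1 term on $Du$ drives most successive differences exactly to zero while keeping the large jump.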