Master data

Title: An extension of minimization based formulations and the projected gradient method with some applications
Description:

An inverse problem of reconstructing $x$ such that $A(x,u)=0$ from the data $y$ satisfying $C(u)=y$ can be written in the minimization form $$\text{\rmfamily argmin}_{(x,u)} \{ \mathcal{J} (x,u;y): (x,u) \in M_{\text{\rmfamily ad}} (y) \}.$$ The classical approaches usually consider $\mathcal{J} (x,u;y) = \frac{1}{2} \| C(u) - y\|^2$, $M_{\text{\rmfamily ad}} = \{(x,u): A(x,u)=0\}$ or $\mathcal{J} (x,u;y) = \frac{1}{2} \| A(x,u) \|^2$, $M_{\text{\rmfamily ad}} = \{(x,u): C(u)=y\}$. Here we follow a new approach, assuming that the observation operator $C$ can be inverted on its range, as is the case, e.g., in the practically relevant setting of a finite-dimensional observation space. In this case, we can split $u$ into two parts, the observed data part $\tilde u$ and the homogeneous data part $\hat u$, that is, $u = \tilde u + \hat u$ with $C(\tilde u) = y$ and $C(\hat u) = 0$. Let $C^{\text{\rmfamily ri}}$ be a right inverse of $C$; then the problem becomes to find $x$ such that $$A(x, C^{\text{\rmfamily ri}}(y) + \hat u) = 0, \quad \hat u \in \text{\rmfamily Ker} (C) := \{ \hat u: C(\hat u) = 0\},$$ together with the new minimization form $$(x, \hat u) \in \text{\rmfamily argmin}_{(x, \hat u)} \{ \mathcal{J} (x,\hat u;y): (x,\hat u) \in M_{\text{\rmfamily ad}} (y) \}.$$ Under additional conditions on the smoothness of $\mathcal{J}$ with respect to $y$, we can prove the \emph{well-definedness}, \emph{stability} and \emph{convergence} of minimizers.
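To illustrate the splitting in the finite-dimensional linear case, the following Python sketch (an illustrative assumption, not the lecture's code; the operator $C$, the data $y$, and all variable names are made up for the example) uses the Moore-Penrose pseudoinverse as a right inverse $C^{\text{\rmfamily ri}}$ and a null-space basis to parametrize $\text{\rmfamily Ker}(C)$:

import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)
n_u, n_obs = 6, 2                      # state and observation dimensions (illustrative)
C = rng.standard_normal((n_obs, n_u))  # linear observation operator, full row rank a.s.
y = rng.standard_normal(n_obs)         # observed data

C_ri = np.linalg.pinv(C)               # right inverse: C @ C_ri = identity on the range of C
u_tilde = C_ri @ y                     # observed part, C @ u_tilde = y

N = null_space(C)                      # columns of N span Ker(C)
z = rng.standard_normal(N.shape[1])    # free coefficients parametrizing u_hat
u_hat = N @ z                          # homogeneous part, C @ u_hat = 0 up to round-off

u = u_tilde + u_hat                    # any such u satisfies the data equation
assert np.allclose(C @ u, y)

In this linear setting the reconstruction then amounts to determining $x$ and the coefficients of $\hat u$ such that $A(x, \tilde u + \hat u) = 0$.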

In practice, an iterative regularization method is often applied to reconstruct $x$. Here we consider the projected gradient method $$x_{k+1} = \text{\rmfamily Proj}_{M_{\text{\rmfamily ad}}} (x_k - \mu_k \nabla \mathcal{J} (x_k)),$$ where $\text{\rmfamily Proj}_{M_{\text{\rmfamily ad}}}$ is the projection onto $M_{\text{\rmfamily ad}}$ and, for brevity, $x$ stands for $(x,\hat u)$. This method is applicable to a wide range of constrained inverse problems. Here we show numerical results from a Matlab implementation for three examples: inverse groundwater filtration (GWF), impedance acoustic tomography (IAT), and electrical impedance tomography (EIT).
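For orientation, here is a minimal generic Python sketch of this iteration (not the Matlab implementation used for the examples; the constant step size, stopping rule, and the placeholder functions grad_J and proj are illustrative assumptions):

import numpy as np

def projected_gradient(x0, grad_J, proj, mu=0.5, max_iter=200, tol=1e-8):
    """Iterate x_{k+1} = Proj_{M_ad}(x_k - mu * grad J(x_k)) with a constant step size."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = proj(x - mu * grad_J(x))     # gradient step followed by projection
        if np.linalg.norm(x_new - x) < tol:  # simple stopping rule (illustrative)
            return x_new
        x = x_new
    return x

# Toy usage: minimize 0.5*||x - b||^2 over the nonnegative orthant.
b = np.array([1.0, -2.0, 0.5])
x_star = projected_gradient(np.zeros(3),
                            grad_J=lambda x: x - b,              # gradient of the quadratic misfit
                            proj=lambda x: np.maximum(x, 0.0))   # projection onto {x >= 0}
print(x_star)  # approximately [1.0, 0.0, 0.5]

In the setting above, the iterate collects $(x,\hat u)$, grad_J plays the role of $\nabla \mathcal{J}$, and proj realizes $\text{\rmfamily Proj}_{M_{\text{\rmfamily ad}}}$; for the GWF, IAT, and EIT examples these ingredients are problem specific.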


Keywords:
Type: Guest lecture
Homepage: https://www.math.aau.at/talks/64/pdf
Event: Doctoral Seminar in Mathematics (Klagenfurt)
Date: 16.12.2020
lecture status: took place (online)

Participants

Assignment

Organisation Address
Fakultät für Technische Wissenschaften
Institut für Mathematik
Universitätsstraße 65-67
9020 Klagenfurt am Wörthersee
Austria
math@aau.at
https://www.aau.at/mathematik

Categorisation

Subject areas
  • 101014 - Numerical mathematics
  • 101028 - Mathematical modelling
  • 101016 - Optimisation
Research Cluster: No Research Cluster selected
Focus of lecture
  • Science to Science (Quality indicator: III)
Classification raster of the assigned organisational units:
Group of participants
  • Mainly national
Published?
  • No
Working groups: No working group selected

Cooperations

No partner organisations selected