Inverse problems
Uncertainty and errors in mapping data to model
The mapping from data to model encapsulates the core of inverse problems. Within the framework of first-order perturbation theory, the partial derivatives of misfit functions with respect to model parameters are computed from a large number of solutions to the forward problem, which poses a formidable computational task.
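In standard notation (ours, not drawn from the project itself), the linearization behind this statement relates a perturbation of a misfit function $\chi$ to relative model perturbations through sensitivity (Fréchet) kernels $K_m$:

    \delta\chi = \int_\oplus K_m(\mathbf{x})\,\delta\ln m(\mathbf{x})\,\mathrm{d}^3\mathbf{x},
    \qquad
    K_m(\mathbf{x}) = \frac{\partial\chi}{\partial\ln m(\mathbf{x})},

so that evaluating $K_m$ throughout the Earth requires wavefield solutions for both the source and the receiver side of every source-receiver pair, hence the computational burden.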
Using AXISEM, we have developed an efficient technique to deliver time- and frequency-dependent waveform kernels that represent the raw mapping between seismogram and structure. We compute these kernels from raw Green tensors in the frequency domain, allowing for flexible a posteriori adjustments such as filtering, misfit selection, model parameterization, and time windowing. In particular, this allows diffracted waves, and generally any portion of a seismogram, to be fed into large-scale multi-frequency tomography at any desired seismic frequency.
This is computationally feasible only because we exploit the dimensional reduction afforded by the axisymmetric spectral-element method; the crux of constructing these 3-D kernels thus lies simply in rotating and convolving the generic, once-and-for-all 2-D wavefields for a given source-receiver pair. Such time-dependent kernels describe the instantaneous 3-D region of perturbations, with respect to a reference model, that a particular instant within a seismogram may “feel”.
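The following numpy sketch illustrates this workflow in deliberately stripped-down form. All array names, grid sizes, and the filter are hypothetical placeholders, random numbers stand in for the precomputed 2-D wavefields, and the rotations and tensor convolutions of the actual AXISEM machinery are omitted; only the a posteriori nature of the filtering and windowing is the point here:

    import numpy as np

    # Hypothetical inputs: frequency-domain wavefields from two precomputed
    # 2-D runs (source side and receiver side), assumed already rotated and
    # interpolated onto n_pts points of a 3-D grid.  Random numbers stand
    # in for the real fields.
    n_freq, n_pts, dt = 512, 10_000, 0.5
    rng = np.random.default_rng(0)
    field_src = rng.standard_normal((n_freq, n_pts)) + 1j * rng.standard_normal((n_freq, n_pts))
    field_rec = rng.standard_normal((n_freq, n_pts)) + 1j * rng.standard_normal((n_freq, n_pts))

    # Raw frequency-domain waveform kernel: pointwise interaction of the
    # two fields, a scalar stand-in for the full convolution of Green tensors.
    kernel_f = np.conj(field_src) * field_rec

    # A posteriori choices, applied only now, with no new forward solution:
    # here a Gaussian band-pass around 50 mHz.
    freqs = np.fft.rfftfreq(2 * (n_freq - 1), d=dt)
    bandpass = np.exp(-((freqs - 0.05) / 0.02) ** 2)
    kernel_t = np.fft.irfft(kernel_f * bandpass[:, None], axis=0)

    # Time windowing: integrating the time-dependent kernel over the
    # samples covering one phase yields the banded kernel for that phase.
    i0, i1 = 400, 480                      # window limits in samples (hypothetical)
    K_phase = kernel_t[i0:i1].sum(axis=0) * dt
    print(K_phase.shape)                   # one sensitivity value per grid point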
We have applied the method to constrain the dependence of time-dependent 3-D seismic sensitivity on the choice of misfit function, source radiation pattern, frequency range, earthquake depth, epicentral distance, and azimuth, and have separately addressed the sensitivity of the source- and receiver-side regions as well as numerical errors and inaccuracies in selecting time windows. The effect of these choices on the data-to-model mapping is significant, in many cases larger than the effect of different background models.
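To make the role of the misfit choice concrete, here is a toy comparison (entirely our own construction, not code from the project) of two common misfit definitions applied to the same windowed traces; each definition extracts a different datum from the seismogram and hence induces a different data-to-model mapping:

    import numpy as np

    def l2_waveform_misfit(obs, syn):
        """Least-squares waveform misfit over a picked window."""
        return 0.5 * np.sum((obs - syn) ** 2)

    def cc_traveltime_shift(obs, syn, dt):
        """Cross-correlation traveltime anomaly: the lag maximizing correlation."""
        cc = np.correlate(obs, syn, mode="full")
        lag = cc.argmax() - (len(syn) - 1)
        return lag * dt

    # Toy traces: a synthetic arrival and a delayed, slightly damped observation.
    dt = 0.1
    t = np.arange(0, 60, dt)
    syn = np.exp(-((t - 30.0) / 2) ** 2)
    obs = 0.9 * np.exp(-((t - 30.8) / 2) ** 2)
    print(l2_waveform_misfit(obs, syn))      # waveform misfit (sensitive to amplitude)
    print(cc_traveltime_shift(obs, syn, dt)) # ~0.8 s traveltime anomaly (phase only)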
In an attempt to quantify the effect of large heterogeneities on waveforms and tomographic imaging, we compute synthetic seismograms for models of the core-mantle boundary region under- and overlain by a spherically symmetric Earth. The statistics of misfit anomalies are computed against the reference spherically symmetric Earth model. We analyze the impact of three-dimensional propagation on these statistics, and the range of validity of the Born approximation when trying to recover the structure and the amplitude of the various anomalies.
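A minimal sketch of such a statistical comparison, with random placeholders in lieu of measured anomalies (the variable names, values, and tolerance are ours, not the project's):

    import numpy as np

    # Placeholder arrays: misfit anomalies measured on full 3-D synthetics
    # (dT_full) and predicted by single-scattering Born kernels (dT_born)
    # for the same source-receiver pairs; real values would come from the
    # wavefield computations described above.
    rng = np.random.default_rng(1)
    dT_full = rng.normal(0.0, 1.5, size=2000)                          # seconds
    dT_born = dT_full + 0.1 * dT_full**2 * rng.standard_normal(2000)   # toy nonlinearity

    # Statistics of the anomaly distribution and of the Born error: where
    # the normalized discrepancy grows beyond tolerance, the linearized
    # mapping no longer recovers structure and amplitude reliably.
    print(f"anomalies: mean={dT_full.mean():.2f} s, std={dT_full.std():.2f} s")
    born_err = np.abs(dT_full - dT_born) / np.maximum(np.abs(dT_full), 1e-3)
    print(f"median relative Born error: {np.median(born_err):.2%}")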
Optimizing tomographic data selection & processing
(T. Nissen-Meyer, A. Fournier)

To enhance the robustness and resolution of imaging capabilities for a given region of interest, we strive to quantify and automate some of the data-selection and processing tasks for large-scale tomographic inversions by analyzing the nature and parameter dependencies of seismic sensitivity kernels. This is achieved by formulating an optimization problem for a given 3-D region, and computing spatio-temporal seismic sensitivity kernels for the seismograms of a given earthquake as a function of source frequency and depth, radiation pattern, receiver component, epicentral distance, azimuth, time windowing and filtering of the seismogram, misfit parameters (e.g. traveltime versus waveforms), and model parameterization (wavespeeds versus elastic moduli).
The optimization problem is explicitly multi-modal, in that we strive to find all possible extrema of the sensitivity within this multi-dimensional parameter space; as such, it can only be tackled via brute-force or Bayesian approaches. However, our method's capability to consider all these free parameters independently, after the computation of the respective forward solution, can be used to construct a preliminary region-of-interest assessment of optimal data selection: given a location of geophysical interest and sufficient data coverage, we readily supply a set of, e.g., phases, frequency ranges, epicentral distances, and receiver components that maximizes sensitivity and resolution and may then be used in a tomographic inversion. In light of the immense data availability (e.g. USArray), this may serve as a valuable tool to efficiently select and process the data offering the best possible resolution, or to predict optimal illumination strategies for future array or ocean deployments.
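As an illustration of the brute-force end of this search, the following sketch (all parameter values and the scoring function are hypothetical stand-ins) enumerates a small discretized parameter space and ranks every combination by a region-of-interest sensitivity score, retaining all extrema rather than a single optimum:

    import itertools
    import numpy as np

    # Hypothetical discretized parameter space; in practice each axis is
    # much denser and the scoring below is replaced by integrating the
    # precomputed kernel over the 3-D region of interest.
    phases = ["P", "Pdiff", "S", "ScS"]
    bands_hz = [0.01, 0.03, 0.1]
    distances_deg = [60, 80, 100, 120]
    components = ["Z", "R", "T"]

    def roi_sensitivity(phase, band, dist, comp):
        """Stand-in score: would be the integral of |K| over the region of
        interest for one parameter combination, sliced from the stored
        frequency-domain kernels.  Here: a random but per-combination
        reproducible number."""
        rng = np.random.default_rng(hash((phase, band, dist, comp)) % 2**32)
        return rng.random()

    # Brute-force enumeration; rank all combinations by score so that every
    # extremum of this multi-modal problem is retained, not just one.
    scored = sorted(
        itertools.product(phases, bands_hz, distances_deg, components),
        key=lambda c: roi_sensitivity(*c),
        reverse=True,
    )
    for combo in scored[:5]:
        print(combo, f"score={roi_sensitivity(*combo):.3f}")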