Terahertz time domain spectroscopy data processing: analysing uncertainties to push boundaries
Open archive: conference paper
International audience. Terahertz spectroscopy provides information on the motion of charges in a sample on a picosecond scale. To recover this information from terahertz time-domain spectroscopy (THz-TDS), one usually extracts the experimental refractive index and then fits the resulting curves. This approach suffers from several limitations, among them the difficulty of comparing models of motion, of providing the error bars associated with the extracted magnitudes, and a resolution limit arising from the Fourier criterion of the fast Fourier transform. By adopting a Bayesian framework that takes the experimental uncertainties into account and fits the time-domain trace directly, we overcame these limitations. However, when correlated and epistemic uncertainties/noise are present, the algorithm treats their distribution as part of the data to fit and can mistake it for real physical features; it therefore offers poor discrimination between good and bad models. After a thorough analysis of the experimental noise, we developed preprocessing software that removes epistemic noise from the time traces and provides an estimate of the noise correlation matrix (a generalization of the standard deviation). This allows the error function of the fit to be properly weighted by these uncertainties, and therefore the derivation of the Akaike information criterion, a metric that identifies the most probable model within a set of candidate models. In addition, working in the time domain avoids the Fourier resolution criterion, so we could obtain information on experimental lines down to 30 MHz with a commercial THz-TDS system.
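The core ideas above (weighting the error function by the inverse noise correlation matrix and comparing models via the Akaike information criterion) can be illustrated with a minimal sketch. Everything here is hypothetical: the damped-oscillation model, the exponential noise covariance, and the parameter values merely stand in for a real THz-TDS trace and the experimentally estimated correlation matrix; constant likelihood terms shared by all models on the same data are dropped from the AIC.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic time axis (ps) and a toy damped-oscillation response,
# standing in for a physical model of charge motion.
n = 200
t = np.linspace(0.0, 10.0, n)

def model(t, params):
    """Toy model: exponentially damped cosine (illustrative only)."""
    amp, freq, tau = params
    return amp * np.exp(-t / tau) * np.cos(2.0 * np.pi * freq * t)

true_params = (1.0, 0.5, 3.0)

# Correlated noise with an assumed exponential covariance, playing the
# role of the noise correlation matrix estimated by the preprocessing.
cov = 0.01 * np.exp(-np.abs(t[:, None] - t[None, :]) / 0.5)
data = model(t, true_params) + rng.multivariate_normal(np.zeros(n), cov)

cov_inv = np.linalg.inv(cov)

def chi2(params):
    """Error function weighted by the inverse noise covariance."""
    r = data - model(t, params)
    return r @ cov_inv @ r

def aic(params):
    """AIC for a Gaussian likelihood, up to a model-independent constant:
    chi^2 plus twice the number of free parameters."""
    return chi2(params) + 2 * len(params)

a_true = aic(true_params)          # candidate model close to the truth
a_null = aic((0.0, 0.5, 3.0))      # degenerate "no signal" candidate
```

Comparing `a_true` and `a_null` reproduces the model-selection step of the abstract: the candidate with the lower AIC is the more probable one, with the covariance weighting ensuring that correlated noise is not mistaken for a physical feature.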