One of the primary tools in non-invasive cardiac electrophysiology is the recording of the electrocardiographic (ECG) signal, whose analysis is highly useful in the screening and diagnosis of cardiovascular diseases. A major difficulty, however, is that the electrical activity of the heart is usually recorded in the presence of noise. This paper presents a Bayesian and an empirical Bayesian approach to the problem of weighted signal averaging in the time domain, which is commonly used to extract a useful signal distorted by noise. Averaging is especially useful for biomedical signals such as the ECG, where the spectra of the signal and the noise overlap significantly. The use of weighted averaging methods is motivated by the cycle-to-cycle variability of noise power often observed in practice. It is demonstrated that exploiting a probabilistic Bayesian learning framework leads to accurate prediction models. Moreover, even in the presence of nuisance parameters, the empirical Bayesian approach offers a method for their automatic estimation, which reduces the number of preset parameters. The performance of the new method is experimentally compared to traditional averaging by the arithmetic mean and to a weighted averaging method based on criterion function minimization.
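To make the weighted-averaging idea concrete, the sketch below shows a minimal baseline scheme in which each aligned ECG cycle is weighted inversely to an estimate of its noise power, so that noisier cycles contribute less to the averaged template. This is only an illustrative reference point, not the paper's Bayesian or empirical Bayesian estimator; the function name, the fixed number of iterations, and the residual-based noise estimate are assumptions made for the example.

```python
import numpy as np

def weighted_average_cycles(cycles, n_iter=10):
    """Weighted time-domain averaging of aligned ECG cycles.

    cycles: array of shape (n_cycles, n_samples), one aligned heartbeat per row.
    Weights are set inversely proportional to each cycle's estimated noise
    power, so noisier cycles contribute less to the averaged template.
    """
    cycles = np.asarray(cycles, dtype=float)
    template = cycles.mean(axis=0)          # arithmetic mean as the initial template
    for _ in range(n_iter):                 # a few fixed-point refinements
        # Estimate per-cycle noise power as the variance of the residual
        # around the current template, then weight by its inverse.
        residual_power = ((cycles - template) ** 2).mean(axis=1)
        weights = 1.0 / np.maximum(residual_power, 1e-12)
        weights /= weights.sum()
        template = weights @ cycles          # weighted average across cycles
    return template
```

In this baseline, a cycle dominated by noise receives a small weight automatically; the Bayesian formulation described in the paper instead treats the weights and noise parameters within a probabilistic model, with the empirical Bayesian variant estimating the nuisance parameters from the data.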
The aim of this paper is to compare the efficiency of various outlier correction methods for ECG signal processing in biometric applications. The main idea is to correct anomalies in individual segments of the ECG waveform rather than skipping a corrupted heartbeat, in order to obtain better statistics. Experiments were performed using a self-collected Lviv Biometric Dataset, which contains over 1400 records for 95 unique persons. The baseline identification accuracy without any correction is around 86%. After applying outlier correction, the accuracy improved to 98% for the autoencoder-based algorithm and to 97.1% for the sliding Euclidean window. Adding an outlier correction stage to the biometric identification process increases processing time (by up to 20%), which, however, is not critical in most use cases.
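The sketch below illustrates one plausible reading of the sliding Euclidean window correction: a window is slid over an aligned heartbeat, the Euclidean distance of each segment to a per-person reference template is computed, and segments whose distance is anomalously large are replaced by the template values so the beat is corrected rather than discarded. The function name, window length, and threshold rule are illustrative assumptions, not the exact procedure used in the paper.

```python
import numpy as np

def correct_outlier_segments(beat, template, win=16, threshold=2.0):
    """Sliding-window Euclidean outlier correction for one ECG heartbeat.

    beat, template: 1D arrays of equal length (an aligned heartbeat and a
    per-person reference template). Non-overlapping windows of `win` samples
    whose Euclidean distance from the template exceeds `threshold` times the
    median window distance are replaced by the template segment.
    """
    beat = np.asarray(beat, dtype=float).copy()
    template = np.asarray(template, dtype=float)
    starts = range(0, len(beat) - win + 1, win)
    # Distance of each window from the corresponding template segment.
    dists = np.array([np.linalg.norm(beat[s:s + win] - template[s:s + win])
                      for s in starts])
    cutoff = threshold * np.median(dists)
    for s, d in zip(starts, dists):
        if d > cutoff:
            # Correct the anomalous segment instead of dropping the beat.
            beat[s:s + win] = template[s:s + win]
    return beat
```

Correcting only the anomalous segments keeps the rest of the heartbeat intact, which is what allows more beats to be retained for identification instead of being rejected outright.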