**1. Introduction**

Huge amounts of ECG data are nowadays collected worldwide, thanks to advances in storage media technology over the last decade. Data compression, as it was understood in the 1990s, is no longer a prerequisite for the operation of long-term Holter recorders or wireless sensors. Nevertheless, the scientific problem of intelligent adaptive coding remains valid [1–5], and, in the context of cardiac-based home care and surveillance, smart solutions have a considerable impact on performance and costs. Because such systems commonly use a wireless link, millions of recording hours are difficult to manage and incur high data-transmission expenses [6–8].

The ECG is sampled at a constant frequency, mainly for convenience, although the full bandwidth of the data stream is used only for a very limited period within the QRS complex (typically 100 ms in duration). In the remaining part of the heartbeat, the discrete time series is significantly oversampled, causing high correlation of neighboring samples unless obscured by noise. A family of short-time decorrelation techniques exploits this feature for signal compression (e.g., [9,10]). The oversampled sections are used as a reference for local measurement of the noise level (e.g., [11]) or as a host for watermark data (e.g., [12]). One class of algorithms performs compressed sensing based on local statistics (e.g., [13–15]) or adaptive sampling based on mutual data dependence (e.g., [16,17]), but without regard for the distribution of medical relevance in the record.

In previous research projects we studied the irregular distribution of medical information in the ECG record using various methods: the local bandwidth of ECG waves [18], the susceptibility of diagnostic results to signal distortion caused by local data loss, and the local conspicuity of the ECG trace [19,20]. Results reported in the latter work include a generalized quantitative estimate of the local temporal distribution of medical relevance in the electrocardiogram, referred to here as the generalized medical relevance function (gMRF). This function is briefly recalled hereafter and proposed as a background for the adaptive ECG sampling technique presented in this paper.

Due to the non-uniform distribution of diagnostic information in the ECG, simple metrics for time-series comparison, such as the signal-to-noise ratio (SNR) or the percentage root-mean-square difference (PRD), do not adequately represent the degradation of diagnostic quality. For the same reason, lossy compression of the ECG is distrusted [3,8,10] and currently not permitted in clinical applications. At the same time, ECG recording techniques with constant sampling frequencies ranging from 125 to 1000 Hz are used depending on the medical goals, providing either a more concise or a more detailed background for diagnostic analyses. This justifies seamless adaptation of the sampling frequency in relation to the patient's status and, as in the proposed method, to the progression of the cardiac cycle.
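To make the PRD metric mentioned above concrete, the following is a minimal sketch of its common definition (the ratio of reconstruction-error energy to signal energy, expressed as a percentage). Note that some authors subtract the signal mean in the denominator (sometimes called PRD1); this plain variant uses the raw signal energy.

```python
import math

def prd(original, reconstructed):
    """Percentage root-mean-square difference between two equal-length series.

    PRD = 100 * sqrt( sum((x - x_hat)^2) / sum(x^2) )
    """
    num = sum((x, y) == (x, y) and (x - y) ** 2 for x, y in zip(original, reconstructed))
    den = sum(x ** 2 for x in original)
    return 100.0 * math.sqrt(num / den)

# toy check: a small distortion on two of four samples
x = [1.0, 2.0, 3.0, 4.0]
x_hat = [1.0, 2.1, 2.9, 4.0]
print(round(prd(x, x_hat), 2))  # ≈ 2.58
```

As the introduction notes, a low PRD does not guarantee diagnostic fidelity: the same error energy concentrated inside the QRS complex is far more harmful than the same energy spread over the oversampled isoelectric sections.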

Different phases of the cardiac cycle can be distinguished in the surface-recorded ECG as a representation of the sequence of cell action potentials in the heart conduction system and myocardium. Due to the different electrical properties of the conducting tissue, the waves representing the progression of the cardiac cycle show different bandwidths as the stimulus travels through the heart. They also show significant beat-to-beat variability in duration, not directly related to the RR interval, even in normal rhythms. In abnormal rhythms, the order of waves in the sequence may also be altered: in ectopic beats the P wave is absent, and in atrial fibrillation a continuous fibrillatory wave is observed instead of the P wave. Therefore, the progression of the cardiac cycle is adequately represented by the time relative to wave borders, which in turn become adequate reference points for estimating medical information density in each individual heartbeat. Several algorithms have been developed for automatic recognition and delineation of ECG waves and proven to perform with precision and accuracy acceptable for medical use. They can roughly be classified as signal derivative-, geometric template-, or semantic sequence-based. The methods by Almeida et al. [21], Martinez et al. [22], and Dumont et al. [23] are based on the discrete wavelet transform (DWT). Other reported delineation methods use the continuous wavelet transform (CWT) [24].
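As a rough illustration of the signal-derivative class of detectors mentioned above, the toy below locates R-peak candidates by thresholding the squared first difference and enforcing a refractory period. The function name, threshold fraction, and refractory length are illustrative assumptions; this is far simpler than the cited wavelet-based delineators, which also locate wave onsets and offsets.

```python
def detect_r_peaks(signal, fs, thresh_frac=0.6, refractory_s=0.25):
    """Toy derivative-based R-peak detector (illustrative only).

    The squared first difference emphasizes the steep QRS slopes; candidates
    above a fraction of its maximum are kept, separated by a refractory period.
    """
    d2 = [(signal[i + 1] - signal[i]) ** 2 for i in range(len(signal) - 1)]
    thresh = thresh_frac * max(d2)
    refractory = int(refractory_s * fs)       # minimum sample distance between peaks
    peaks, last = [], -refractory
    for i, v in enumerate(d2):
        if v >= thresh and i - last >= refractory:
            peaks.append(i)
            last = i
    return peaks

# toy usage: three unit impulses standing in for QRS complexes at fs = 250 Hz
sig = [0.0] * 1000
for k in (100, 400, 700):
    sig[k] = 1.0
print(detect_r_peaks(sig, fs=250))
```

A real delineator must in addition cope with baseline wander, muscle noise, and morphology changes, which is precisely why the DWT/CWT approaches cited above dominate in practice.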

In this paper we propose a method for adaptive sampling of the electrocardiogram. The local sampling interval is driven by the expected conspicuity of the trace (i.e., its medical relevance), calculated for a given progression of the cardiac cycle relative to its beginning. Therefore, a procedure for automated wave-border detection is necessary to control the piecewise projection of the gMRF onto each particular heartbeat. The rest of this paper is organized as follows. In Section 2 the idea and processing scheme are presented, together with a method for transforming ECG trace conspicuity into an estimate of local information density, details on piecewise adaptation of the local relevance function, and ECG signal resampling. In Section 3 the evaluation of the method is reported, with details on the test signal set, error metrics, and experimental results. In Section 4 a discussion is presented together with concluding remarks.
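The core idea of relevance-driven sampling can be sketched as follows, under stated assumptions: a hypothetical relevance function `relevance(t)` in [0, 1] stands in for the per-beat gMRF projection, the local sampling frequency is mapped linearly between illustrative bounds `f_min` and `f_max`, and the input is resampled by simple linear interpolation. None of these choices is the paper's actual method; they merely show how a time-varying step follows the relevance profile.

```python
import math

def adaptive_resample(signal, fs, relevance, f_min=125.0, f_max=500.0):
    """Resample `signal` (uniform at `fs` Hz) with a relevance-driven step.

    `relevance` maps a time instant in seconds to [0, 1]; high relevance
    yields a short sampling interval (dense sampling), low relevance a long one.
    All names and the linear fs mapping are illustrative assumptions.
    """
    duration = (len(signal) - 1) / fs
    times, samples = [], []
    t = 0.0
    while t <= duration:
        # linear interpolation of the uniformly sampled input at time t
        pos = t * fs
        i = min(int(pos), len(signal) - 2)
        frac = pos - i
        samples.append((1.0 - frac) * signal[i] + frac * signal[i + 1])
        times.append(t)
        # local sampling frequency grows with estimated relevance
        r = max(0.0, min(1.0, relevance(t)))
        t += 1.0 / (f_min + (f_max - f_min) * r)
    return times, samples

# toy usage: a mock beat with high relevance around t = 0.5 s (a stand-in "QRS")
sig = [math.sin(2 * math.pi * 5 * n / 500.0) for n in range(501)]
rel = lambda t: math.exp(-((t - 0.5) ** 2) / (2 * 0.05 ** 2))
t_ad, s_ad = adaptive_resample(sig, 500.0, rel)
print(len(t_ad), "adaptive samples instead of", len(sig))
```

The sample count drops because most of the one-second window is taken at the low rate, while the neighborhood of the mock QRS is sampled near `f_max`; the paper's contribution lies in deriving the relevance profile (the gMRF) and anchoring it to detected wave borders rather than to absolute time.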
