Research on ECG Signal Compression Algorithm Based on Compressed Sensing
Biometrics uses statistical methods to identify and verify human identity based on an individual's physiological or behavioral characteristics. In this paper, the problem of the limited resources of portable remote cardiac signal recognition …
1 Introduction
With the development of information technology, people pay increasing attention to information security and the protection of personal privacy. The demand for safe and reliable authentication systems is therefore growing, and identity recognition is closely tied to people's daily lives. Biometrics is the science of identifying and verifying human identity by statistical methods based on an individual's physiological or behavioral characteristics. However, biometrics that are extremely convenient to collect are vulnerable to counterfeiting attacks, while more secure biometrics are difficult to acquire, so a biometric that balances security and convenience is needed. Bioelectrical signals, such as ECG signals and pulse signals, originate inside the human body and reflect important physiological information. Compared with common identification modalities such as the fingerprint and face [1], bioelectrical signals are unique and confidential, with an extremely low probability of being forged and high reliability when used for identification. Electrocardiogram (ECG) signals are bioelectrical signals produced by cardiac activity. Identity recognition based on the ECG is a relatively new and rapidly developing method [2]. Compared with the face, fingerprint, iris, and other biometrics, ECG signals offer inherent liveness, convenient acquisition, and resistance to theft, and they are therefore attracting wide attention in the field of identity recognition.
2 Compressed Sensing Theory
Compressed sensing (CS), also known as sparse sampling, was proposed by Donoho et al. around 2004. Unlike the traditional Nyquist sampling scheme, in which redundant information is removed and the signal is compressed only after it has been sampled at a high rate, compressed sensing completes the data compression during sampling itself; the original signal is then restored without distortion by a numerical optimization algorithm, avoiding the waste of resources. According to this theory, a signal that is sparse in some basis can be recovered accurately from a small number of samples obtained by linear projection [3]. The traditional Shannon sampling process and the CS sampling process are shown in Fig. 1.
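In the standard compressed sensing formulation (using the notation of Fig. 1, and with ℓ1 minimization as the usual stand-in for the reconstruction step; the specific reconstruction algorithm used in this paper may differ), the sampling and recovery steps can be summarized as

x = Ψμ (μ is K-sparse),  y = Φx = ΦΨμ,  Φ ∈ R^(M×N) with M ≪ N,

μ̂ = arg min ‖μ‖₁ subject to y = ΦΨμ,  x̂ = Ψμ̂.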
Fig. 1 Traditional Shannon sampling versus CS sampling. Traditional path: the N-dimensional compressible signal x is sampled at high speed into N points, sparsely transformed (x = Ψμ), and K sample points are encoded, stored, and transmitted; the reconstructed signal is obtained by sinc interpolation of the K sample points. CS path: the N-dimensional compressible signal x is sparsely transformed (x = Ψμ) and observed directly as the M-dimensional vector y = ΦΨμ, which is stored and transmitted; the signal is reconstructed by an optimization algorithm.
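As a rough, self-contained illustration of the CS branch in Fig. 1 (and not the specific algorithm evaluated in this paper), the following Python/NumPy sketch generates a K-sparse signal, takes M ≪ N random linear measurements y = Φx (the sparsity basis Ψ is taken as the identity for simplicity), and recovers the signal with a simple orthogonal matching pursuit routine. The dimensions, the Gaussian measurement matrix, and the OMP reconstruction are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: N-dimensional signal, M compressed measurements, K non-zeros.
N, M, K = 256, 64, 8

# K-sparse test signal x (sparse directly in the canonical basis, i.e. Psi = I).
x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)

# Random Gaussian measurement matrix Phi; columns have roughly unit norm.
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# Compressed measurements: sampling and compression happen in one linear step.
y = Phi @ x


def omp(Phi, y, k):
    """Orthogonal matching pursuit: greedily select the column most correlated
    with the residual, then re-fit all selected coefficients by least squares."""
    residual = y.copy()
    selected = []
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in selected:
            selected.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, selected], y, rcond=None)
        residual = y - Phi[:, selected] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[selected] = coef
    return x_hat


x_hat = omp(Phi, y, K)
print("relative reconstruction error:",
      np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```

With these sizes the sparse signal is typically recovered almost exactly from only M = 64 of the N = 256 samples, which is the behavior the CS branch of Fig. 1 relies on.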