Simple OFDM Error Floor Prediction with Sub-optimal Time Sampling



Adriana Lipovac
Department of Electrical Engineering and Computing, University of Dubrovnik, Dubrovnik, Croatia

© Springer Science+Business Media, LLC, part of Springer Nature 2020

Abstract The in-service measurable block-error-rate (BLER) is the only transmission performance indicator intended for wireless network operators to test the physical layer of the long-term evolution (LTE) and the emerging 5G new radio systems, both of which use orthogonal frequency-division multiplexing (OFDM). However, the small BLER values obtained in this way may not be accurate, which compromises estimation of the residual channel that the physical layer delivers to higher-layer protocols and ultimately affects application-layer performance and efficiency. Moreover, this is not the only reason why out-of-service testing of the bit-error-rate (BER), still the ultimate physical-layer performance indicator representing the probability of a bit error, remains indispensable in research, development and equipment manufacturing. Specifically, it is often necessary to estimate the peak OFDM performance, usually referred to as the error floor or residual BER (which can in turn be used for BLER estimation), directly from common and physically interpretable parameters. In this paper, after identifying the optimal sampling instant in a power delay profile, sub-optimal sampling at the first pulse arrival is found to add only minor residual BER degradation, and can therefore be adopted to provide a simple OFDM error floor prediction that is linear in the rms delay spread. The proposed model is verified by Monte Carlo simulations.

Keywords Residual BER · LTE · 5G NR · Time-sampling
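To make the predicted quantities concrete, the following minimal Python sketch (not part of the paper) computes the rms delay spread of a power delay profile and evaluates a generic linear error-floor model of the form BER_floor ≈ k · (τ_rms / T_sym). The coefficient k, the example power delay profile, and the symbol duration are illustrative assumptions only; the paper's derived coefficient and its first-arrival sampling analysis are not reproduced here.

```python
import numpy as np

def rms_delay_spread(delays, powers):
    """Root-mean-square delay spread of a power delay profile (PDP)."""
    tau = np.asarray(delays, dtype=float)
    p = np.asarray(powers, dtype=float)
    p = p / p.sum()                          # normalize PDP to unit total power
    mean_delay = np.sum(p * tau)             # mean excess delay (first moment)
    return np.sqrt(np.sum(p * (tau - mean_delay) ** 2))

def residual_ber_estimate(tau_rms, symbol_duration, slope):
    """Hypothetical linear error-floor model: BER_floor ~ slope * (tau_rms / T_sym).

    'slope' is a placeholder coefficient; the paper's value, derived from the
    sub-optimal (first-arrival) sampling analysis, is not reproduced here.
    """
    return slope * (tau_rms / symbol_duration)

# Example: a simple exponentially decaying PDP (illustrative values only)
delays = np.array([0.0, 0.2e-6, 0.5e-6, 1.0e-6, 2.0e-6])   # seconds
powers = np.exp(-delays / 0.5e-6)                           # relative linear power
tau_rms = rms_delay_spread(delays, powers)

T_sym = 66.7e-6   # OFDM useful symbol duration, LTE-like numerology (assumption)
slope = 1.0       # placeholder coefficient, not taken from the paper
print(f"rms delay spread: {tau_rms * 1e6:.3f} us, "
      f"predicted error floor: {residual_ber_estimate(tau_rms, T_sym, slope):.2e}")
```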

1 Introduction

Although the emerging fifth-generation (5G) wireless systems dramatically improve on the data rates and latency achieved by the mature fourth-generation (4G) wireless networks, the long-term evolution (LTE) in particular, both use orthogonal frequency-division multiplexing (OFDM) [1, 2]. This not only implies that the main OFDM impairments persist [3], but also continues the long-standing evolution of physical-layer performance standards from classical bit-error-oriented ones to block-error-oriented ones [4], which has finally dismissed the bit-error-rate (BER) entirely from
specifications, retaining just the block-error-rate (BLER) as the sole physical-layer performance indicator in the network-operator environment [5]. Although BLER testing is sufficient in most respects important to network operators, for example because it can be performed in-service, by simply counting the hybrid automatic repeat request (HARQ) retransmission requests relative to all transmissions (and thus without interrupting revenue-generating services), it may yield small BLER values of poor accuracy, since the retransmission requests are carried over a potentially unreliable reverse channel [6]. However, accurate estimation of the lowest achiev