Some Reliability Properties of Bivariate Cumulative Residual Tsallis Entropy
David Chris Raju1 · S. M. Sunoj1 · Rajesh G1
© Grace Scientific Publishing 2020
Abstract Rajesh and Sunoj (Stat Pap 60(3):583–593, 2019) introduced an alternative form of cumulative Tsallis entropy for continuous random variables and studied its importance in reliability studies. The present study extends this measure to the bivariate case based on two types of conditioning, viz. conditionally specified and survival models. In a two-component system, it measures the uncertainty associated with one component when the other component has either failed or survived up to a specified time. Characterization results are established for important bivariate lifetime models. We also propose empirical and kernel-based plug-in estimators for the conditional cumulative Tsallis entropy and study their performance using simulated data. A real data analysis is also carried out to evaluate the usefulness of the empirical estimator of the bivariate dynamic cumulative Tsallis entropy, and its performance is compared with that of a similar estimator due to Sati and Singh (J Appl Math Inf 35(1–2):45–58, 2017).

Keywords Tsallis entropy · Cumulative residual entropy · Reliability measures · Bivariate models · Empirical survival function

Mathematics Subject Classification 94A17 · 62N05
* Corresponding author: S. M. Sunoj
1 Department of Statistics, Cochin University of Science and Technology, Cochin, Kerala 682 022, India
1 Introduction

The notion of entropy plays an important role in the measurement of the uncertainty of a probability mechanism. In statistical mechanics, if a system is out of equilibrium or its component states depend strongly on one another, Tsallis entropy ([11, 29]) is found to be a better measure of the disorder in macroscopic systems. It is defined as
$$T_\alpha = \frac{1}{\alpha - 1}\left(1 - \sum_i p_i^{\alpha}\right),$$
where $p_i$ denotes the probability mass function of the $i$th component and $\alpha$ is called the Tsallis index. A continuous counterpart of $T_\alpha$ is also popular in the literature. For a continuous random variable $X$, the Tsallis entropy of order $\alpha$ is defined as
$$T_\alpha(X) = \frac{1}{\alpha - 1}\,E\!\left(1 - (f(X))^{\alpha - 1}\right) = \frac{1}{\alpha - 1}\left(1 - \int_0^{\infty} (f(x))^{\alpha}\,dx\right), \qquad (1.1)$$
where $0 < \alpha \neq 1$ and $f(x)$ denotes the probability density function (pdf) of $X$. When $\alpha \to 1$, $T_\alpha(X) \to H(X) = -\int_0^{\infty} f(x)\log f(x)\,dx$, the Shannon differential entropy (Shannon [26]). Tsallis entropy has found applications in various fields such as statistical mechanics, thermodynamics, communication theory, image processing, reliability, etc. For more details, one can refer to Cartwright [6], Kumar [13] and the references therein. Recently, Rao et al. [19] introduced the cumulative residual entropy (CRE), an alternative to Shannon differential entropy, obtained by replacing the pdf $f(x)$ in $H(X)$ with the survival function $\bar{F}(x) = P(X > x)$, given by
$$\xi(X) = -\int_0^{\infty} \bar{F}(x)\log \bar{F}(x)\,dx.$$
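As a quick illustrative sketch (not part of the paper), the following Python code evaluates $T_\alpha(X)$ and $\xi(X)$ numerically for an exponential lifetime and compares them with their known closed forms; the rate `lam`, the order `alpha`, and the helper functions tsallis_entropy and cumulative_residual_entropy are assumptions introduced only for illustration.

import numpy as np
from scipy.integrate import quad

# Illustrative sketch (assumed helper names, not from the paper):
# T_alpha(X) = (1 - int_0^inf f(x)^alpha dx) / (alpha - 1)  and
# xi(X) = -int_0^inf Fbar(x) log Fbar(x) dx  for an Exp(lam) lifetime.

def tsallis_entropy(pdf, alpha):
    # numerical Tsallis entropy of order alpha for a nonnegative random variable
    integral, _ = quad(lambda x: pdf(x) ** alpha, 0, np.inf)
    return (1.0 - integral) / (alpha - 1.0)

def cumulative_residual_entropy(sf):
    # numerical CRE of Rao et al.; the guard avoids log(0) when the survival
    # function underflows to zero far in the right tail
    integrand = lambda x: -sf(x) * np.log(sf(x)) if sf(x) > 0.0 else 0.0
    value, _ = quad(integrand, 0, np.inf)
    return value

lam, alpha = 2.0, 1.5
pdf = lambda x: lam * np.exp(-lam * x)   # density of Exp(lam)
sf = lambda x: np.exp(-lam * x)          # survival function of Exp(lam)

# Closed forms for the exponential model:
# T_alpha = (1 - lam**(alpha-1)/alpha)/(alpha-1),  H = 1 - log(lam),  xi = 1/lam.
print(tsallis_entropy(pdf, alpha), (1 - lam ** (alpha - 1) / alpha) / (alpha - 1))
print(tsallis_entropy(pdf, 1.0001), 1 - np.log(lam))   # alpha near 1 recovers Shannon entropy
print(cumulative_residual_entropy(sf), 1 / lam)

The same numerical routine can serve as a sanity check for closed-form expressions of these measures under other lifetime models.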