Understanding Environmental Changes Using Statistical Mechanics
M. Selim Mahbub1 · Paulo de Souza1 · Ray Williams1
Received: 28 May 2018 / Revised: 17 April 2019 / Accepted: 8 May 2019
© Springer-Verlag GmbH Germany, part of Springer Nature 2019
Abstract We present results for Shannon entropy computed from environmental data such as air temperature, relative humidity, rainfall, and wind speed. We use hourly time-series data generated by a hydrological model covering the whole of Tasmania, a state of Australia, and employ concepts from statistical mechanics in our calculations. We also present enthalpy and heat-capacitance equivalent quantities for the environment. The results capture interesting seasonal fluctuations in environmental parameters over time, and indicate a slight increase in the number of microstates associated with air temperature over the period covered by the data.

Keywords Environmental analytics · Shannon entropy · Statistical mechanics
1 Introduction

A proper understanding of the environment is essential in order to make better predictions of environmental changes, including changes in air temperature, wind speed, relative humidity, etc.; these remain ongoing challenges for the environmental research community. With advances in numerical models, together with the development of sophisticated computers and algorithms, significant progress has been made in these areas. However, the calculations performed in these areas are based on data coming from stationary sensors whose density in the environment is low, providing poor spatial resolution; the calculations are therefore prone to uncertainty. In an environment containing a high density of spatially distributed sensors, the data obtained from the sensor network would allow us to perform calculations with better spatial and temporal resolution and hence provide better predictions of the observables of interest.
Corresponding author: Paulo de Souza, [email protected]
1 CSIRO, Data61, Sandy Bay, TAS 7005, Australia
Entropy has been a subject of broad study within the research community since its introduction in the 1850s by Rudolf Clausius [1]. Since then, successive developments have been made by Boltzmann [2], Planck and Gibbs [3,4], ultimately leading to the development of statistical mechanics. The entropy measure signifies changes in the microstates of a system and the distribution of entities within the system. It has been used extensively in various areas of research, such as thermodynamics, cosmology (e.g., the entropy of black holes [5]), information theory (e.g., Shannon entropy [6]), and environmental science [7–10].

Our focus is an environment in which a high density of sensors is deployed and data are collected at a regular temporal interval. The resulting data distribution is then presented to a model developed using statistical mechanics to calculate statistical observables [19], such as entropy, enthalpy, and heat capacitance. An
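As a rough illustration of the entropy step described above, the sketch below bins a single environmental time series (e.g. hourly air temperature) into discrete states and computes the Shannon entropy of the resulting occupation probabilities. The bin count, the synthetic data, and the function name are illustrative assumptions of ours and are not taken from the paper.

```python
import numpy as np

def shannon_entropy(samples, n_bins=50):
    """Shannon entropy (in nats) of a sensor time series.

    The samples are binned into discrete "states"; the occupation
    probabilities p_i of the bins give S = -sum_i p_i * ln(p_i).
    The bin count is an illustrative choice, not a value from the paper.
    """
    counts, _ = np.histogram(samples, bins=n_bins)
    p = counts / counts.sum()      # occupation probability of each bin
    p = p[p > 0]                   # treat 0 * ln(0) as 0
    return -np.sum(p * np.log(p))

# Example with synthetic hourly temperatures (placeholder data, ~90 days).
rng = np.random.default_rng(0)
temps = 12.0 + 5.0 * rng.standard_normal(24 * 90)
print(f"Shannon entropy: {shannon_entropy(temps):.3f} nats")
```

With hourly data, the same calculation could be repeated over sliding seasonal windows to trace the seasonal fluctuations mentioned in the abstract.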