An empirical estimation for time and memory algorithm complexities: newly developed R package
Marc Agenis-Nevers¹ · Neeraj Dhanraj Bokde² · Zaher Mundher Yaseen³ · Mayur Kishor Shende⁴

Received: 8 November 2019 / Revised: 8 July 2020 / Accepted: 28 July 2020
© Springer Science+Business Media, LLC, part of Springer Nature 2020

Abstract
When an algorithm or a program runs on a computer, it requires some resources. The complexity of an algorithm is a measure of the resources it requires for a given input; these resources are usually space and time. Empirical estimation of computational complexity has been studied in previous research. This article introduces GuessCompx, an R package that empirically estimates the time and memory complexities of an algorithm or function and provides a reliable, convenient and simple estimation procedure. It runs the function on multiple samples of the user's data of increasing size and attempts to fit one of seven complexity classes: O(N), O(N^2), O(log(N)), etc. Based on the best fit, selected by leave-one-out mean squared error (LOO-MSE), it then predicts the full computation time and memory usage on the whole dataset. Together with these results, a plot and a significance test are returned. Complexity is assessed with respect to the size of the user's actual dataset (and no other parameter). Several examples demonstrate typical use cases (e.g., a distance function, time series and custom functions) and optimal parameter tuning.

Keywords Time complexity · Memory complexity · Empirical approach · R package · Algorithm complexity
Zaher Mundher Yaseen (corresponding author)
[email protected]

1 Epicentre Factory, Clermont-Ferrand, France
2 Department of Engineering - Renewable Energy and Thermodynamics, Aarhus University, Aarhus, Denmark
3 Sustainable Developments in Civil Engineering Research Group, Faculty of Civil Engineering, Ton Duc Thang University, Ho Chi Minh City, Vietnam
4 Government College of Engineering, Nagpur, India
Multimedia Tools and Applications
1 Introduction

The complexity of an algorithm is a measure of the amount of resources the algorithm needs to run to completion, for a given input [14, 17]. Usually, the resources an algorithm requires are either space or time, which give two types of complexity [8, 29]. Time complexity is the total amount of time required by an algorithm to complete its execution [30]. The running time of an algorithm may differ for the same input size, depending on the arrangement of the data [1, 2, 28]. For instance, in the bubble sort algorithm, if the input array is already sorted then the running time will be lower [19]. This is the so-called best-case complexity. Similarly, if the input list is in reverse order, the algorithm takes the maximum time; this is called the worst-case complexity. If the input list is neither sorted nor reversed, then the running time is less than the worst case but more than the best case: this is the average-case complexity. Since an algorithm cannot require
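The best-case/worst-case distinction for bubble sort can be made concrete with a short R sketch (illustrative code, not part of the GuessCompx package): an early-exit bubble sort finishes one pass on sorted input but needs the full quadratic number of comparisons on reversed input.

```r
# Bubble sort with an early-exit flag: if a full pass makes no swap,
# the vector is already sorted and the loop stops (best case, ~O(N)).
bubble_sort <- function(x) {
  n <- length(x)
  for (i in seq_len(n - 1)) {
    swapped <- FALSE
    for (j in seq_len(n - i)) {
      if (x[j] > x[j + 1]) {
        tmp <- x[j]; x[j] <- x[j + 1]; x[j + 1] <- tmp
        swapped <- TRUE
      }
    }
    if (!swapped) break  # no swaps: input already sorted
  }
  x
}

n <- 2000
system.time(bubble_sort(1:n))  # best case: already sorted input
system.time(bubble_sort(n:1))  # worst case: reversed input, O(N^2) swaps
```

On a typical machine the second timing is orders of magnitude larger than the first, even though both inputs have the same size N.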
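As a sketch of the workflow described in the abstract, a minimal call to the package is shown below. It assumes the package's exported entry point is `CompEst()` with a data argument `d` and a function argument `f`; consult the package documentation for the exact signature and optional tuning parameters.

```r
# Sketch: assumes the GuessCompx package is installed from CRAN,
# e.g. install.packages("GuessCompx")
library(GuessCompx)

# Benchmark dist() on increasing-size samples of a small data frame
# and fit a complexity class to the observed timings and memory use.
df <- as.data.frame(matrix(rnorm(10000), ncol = 5))
CompEst(d = df, f = dist)
```

The call is expected to return the best-fitting time and memory complexity classes, a prediction of the run time and memory usage on the full dataset, and a plot of the candidate fits.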