Lessons from three decades of IT productivity research: towards a better understanding of IT-induced productivity effects

Stefan Schweikl¹ · Robert Obermaier¹

Received: 7 June 2019 / Accepted: 1 October 2019
© Springer Nature Switzerland AG 2019
Abstract
New developments in fields such as artificial intelligence and robotics are receiving considerable attention from businesses, as they promise astonishing gains in process efficiency and are sparking a surge of corporate investment in new digital technologies. Yet firms have not become more productive per se, as labor productivity growth in various industrial nations has decelerated in recent years. The observation that the adoption of innovative technologies is not accompanied by productivity increases dates back to the dawn of the computer age and became known as Solow's Paradox. This paper therefore takes stock of what is known about the Solow Paradox before incorporating the findings into the debate on the current productivity slowdown. Based on an in-depth review of 86 empirical studies at the firm level, the paper uncovers various reasons for the emergence of the Solow Paradox, discusses its subsequent reversal marked by the occurrence of excess returns, and derives a model of factors influencing the returns on IT investments. Building on these insights, four overarching explanations of the modern productivity paradox, namely adjustment delays, measurement issues, exaggerated expectations and mismanagement, are discussed, whereby mismanagement emerges as a currently neglected but focal issue.

Keywords IT investment · Information technology · Productivity · IT productivity paradox · Solow Paradox · Industry 4.0

JEL Classification A12 · O31 · O33
* Stefan Schweikl
  stefan.schweikl@uni-passau.de

  Robert Obermaier
  robert.obermaier@uni-passau.de

1 Chair of Business Economics, Management Accounting and Control, University of Passau, Innstraße 27, 94032 Passau, Germany
1 Introduction

Advances in modern technologies like robotics, 3D printing, the internet of things (IoT), artificial intelligence (AI), blockchain, virtual reality (VR) or augmented reality (AR) are creating a furor among individuals and firms. They promise a so-called fourth industrial revolution with disruptive innovations that could change not only everyday life but also entire industries forever (Obermaier 2019). Given the enormous potential of these technologies, companies fear ending up on the losing side of a digital revolution if they do not adopt these disruptive technologies themselves. As a result, global business expenditure on AI, machine learning and robotic process automation is expected to reach $232 billion in 2025, an 18-fold increase from the estimated spending of $12.4 billion in 2018 (KPMG 2018). Despite these efforts to digitize their operations, companies are struggling to become more productive. In fact, US labor productivity growth fell from an average of 2.73% per year from 2000 through 2