The Turing Machine Paradigm in Contemporary Computing

1. Introduction

The importance of algorithms is now recognized in all mathematical sciences, thanks to the development of computability and computational complexity theory in the 20th century. The basic understanding of computability theory developed in the nineteen thirties with the pioneering work of mathematicians like Gödel, Church, Turing and Post. Their work provided the mathematical basis for the study of algorithms as a formalized concept. The work of Hartmanis, Stearns, Karp, Cook and others in the nineteen sixties and seventies showed that the refinement of the theory to resource-bounded computations gave the means to explain the many intuitions concerning the complexity or 'hardness' of algorithmic problems in a precise and rigorous framework. The theory has its roots in the older questions of definability, provability and decidability in formal systems. The breakthrough in the nineteen thirties was the formalisation of the intuitive concept of algorithmic computability by Turing. In his famous 1936 paper, Turing [43] presented a model of computation that was both mathematically rigorous and general enough to capture the possible actions that any 'human computer' could carry out. Although the model was presented well before digital computers arrived on the scene, it has the generality of describing computations at the individual bit-level, using very basic control commands. Computability and computational complexity theory are now firmly founded on the Turing machine paradigm and its ramifications in recursion theory. In this paper we will extend the Turing machine paradigm to include several key features of contemporary information processing systems.
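To make the bit-level, basic-control-command character of the model concrete, the following is a minimal sketch of a single-tape Turing machine simulator. The transition-table encoding (state, symbol) → (write, move, next state) is one common textbook convention rather than Turing's original notation, and the increment machine is an illustrative example of ours, not one from the paper.

```python
def run(tape, state, head, delta, accept):
    """Run a single-tape Turing machine until it enters the accepting state.

    delta maps (state, symbol) -> (symbol_to_write, 'L' or 'R', next_state).
    The tape is stored sparsely; unwritten cells hold the blank symbol '_'.
    """
    cells = dict(enumerate(tape))
    while state != accept:
        symbol = cells.get(head, '_')
        write, move, state = delta[(state, symbol)]
        cells[head] = write
        head += 1 if move == 'R' else -1
    return ''.join(cells[i] for i in sorted(cells)).strip('_')

# Example machine: increment a binary number, head starting on the
# rightmost bit. Scan left turning 1s into 0s (propagating the carry)
# until a 0 or a blank is found, which becomes 1.
delta = {
    ('carry', '1'): ('0', 'L', 'carry'),
    ('carry', '0'): ('1', 'L', 'done'),
    ('carry', '_'): ('1', 'L', 'done'),
}

print(run('1011', 'carry', head=3, delta=delta, accept='done'))  # prints 1100
```

Every step reads one cell, writes one cell, and moves the head one position under finite-state control; this locality is exactly what gives the model its generality at the individual bit level.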

1.1 From Machines to Systems that Compute

Turing's model is generally regarded as capturing the intuitive notion of what is algorithmically computable in a very broad sense. The general belief that every algorithm can be expressed in terms of a Turing machine is now known as the Church-Turing thesis (cf. [12,37]). Even though many other formalisations of computability have been proposed and studied through the years, they were normally shown to admit effective simulations by standard Turing machines. This has been demonstrated specifically for the many formalisms for defining computable (partial) functions on N (cf. [8,15,26]).

* This research was partially supported by GA CR grant No. 201198/0717 and by EC Contract IST-1999-14186 (Project ALCOM-FT).

B. Engquist et al. (eds.), Mathematics Unlimited — 2001 and Beyond © Springer-Verlag Berlin Heidelberg 2001

J. van Leeuwen · J. Wiedermann

With the origin of Turing machines dating back to the nineteen thirties, the question arises whether the notion of 'computation' as it is now understood is still adequately described by them. The emphasis in modern computer technology is gradually shifting away from individual machines towards the design of systems of computing elements that interact. New insights in the way physical systems and biological organisms work are uncovering new models of computation.