POSTERMINARIES
By Any Other Name

Have you noticed a dual terminology when describing nonexperimental science? Do you have a theory or model to explain the difference between theories and models? Have you also observed that greater prestige seems to accrue to theories, and likewise to theorist as opposed to modeler? Why...? Perhaps models suffer from sounding like toys or mere imitations of reality.

As a student, I was taught theories. These were explanations usually expressible in mathematically closed form, at least under simplifying assumptions, and were derived from rather fundamental physical laws. The invoked combination of laws produced predictions tested by experiments which validated the theory within some limit of error. My sense was that if anything was called a model, there was an implied modifier such as "phenomenological," "heuristic," "ad hoc," or "empirical." From this, I presume that some or all of the relevant basic laws were unused or unknown and "seat-of-the-pants" guesswork helped produce an algorithm from which one could predict experimental results.

Limits-of-error for a theory's validation combined experimental error with uncertainty in the values of constants in the theory. For a model's validation, the additional, and a priori unquantifiable, incorrectness of the model enters as well. That is, you can't know how much of the mismatch to experiment is the model itself missing the mark. Theories do contain assumptions, and theoretical computation relies on approximations at some level. After what number, or degree of seriousness, of these deviations from absolute truth does a theory become a model?

Lately, the term modeling seems to arise much more often. In fact, some institutions will now seek to hire modelers per se. My sense is that these modelers are supposed to be a more practical breed and will not spend time with their heads in the clouds of theory for theory's sake.
With the advent of more and better supercomputers, the numerical model calculation (often called simulation) has become more prevalent (i.e., more viable for a wider range of problems). Of course, these machines must also enable more complex computations of the theoretical variety. Somehow, numerical simulation lacks the elegance of closed analytical forms in both cases. Perhaps one distinction to be made is between the lengthy iterative self-consistent computations of theory and the theory-based lengthy statistical simulations, such as Monte Carlo, which extend theory to large-number systems.

There's another distinction in viewpoint to reckon with. One person's theory is another's model. Or perhaps one should say a "macroperson's" theory is a "microperson's" model. For example, if plotting the logarithm of your atomic diffusion data against inverse temperature yields a straight line, one can safely assume that an activated process is involved. The exponential behavior and the 1/T exponent are on firm theoretical ground, but the constant in the exponent, i.e., the activation energy, becomes a phenomenological parameter inaccessible ...
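The diffusion example above can be made concrete with a short sketch. The Arrhenius form D = D0·exp(−Q/kBT) becomes a straight line when ln D is plotted against 1/T, with slope −Q/kB; the fitted slope then yields the activation energy Q as exactly the kind of phenomenological parameter described. The prefactor, activation energy, and temperature values below are purely illustrative assumptions, not measured data:

```python
import numpy as np

# Arrhenius form: D = D0 * exp(-Q / (k_B * T))
# Taking logs: ln D = ln D0 - (Q / k_B) * (1 / T),
# so ln D vs 1/T is a straight line with slope -Q/k_B.

k_B = 8.617333e-5  # Boltzmann constant in eV/K

# Hypothetical "measured" diffusion data (illustrative values only)
D0_true = 1.0e-4   # prefactor, m^2/s
Q_true = 1.2       # activation energy, eV
T = np.array([800.0, 900.0, 1000.0, 1100.0, 1200.0])  # temperatures, K
D = D0_true * np.exp(-Q_true / (k_B * T))             # diffusivities, m^2/s

# A linear fit of ln D against 1/T recovers the phenomenological parameters
slope, intercept = np.polyfit(1.0 / T, np.log(D), 1)
Q_fit = -slope * k_B        # activation energy, eV
D0_fit = np.exp(intercept)  # prefactor, m^2/s

print(f"Q = {Q_fit:.3f} eV, D0 = {D0_fit:.2e} m^2/s")
```

With noiseless synthetic data the fit recovers the inputs; with real measurements, the scatter about the line folds experimental error together with any failure of the activated-process picture itself, which is the essay's point.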