Intelligent Tools
In previous sections, I have discussed software environments for performing simulations and the core algorithms upon which they rely. A first step in constructing a simulation environment is to assemble a useful toolbox containing specific modules for setting up, running, and analyzing different types of numerical experiments. However, to be fully functional, such an environment requires more than a set of tools, no matter how flexible they may be. Another necessary ingredient is an increasing degree of intelligence built directly into the tools themselves. This gives the human experimenter the role of a manager, rather than that of a bench worker. By removing much of the drudgery of (numerical) laboratory work, such tools free the researcher to focus on the science, rather than on the bookkeeping details of the experiments. Here I will mention two areas with which I am most familiar.
Automation of Scattering Experiments
Just as particle physicists perform scattering experiments in the laboratory, astrophysicists perform virtual scattering experiments in numerical labs. Instead of shooting elementary particles at a target plate, we shoot single stars at a "target plate" of binary stars, to investigate the statistical properties of gravitational three-body interactions.
On modern computers, such experiments are easy and cheap: a typical three-body orbit integration takes much less than a second from start to finish, so it is quite feasible to perform millions of experiments in order to obtain accurate results for scattering cross sections and reaction rates. Writing a program for the orbit integration is simple too, since one does not have to worry about the complexities stemming from the vast discrepancies in length and time scales that plague star cluster simulations: in a scattering experiment, everything is effectively localized.
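To illustrate how little machinery a single scattering run requires, here is a minimal sketch of a three-body integrator, using a leapfrog (kick-drift-kick) scheme in units where G = 1. All names here are illustrative, and a production code would of course use a higher-order integrator with adaptive time steps:

    import numpy as np

    def accelerations(pos, mass):
        """Newtonian gravitational accelerations for N point masses (G = 1)."""
        acc = np.zeros_like(pos)
        for i in range(len(mass)):
            for j in range(len(mass)):
                if i != j:
                    r = pos[j] - pos[i]
                    acc[i] += mass[j] * r / np.linalg.norm(r)**3
        return acc

    def leapfrog(pos, vel, mass, dt, n_steps):
        """Advance the system through n_steps kick-drift-kick steps of size dt."""
        acc = accelerations(pos, mass)
        for _ in range(n_steps):
            vel += 0.5 * dt * acc    # half kick
            pos += dt * vel          # drift
            acc = accelerations(pos, mass)
            vel += 0.5 * dt * acc    # half kick
        return pos, vel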
The main problem in developing gravitational scattering software lies elsewhere: in the creation of intelligent tools that know how to choose the right initial parameters for setting up experiments, and that can learn from their mistakes by choosing increasingly better parameters to improve the accuracy of the final results. For example, even determining a single cross section, for a given encounter velocity and fixed binary properties and stellar masses, is not trivial. If one chooses too high a value for the impact parameter (the perpendicular offset of the direction of the incoming star from aiming straight at the center of the binary), most of the encounters will be pure misses in which no significant interaction takes place, wasting large amounts of computer time. Choosing too small a value, on the other hand, can lead to a serious underestimate of the cross section, by overlooking those wider encounters that still show significant interactions.
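In its simplest form, a cross section is estimated by Monte Carlo sampling: launch incoming stars with impact parameters drawn uniformly in area out to some maximum b_max, and multiply the geometric area by the fraction of runs showing the outcome of interest. The sketch below, in which run_encounter is a hypothetical stand-in for a full three-body integration, shows this estimate and makes plain why the choice of b_max matters so much:

    import math, random

    def estimate_cross_section(b_max, n_trials, run_encounter):
        """Monte Carlo cross section; run_encounter(b, phase) is assumed
        to integrate one scattering experiment and report True if the
        outcome of interest occurred."""
        hits = 0
        for _ in range(n_trials):
            b = b_max * math.sqrt(random.random())   # uniform in area: p(b) ~ b
            phase = random.uniform(0.0, 2.0 * math.pi)
            if run_encounter(b, phase):
                hits += 1
        area = math.pi * b_max**2
        sigma = area * hits / n_trials
        error = area * math.sqrt(hits) / n_trials    # Poisson error estimate
        return sigma, error

Too large a b_max wastes nearly all trials on misses; too small a b_max silently excludes part of the cross section.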
The challenge is to let the computer program determine its own choice of impact parameters: not only their range but also their sampling density. Ideally, small impact parameters should be sampled densely, since that is where most of the action occurs, while wider impact parameters, where interactions still occur but more rarely, should be sampled more sparsely. Finally, the program should automatically set up a safety zone around the binary, spending at least a few percent of its time sampling outcomes there, to make sure that no significant interactions take place in that zone. If such interactions are found nonetheless, the safety zone should be widened, iteratively, until nothing of importance happens there anymore. To complicate matters further, what counts as important depends on the type of question one wants to answer: a determination of exchange rates requires much smaller impact parameters than a determination of the cross section for minor disturbances in the eccentricity of a binary.
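The safety-zone logic can be rendered schematically in a few lines. The sketch below is only an illustration of the idea, not the actual Starlab implementation, and run_batch is a hypothetical helper that integrates a batch of encounters with impact parameters in a given annulus and counts the significant interactions:

    def find_safe_b_max(b_max, run_batch, grow=1.5, n_zone=100):
        """Widen b_max until an outer annular safety zone shows no hits.
        run_batch(b_lo, b_hi, n) integrates n encounters with impact
        parameters in [b_lo, b_hi] and returns the number that showed
        a significant interaction."""
        while run_batch(b_max, grow * b_max, n_zone) > 0:
            b_max *= grow        # action in the zone: widen and try again
        return grow * b_max      # keep the quiet zone as an outer buffer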
Similar considerations occur on a higher level for the choice of which scattering cross sections to measure. In order to determine a reaction rate at a fixed temperature (a fixed velocity dispersion, in stellar-dynamics terms), one has to measure a number of cross sections, each at a different relative velocity, and an optimally efficient choice of velocities again depends sensitively on the temperature, and to some extent also on the process for which the rates are required. All in all, a considerable amount of artificial intelligence has to be built into gravitational scattering packages, making the writing of such programs both challenging and fun. I have been involved in a series of such studies, starting with the paper Binary-Single Star Scattering. I. Numerical Experiments for Equal Masses, by Hut, P. & Bahcall, J. N., 1983, Astrophys. J., 268, 319-341.
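The step from cross sections to a reaction rate is a thermal average: the rate coefficient is the mean of sigma(v) times v, weighted by the distribution of relative velocities, which at a fixed velocity dispersion is a Maxwellian. A minimal numerical sketch, where sigma_of_v would interpolate the measured cross sections and all names are illustrative:

    import numpy as np

    def rate_coefficient(sigma_of_v, v_disp, n=10_000, v_cut=10.0):
        """<sigma v> for a Maxwellian of relative speeds with 1-D
        dispersion v_disp, integrated out to v_cut * v_disp."""
        v = np.linspace(0.0, v_cut * v_disp, n + 1)[1:]   # skip v = 0
        f = np.sqrt(2.0 / np.pi) * v**2 / v_disp**3 \
            * np.exp(-v**2 / (2.0 * v_disp**2))           # Maxwellian speed pdf
        g = np.array([sigma_of_v(u) for u in v]) * v * f
        return float(np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(v)))  # trapezoid rule

The rate per binary then follows by multiplying this coefficient by the number density of incoming stars.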
A description of our automated package for gravitational three-body scattering is given in Binary-Single Star Scattering. VI. Automatic Determination of Interaction Cross Sections, by McMillan, S. & Hut, P., 1996, Astrophys. J., 467, 348-358. The software described there is freely available, as part of our Starlab environment.
Laboratory Assistants
The use of intelligent software to aid in laboratory experiments will become an increasingly important feature of stellar dynamics simulations, in all aspects of the work. Gravitational three-body scattering is just one example in which both choosing initial conditions and analyzing the results require quite a bit of intelligence. There are other places where software plays the role of a laboratory assistant. In simulations of hundreds of thousands of stars, extremely rare and unexpected types of close encounters have to be handled on the fly, since it is simply impossible to list in advance all conceivable categories of events. For example, a double binary (two double stars in orbit around each other) may encounter a triple star, leading to a protracted seven-body dance. While this is going on, another triple or quadruple star may join the fray, leading to hierarchical sets of close encounters. We have developed automatic tools to handle those situations, and implemented them in our kira code.
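To give a flavor of the bookkeeping involved, the sketch below groups stars into transient interacting subsystems by linking all pairs closer than a critical separation. This is only an illustrative stand-in for the far more sophisticated decision-making inside kira, and all names are hypothetical:

    import numpy as np

    def find_subsystems(pos, r_crit):
        """Return groups of indices of stars linked, directly or through
        intermediaries, by separations smaller than r_crit."""
        n = len(pos)
        parent = list(range(n))

        def find(i):                          # union-find with path halving
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        for i in range(n):
            for j in range(i + 1, n):
                if np.linalg.norm(pos[i] - pos[j]) < r_crit:
                    parent[find(i)] = find(j)

        groups = {}
        for i in range(n):
            groups.setdefault(find(i), []).append(i)
        return [g for g in groups.values() if len(g) > 1]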
We have described our views on the role of artificial intelligence in a popular article, Advanced Computing for Science, by Hut, P. & Sussman, G. J., 1987, Scientific American, 255, 145-153, followed later by the article Extending the Mind, by Hut, P. & Sussman, G. J., 1995, in Scientific American: Triumph of Discovery, a Henry Holt reference book (New York: Henry Holt), pp. 197-199. See also our contribution On Toolboxes and Telescopes, by Hut, P. & Sussman, G. J., 1986, in The Use of Supercomputers in Stellar Dynamics, eds. P. Hut and S. McMillan (Springer), pp. 193-198.
A related paper is A Laboratory for Gravitational Scattering Experiments, by Hut, P., 1989, in Applications of Computer Technology to Dynamical Astronomy, I.A.U. Colloq. 109, Celest. Mech., 45, 213-218. There I discuss the extension of three-body scattering experiments using stars to two-body scattering experiments using whole galaxies. At the time that article was written, a single galaxy-galaxy encounter still took quite a bit of computer time. However, given that computers have since increased in speed by orders of magnitude, it has become feasible to carry out thousands of such scattering experiments fully automatically, with the software testing the outcome of each experiment in order to classify the resulting galaxy remnants.
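As a minimal illustration of such automatic outcome testing, one could label each finished run by the final relative orbit of the two remnants; real classifiers would of course examine far more than this, and the names and criteria below are purely illustrative, in units with G = 1:

    import numpy as np

    def classify_outcome(m1, m2, pos1, pos2, vel1, vel2, r_merge):
        """Label a finished galaxy-galaxy run as 'merger', 'bound pair',
        or 'flyby', from positions and velocities of the two remnants."""
        r = np.linalg.norm(pos1 - pos2)
        if r < r_merge:                        # remnants overlap: a merger
            return "merger"
        v2 = float(np.sum((vel1 - vel2)**2))
        e = 0.5 * v2 - (m1 + m2) / r           # specific orbital energy
        return "bound pair" if e < 0.0 else "flyby"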
A beautiful framework and toolbox for classical mechanics, ideal for constructing laboratory assistants in software, is presented in the book Structure and Interpretation of Classical Mechanics, by G. J. Sussman and J. Wisdom, with Hardy Mayer (M.I.T. Press), for which I wrote a book review: Hut, P., 2002, Foundations of Physics, 32, 323-326.