One of
the most elegant aspects of science is how, with mathematical modelling, we can
infer universal truths from limited data. Often direct measurement isn't
possible, but mathematics provides a reliable and consistent framework for
exploring and understanding complex systems. Some things are obviously objective.
Take, for example, Newton's F = ma: it is a fact (for macroscopic, non-relativistic systems) whether you're a man in Nepal
or a woman in Sweden. It has nothing to do with subjective
opinion: force is the time derivative of momentum, so the
relationship between an object's mass m, its acceleration a, and the applied
force F is F = ma wherever you are in the world (assuming Euclidean space).
Suppose a
crank in Chile opined something different about the quantitative
calculations of dynamics – a different, but provably wrong,
idea about how velocities change when forces are applied. It's an opinion to
which he can claim no justifiable entitlement, because objective facts about
reality transcend the culture and geography in which they are discovered. The
theory of evolution by natural selection is always based on factual accounts of
billions of years of biochemical history, irrespective of whether you live in
England, India or Brazil.
We know that the nuts and bolts of
creation can be assessed objectively because the language of mathematics reveals
objective truths about physical reality. For example, in the early 1800s,
astronomers set out to improve the tables of planetary positions they had
calculated from Newtonian mechanics by working out the orbit of each planet in
relation to its neighbours. At the time, Uranus was the farthest known planet,
but its observed orbit was proving inconsistent with the Newtonian predictions
that held for the rest of the planets in our solar system. One suggestion
made by several astronomers was that perhaps Uranus's behaviour proved that
Newtonian mechanics is not universal. But with further mathematical
calculation, a better proposition was offered; one that demonstrated the
predictive power of mathematics.
By taking the discrepancies in the
orbit of Uranus, astronomers were able to posit the existence of a further
planet, and to use Newtonian mechanics to calculate the size and location of a
neighbouring body that would explain the anomalies in Uranus's behaviour. In
other words, simply by assessing the behaviour of Uranus we could predict the
whereabouts of a planet we would later call 'Neptune' – and that is what we did.
For another way to capture the
essence of how mathematics allows us to extrapolate from what we can measure to
what we cannot directly observe, let’s return to Newton’s second law of motion:
“A body experiencing a force F
experiences an acceleration a related to F by F = ma, where m is the mass of
the body.”
Force is equal to the time
derivative of momentum, so the relationship between an object's mass m, its
acceleration a, and the applied force F is F = ma, where acceleration and force
are vectors, and the direction of the force vector is the same as the direction
of the acceleration vector. This enables us to make quantitative calculations
of dynamics, and measure how velocities change when forces are applied.
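
Spelled out as a one-line derivation (for the common case in which the mass does not change), the momentum form of the law reduces to F = ma:

```latex
\mathbf{F} = \frac{d\mathbf{p}}{dt} = \frac{d(m\mathbf{v})}{dt} = m\,\frac{d\mathbf{v}}{dt} = m\mathbf{a} \qquad \text{(assuming } m \text{ is constant)}
```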
Newton's laws were formulated from
observations made on local objects – dropping objects from high places,
measuring the acceleration of gravity for falling bodies, observing the
trajectories of projectiles, and charting planetary positions. Although
Newton's laws are formulated as universal statements, we infer that universality
from what we observe locally (although this isn't an irrefutable claim).
When
Newton gave the formula for gravitational force, he claimed the law to be true
for any two masses in the universe. But what warrants that leap of induction,
and how would one develop certainty about its universality? For example, one doesn't directly observe the force of gravity between the earth and the
moon - it is evidenced by things like tidal effects, the lunar orbit and satellite measurements. Yet we gain evidence for
scientific statements that are universal and cannot be measured directly. Mathematical
models rely on established principles and constants (like the gravitational
constant) that have been empirically derived.
What is required is a description
in terms of mass and distance – from these, the law gives us the force. However, we
cannot directly measure the force between the earth and any object that we cannot
weigh on a scale. We can weigh any easy-to-handle localised object (a football,
a snooker ball, a cannonball, etc.) and determine the attractive force of
gravity between the earth and that object, but we cannot do this with
the moon. However, what we can do in the absence of being able to hold the moon
in our hands is work it out with simple mathematics: we infer the force through
indirect measurements and apply Newton's laws to predict the trajectories of the
bodies involved.
F = G*M(moon)*M(earth)/distance^2
where G is the gravitational constant, 6.67x10^-11 m^3/kg/s^2, the mass of the
earth is 6x10^24 kg, the mass of the moon is 7.3x10^22 kg, and the distance
between them is about 3.84x10^8 m.
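
Plugging those quoted figures into the formula (a quick sketch of the arithmetic in Python, nothing more) gives a force of roughly 2 x 10^20 newtons:

```python
# Gravitational force between the earth and the moon, using the figures quoted above.
G = 6.67e-11        # gravitational constant, m^3 kg^-1 s^-2
m_earth = 6.0e24    # mass of the earth, kg
m_moon = 7.3e22     # mass of the moon, kg
d = 3.84e8          # average earth-moon distance, m

force = G * m_earth * m_moon / d**2
print(f"earth-moon gravitational force = {force:.2e} N")   # ~ 1.98e+20 N
```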
The force
can be worked out because gravitational attraction decreases with the square of
the distance. By measuring the distance between the earth and the moon (it
varies, but its average is approximately 239,000 miles) and the earth's radius,
and then dividing the distance between the two bodies by the earth's radius, we
find that the moon sits roughly 60 earth radii away – so the earth's pull at the
moon should be about 1/60^2, or roughly 1/3,600, of its pull at the surface,
which matches the acceleration needed to keep the moon in its observed orbit.
Using mathematics we have accomplished something that we couldn't achieve with
physical testing.
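
The same reasoning can be checked numerically. Here is a minimal sketch of Newton's 'moon test', using textbook values for the earth's surface gravity, the earth's radius and the moon's orbital period (none of which are quoted above, so treat them as assumed inputs): the inverse-square prediction for the earth's pull at the moon's distance lands almost exactly on the acceleration required to keep the moon in its orbit.

```python
import math

# Newton's "moon test": if gravity falls off with the square of the distance,
# the earth's pull at the moon's distance should be g / (d / R_earth)^2,
# and that should match the centripetal acceleration of the moon's orbit.
g = 9.81              # surface gravity, m/s^2 (assumed textbook value)
R_earth = 6.37e6      # radius of the earth, m (assumed textbook value)
d = 3.84e8            # average earth-moon distance, m
T = 27.3 * 24 * 3600  # moon's orbital period, s (assumed textbook value)

ratio = d / R_earth                      # ~60 earth radii
predicted = g / ratio**2                 # inverse-square prediction
observed = 4 * math.pi**2 * d / T**2     # centripetal acceleration of the moon

print(f"the moon sits {ratio:.1f} earth radii away")
print(f"predicted acceleration: {predicted:.5f} m/s^2")
print(f"observed acceleration:  {observed:.5f} m/s^2")
```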
Not only does a scientific theory
work best when it is formulated so that, in the Popperian sense, it produces
highly falsifiable implications; one must also be able to distil from the theory
a vast nexus of predictability – in the case of Newton's laws, a web of
implications about the behaviour of all masses under forces, including gravity.
Given that we can distil from this
theory a vast nexus of predictability, we can infer this web of implications
from the mathematics underpinning the law – we do not have to put a body in
various regions of space and repeat-test the theory. We cannot, of course, put
the moon on a set of scales, but we do not have to: there are easier methods.
We know that the only possible closed orbit under Newton's laws is an elliptical
one, and we also know that the stronger a body's gravity, the faster a satellite
must travel to hold an orbit of a given size. By sending a satellite to orbit the
moon and observing its period and distance, we can measure the moon's mass quite
accurately – something Newton couldn't have done in his day, of course.
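
In Newtonian form, Kepler's third law, M = 4*pi^2*a^3 / (G*T^2), is what turns a satellite's observed orbital radius and period into the mass of the body it circles. The orbit figures below are illustrative round numbers for a low lunar orbit rather than data from any particular mission:

```python
import math

# The moon's mass inferred from a satellite's orbit via Kepler's third law:
#   T^2 = 4*pi^2*a^3 / (G*M)   =>   M = 4*pi^2*a^3 / (G*T^2)
G = 6.67e-11    # gravitational constant, m^3 kg^-1 s^-2
a = 1.84e6      # orbital radius, m: roughly 100 km above the lunar surface (illustrative)
T = 7080.0      # orbital period, s: roughly 118 minutes (illustrative)

m_moon = 4 * math.pi**2 * a**3 / (G * T**2)
print(f"inferred mass of the moon = {m_moon:.2e} kg")   # ~7.4e+22 kg, close to the accepted 7.3e+22 kg
```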
Nowadays, we can even work out the
effect the moon has on our seas and calculate its mass from that, though the
margins of error would be greater if this were the only method we had. In the
past, the moon and the earth were closer together (they are drifting apart at a
rate of about 3.8 cm per year), so the gravitational pull would once have been
much stronger. Simple calculations tell us that, if that were the case, the
tides would then have been higher than they are now. Once again, consistency is
found in such theorising: geologists frequently find fossilised tidemarks
demonstrating that tides were indeed higher in the past – and, subject to other
earthly consistencies, future tides should become lower as the earth and moon
separate further.
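
To see why a closer moon means higher tides, note that the tide-raising effect is a differential pull: it scales roughly as 2*G*M(moon)*R(earth)/d^3, so it is far more sensitive to distance than the raw gravitational force. The sketch below compares today's distance with a purely hypothetical earlier distance of half today's value:

```python
# Rough scaling of the moon's tide-raising effect with distance.
# The differential (tidal) acceleration across the earth ~ 2*G*M_moon*R_earth / d^3.
G = 6.67e-11        # gravitational constant, m^3 kg^-1 s^-2
m_moon = 7.3e22     # mass of the moon, kg
R_earth = 6.37e6    # radius of the earth, m
d_now = 3.84e8      # present average earth-moon distance, m
d_past = d_now / 2  # hypothetical earlier distance, purely for illustration

def tidal_acceleration(d):
    """Approximate difference between the moon's pull on the near side and on the centre of the earth."""
    return 2 * G * m_moon * R_earth / d**3

now, past = tidal_acceleration(d_now), tidal_acceleration(d_past)
print(f"tide-raising acceleration today:     {now:.2e} m/s^2")
print(f"at half the distance (hypothetical): {past:.2e} m/s^2  ({past / now:.0f}x stronger)")
```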
Given the position of an orbiting
body at two points in time, Newton's laws will also tell us where that object
will be at any point in the future. The better a theory, the greater its
predictive value, in so far as it produces accurate and useful forecasts that one
can anticipate, test and then verify or falsify. With theories such as motion,
gravity and evolution, our predictions are repeatedly confirmed with localised
evidence and simple mathematical equations. In the case of Newton, anything we
observe in orbit is forbidden, by his laws, to behave in a way that departs from
their predictions and implications.
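
That claim (fix the initial state and the whole future orbit follows) can be checked numerically. The following is a minimal sketch rather than an astronomer's tool: it takes a satellite in a circular orbit around the earth, steps it forward under Newton's inverse-square attraction with a simple leapfrog integrator, and confirms that after one predicted period it returns to within a few metres of where it started. The orbital radius and step count are illustrative choices.

```python
import math

# Propagating an orbit from an initial state using Newton's law of gravitation
# and a simple leapfrog (kick-drift-kick) integrator.
GM = 3.986e14                      # G times the mass of the earth, m^3/s^2
x0, y0 = 7.0e6, 0.0                # initial position, m (roughly 600 km altitude; illustrative)
x, y = x0, y0
vx, vy = 0.0, math.sqrt(GM / x0)   # speed that gives a circular orbit at this radius

def accel(px, py):
    """Acceleration from the inverse-square law, directed towards the earth's centre."""
    r = math.hypot(px, py)
    return -GM * px / r**3, -GM * py / r**3

period = 2 * math.pi * math.sqrt(x0**3 / GM)   # predicted orbital period, about 5,830 s
steps = 6000
dt = period / steps
ax, ay = accel(x, y)
for _ in range(steps):
    vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # half kick
    x += dt * vx;        y += dt * vy          # drift
    ax, ay = accel(x, y)
    vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # half kick

drift = math.hypot(x - x0, y - y0)
print(f"after one predicted period the satellite is {drift:.1f} m from its starting point")
```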
However, Newton's laws did run into
trouble in the late 1800s, as Maxwell's theory of electromagnetism was
propounded, describing all electromagnetic phenomena and predicting the
existence of electromagnetic waves. The electromagnetic field is a field that
exerts a force on charged particles, and the presence and motion of those
particles in turn shapes the field. The discovery that a changing magnetic
field produces an electric field, and that a changing electric field generates
a magnetic field, gave us electromagnetic induction – which laid down the
foundations for the vast array of electrical innovations (generators, motors
and transformers) that followed.
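
Faraday's law captures induction in one line: the voltage induced in a coil is proportional to how quickly the magnetic flux through it changes, EMF = -N * dPhi/dt. The coil and field values below are made up purely for illustration:

```python
# Faraday's law of induction for a simple coil: EMF = -N * dPhi/dt,
# where the flux through each turn is Phi = B * A.
N = 200        # number of turns in the coil (illustrative)
area = 0.01    # cross-sectional area of the coil, m^2 (illustrative)
dB = 0.5       # change in magnetic field strength, tesla (illustrative)
dt = 0.1       # time over which the field changes, s (illustrative)

d_flux = dB * area          # change in flux per turn, webers
emf = -N * d_flux / dt      # induced voltage, volts
print(f"induced EMF = {emf:.1f} V")   # -10.0 V; the sign encodes the direction (Lenz's law)
```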
Again, the predictive value here is
essential – there must be uniformity and regularity for such endeavours to
succeed. The theoretical implications of electromagnetism brought about the
development of Einsteinian relativity, from which it became evident that
electric and magnetic fields are interconvertible under relative motion – how
much of the field an observer perceives as electric and how much as magnetic
depends on the relative motion of the observer and the source. This allowed us
(among other things) to correctly predict how the force required to accelerate
a particle grows without limit as it approaches the speed of light (and it led
us further to the realisation that space-time does not quite correspond to our
Euclidean intuitions, nor to our intuitive view of past, present and future).
The electromagnetic force is one of the four fundamental forces of nature: the
electromagnetic field exerts a fundamental force on electrically charged
particles. Add to this the other fundamental forces – the strong and weak
nuclear forces (the former is what holds atomic nuclei together) and the
aforementioned gravitational force – and we have the four fundamental forces of
nature, from which all other correlative forces (friction, tension, elasticity,
pressure, the spring force of Hooke's law, etc.) are ultimately derived. Aside
from gravity, the electromagnetic force governs virtually all the phenomena
encountered in daily life – the objects we see in our day-to-day lives consist
of atoms containing protons and electrons on which the electromagnetic force
acts, and the forces involved in interactions between atoms can be traced to
the electric charges inside those atoms.
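
The force the electromagnetic field exerts on a charged particle is summarised by the Lorentz force law, F = q(E + v x B). A minimal sketch, with the field strengths and velocity chosen purely for illustration:

```python
import numpy as np

# Lorentz force on a charged particle: F = q * (E + v x B)
q = 1.602e-19                      # charge of a proton, coulombs
E = np.array([0.0, 0.0, 1.0e3])    # electric field, V/m (illustrative)
B = np.array([0.0, 0.5, 0.0])      # magnetic field, tesla (illustrative)
v = np.array([2.0e5, 0.0, 0.0])    # particle velocity, m/s (illustrative)

F = q * (E + np.cross(v, B))
print("force on the particle (N):", F)
```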
But even though Newton’s laws were
improved upon, they were still good approximations to reality. Newton's
universe was a mechanical universe which has been supplemented by the likes of
Maxwell, Einstein, Schrödinger, and Heisenberg, who themselves laid down the
foundations for all the 20th century physics and cosmology that was to come.
Newton appeared to be right for over two hundred years, but 19th-century
discoveries caused us to reassess his theories and, in this case, augment them.
The two main measures we have of a
theory's veracity are the ability to make accurate predictions from it, and the
localised evidence for it. As Newton has shown us, all scientific theories are
only approximations of what is really at the heart of a complex nature.
Approximations are not necessarily inaccurate, but are instead simplified
models that apply under certain conditions. Newton's laws still work in
situations that are non-relativistic (that is, at speeds much less than the
speed of light), but Einstein’s theories work for both non-relativistic and
relativistic situations. Einstein, Maxwell, Schrödinger, Heisenberg and any
subsequent physicist and cosmologist all owe Newton a great debt – we are
observing that science is progressive and that theories are there to be
built upon.
Once a theory is reached that
reconciles quantum mechanics and general relativity, we may see that
Einsteinian relativity in its current form will be viewed as inadequate. Just
as special relativity provided a framework that included both Newtonian mechanics
(as an approximation at low speeds) and Maxwell's equations, demonstrating how
they coexist in a broader relativistic context, so too a future theory (perhaps
even a theory of multi-dimensions) will almost certainly resolve the current
tension between quantum mechanics and general relativity. And any such theory
that unifies the two would have to be consistent with separation of scales
between high and low ends of the complexity spectrum – that is, the quantum
effects which mostly deal with the "very small" (that is, for objects no
larger than ordinary molecules) and the relativistic effects that deal mostly
with the "very large". Aspects of string theory, superstring theory
and quantum gravity suggest progress, but the grand theory that unifies quantum
mechanics with general relativity eludes us at present.
Applying this to epistemology, we
can see that all these theories have provided provisional approximations of
nature that produce highly accurate and useful predictions, but that by
themselves they do not encapsulate a self-consistent whole. Newton’s approximations
are better at slow speeds, but once we approach the speed of light the
discrepancies start to show, and we require what is called a "relativistic
correction" to Newton's predictions. But although we do not accept
scientific theories as if they are the final word on a cosmic universal truth,
we know that because we can test their implications with degrees to which
certainty prevails at the greater universal levels than at the local levels (we
rely on mathematics to prove this) we actually possess greater degrees of
certainty about the universal levels than we do the local levels.
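
That relativistic correction can be made concrete by comparing Newtonian momentum, p = m*v, with the relativistic expression, p = gamma*m*v, where gamma = 1/sqrt(1 - v^2/c^2). A small sketch (for an arbitrary 1 kg body) shows the discrepancy is negligible at everyday speeds and grows without limit as we approach the speed of light:

```python
import math

# Newtonian momentum (p = m*v) versus relativistic momentum (p = gamma*m*v).
c = 2.998e8   # speed of light, m/s
m = 1.0       # mass of the body, kg (arbitrary)

for fraction in (0.001, 0.1, 0.5, 0.9, 0.99):
    v = fraction * c
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    newtonian = m * v
    relativistic = gamma * m * v
    print(f"v = {fraction:>5.3f}c   relativistic / Newtonian momentum = {relativistic / newtonian:.3f}")
```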
If we were merely non-mathematical
creatures relying on local evidential observations, the best we could do would
be simple deduction with some further intrepid attempts at induction.
But with laws, axioms and the vast nexus of contingency that is woven into the
mathematical fabric we can make grand theories at the universal level that we
know will apply at the local level too. Because of our mathematical fecundity
we have made predictions about, and found consistency in, masses, motion, and
forces, and we are as certain about these as we are about most localised
discoveries.
There is
an important balance to be struck between the broad applicability and predictive
power of universal laws and the localised contexts in which they are applied.
For example, Newton's law of universal gravitation applies to all objects - it
applies equally to
planets, apples, and snooker balls. But when observing a snooker ball on a
table, local factors such as friction, air resistance, and imperfections in the
surface of the baize can introduce uncertainties. These local conditions can
make precise predictions about the ball's motion more challenging than applying
the general laws to predict gravitational interactions between celestial
bodies. The cosmological model of the Big Bang provides a framework for
understanding the cosmic narrative of the universe as a whole, predicting
phenomena like cosmic microwave background radiation and the large-scale
structure of the universe. But at the local level, when trying to model the
formation of a specific star or planet, numerous local variables (such as the
presence of nearby stars, gas density, and magnetic fields) can introduce
complexities and uncertainties that unsettle the potential accuracy of
predictions. Or take the laws governing radioactive decay: they are consistent,
and decay can be predicted with high accuracy over large populations of atoms
(for example, by calculating the half-life of a particular isotope). But
predicting the exact
moment when a specific atom will decay is inherently uncertain due to quantum
mechanics. These local uncertainties do not detract from the reliability of
the universal laws; they simply illustrate how local predictions can be less
reliable despite the overall framework being robust.
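
The decay example can be made concrete. For a large sample, the surviving fraction follows N(t) = N0 * (1/2)^(t / half-life) with great precision, while the decay time of any single atom is a random draw from an exponential distribution. The half-life below is carbon-14's, used purely as a familiar example:

```python
import math
import random

half_life = 5730.0     # half-life of carbon-14 in years (used purely as an example)
n0 = 1_000_000         # initial number of atoms in the sample

# Population level: deterministic and highly predictable.
for t in (0, 5730, 11460, 22920):
    remaining = n0 * 0.5 ** (t / half_life)
    print(f"after {t:>6} years, about {remaining:,.0f} atoms remain")

# Individual level: each atom's decay time is a random draw from an exponential
# distribution whose mean is half_life / ln(2); no law tells us the exact moment.
mean_lifetime = half_life / math.log(2)
decay_times = [round(random.expovariate(1 / mean_lifetime)) for _ in range(5)]
print("decay times of five individual atoms (years):", decay_times)
```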
Science may not provide us with all
the answers, but its rewards are evident in the human progress it has
ushered in; science by its very definition should always lead to progression,
and every Kuhnian paradigm shift ought to qualitatively supersede the last. It
is easy to look back into history and be under the illusion that many of these
advancements were quick and easy, but they were not. Einstein's relativistic
framework did not swiftly refine classical mechanics to accommodate Maxwell's
electromagnetism, yet the retrospective viewpoint may give us the illusion that
the transition was smooth. When one thinks of the many other transitions – not
just from
Newton, Maxwell and Faraday to Einstein, Schrödinger and Heisenberg, but from
the Ptolemaic cosmological view to the Copernican view; from classical
mechanics to quantum mechanics; from Becher's phlogiston theory to Lavoisier's
caloric theory, right through to the science of thermodynamics;
from Lamarckian inheritance to Darwinian natural selection, and the
reconsideration of Lamarck's ideas with 'epigenetics', which identifies possible
inheritance of acquired traits - what these shifts (and many others) ought to
tell us is that we are always in transition and ought to be prepared for black
swans and new knowledge that will augment our present foundations.