Wednesday, 19 December 2018

Misunderstanding Inequality: Heroes & Villains

The world has seen unprecedented economic growth in recent times, and alongside it, unprecedented increases in capital inequality. Many people feel joy about the first fact and despair about the second - so much so that many of the nation's problems seem to get blamed on inequality, even the fact that some families are struggling to make ends meet.
A moment's thought should make it obvious that the problem with someone's economic hardship is their absolute state of well-being, not their well-being relative to rich people. Alas, despite being obvious, it is doubted by many - primarily, one imagines, because of a) envy, and b) being tripped up by the fixed-pie fallacy. The reality is that economic growth naturally engenders marked differences in people's incomes. In fact, much of the time one person's large wealth increase affords many others an opportunity to work, as anyone who has ever been in McDonald's, Sainsbury's or Domino's would know.
It's also the case that economic growth actually dispenses disproportionately greater benefits to the poorest in society - because not only does it create jobs, which creates money to spend, which creates more jobs, and so on - it also improves their consumption. Once upon a time only the richest people had televisions, cars and mobile phones. Now most people have these things, not to mention access to the entire world's knowledge at the touch of a button, and the uncountable other riches that our forebears would have thought impossible.
The upshot is, wealth inequality is a so-called problem that's hugely overblown. As long as everybody's absolute well-being and standard of living continues to increase, the income gap isn't much of a problem - quite the contrary, it's a natural non-linear feedback effect of a free market of voluntary exchanges.
In fact, monetary inequality, with concentrations of capital in the top quintile, is actually evidence that a lot of people are becoming better off - the capital accrues precisely because so many people choose to purchase those goods and services through voluntary transactions. And as the gap between richest and poorest is narrowing in pretty much every area apart from capital, it's easy to see that we are becoming more equal, not more unequal.
Inequality through the lens of discrimination
Moreover, as I explained in this Blog post, most inequality is due to rational decisions made by people - called statistical discrimination - that play out in terms of society's revealed preferences. In terms of incentives, statistical discrimination is one of the easiest to understand in society, despite some people's distaste for it.
Consider when the European Court of Justice (ECJ) ruled that the long-established practice of setting insurance prices according to sex is illegal discrimination. Or consider that even though women statistically live longer than men, insurance companies are no longer allowed to offer different annuity values to men and women.
So men come off better on car insurance, effectively being subsidised by safer-driving women - which is the same as saying that women come off worse by paying a penalty for less safe male drivers. Both positions are reversed in the annuity case.
When everyone in a particular group is homogenised, the statistical variances are cancelled out as individuals are assessed based on the characteristics of the group as a whole, not on their own merits and demerits. In many cases this is obviously foolish and wrong. It's much better if men and women are not treated the same in terms of insurance premiums, because women ought to be rewarded for being statistically safer drivers.
The inevitable consequences are that some very unsafe women drivers benefit on the back of the average safety of women drivers, and some very safe male drivers lose out because of the average safety of male drivers. Safe male drivers are discriminated against not proximally because of their sex, but more distally because they are pooled with characteristics that we commonly associate with less safe driving.
Another example of discrimination occurring when people are pooled with a group that has an easily identifiable weighted average: if women are more likely to leave work in their thirties to have children, then some employers are more likely to choose men, even if it means missing out on better talent. Some cry foul of unfair discrimination, but sometimes it is a rational thing to do, as employers look to increase the probability of stable utility and efficiency.
But that's not the end of it. Suppose 32-year-old Jack and 32-year-old Jill are going for a job as project manager in Bob's company. Everything else is equal, so Bob hires Jack purely on the following probabilistic grounds: there is a chance that Jill may wish to take time off for motherhood and possibly revert to part-time hours thereafter. Now suppose the same scenario with one difference - Jill is slightly better than Jack, but doesn't get hired because Jack is less of a risk in terms of future motherhood. Jill goes on to get a job as project manager in Margaret's company, has a child five years later and returns to work after six months.
Bob's preference for Jack over Jill when they were equal was probably a rational choice, but when Jill was better, Margaret gained by recruiting a better project manager, whereas Bob gained a decent project manager too. In other words, rational discrimination usually produces a levelling effect, and employers know that irrational discrimination is an imprudent recruitment policy that hits them in the pocket.
The underlying reality about statistical discrimination is twofold: a) it's almost impossible to detect in the first place because people's real motives are not in full view of the public; and b) in a society that values liberty and freedom of choice, people should be perfectly free to statistically discriminate any way they wish. Generally I favour the egalitarian, classical liberalism (of the Hume, Smith, Ricardo kind), meritocratic ethos, and the view that individual pursuits and a bit of luck play important parts in our journey, as does the 'reap what you sow' maxim.
Consequently, then, while I'd hope for equality of opportunity wherever it doesn't unfairly disadvantage others, I don't expect equality of outcomes, and I think many people trying to interfere in society to correct things that don't need correcting are misjudged and making the situation worse. There are several, often connected and complementary, reasons why unequal outcomes occur, and why that is no bad thing - reasons that should be more obvious than they are:
1) The effort people put into things is unequal, which should rightly yield unequal outcomes. Those who work hard and study hard have a better chance of being rewarded for their efforts - and that should be encouraged. A surgeon should be higher on the income ladder than a taxi driver or a tyre fitter, and I don't want society to be less unequal in this regard.
2) The risks and inconveniences people take are unequal, which should rightly yield unequal outcomes. People who do jobs with highly scalable outputs, risky jobs, dangerous jobs and jobs with unsocial shift patterns should be paid commensurate to these factors, and once again I don't want society to be less unequal in this regard either.
3) The "slings and arrows of outrageous fortune" inequality. This one is often overlooked or not given enough attention, but there are quite natural inequalities because nature is not very democratic at all. When it comes to health, looks, size, shape, talents, intelligence, sensory apparatus, opportunity and background, nature is far from even-handed - there is a notable difference in all of these human qualities in each of us, as their attainment depends on undemocratic things like fortune and pursuit. A significant proportion of outcomes in society are down to luck, serendipity of circumstance and being in the right place at the right time, bringing about expected inequality of outcomes.
4) Rewards for innovation in a 'winner-takes-all' market. Most of the world's biggest gaps in income equality are because of innovators, entrepreneurs and job creators (usually one and the same) who have become wealthy by being good at providing things many people want. Markets are democratic in that consumers vote with their purchasing habits, and therefore inequalities of this kind are not a problem that needs addressing - they are the result of freely made human choices in a competitive marketplace.
What so many get wrong in this area of discourse is in the misattribution of causes for outcomes. It's a base fallacy narrative that, unless corrected, will continue to misinform them about the so-called unfairness and injustices in society. Inequalities that have legitimate causes based on the above four explanations are often misrepresented as societal injustices and misattributed to the plight of human infirmities - something I've blogged about numerous times before.




Sunday, 11 November 2018

The Circuit Board Of Mental Excellence

Minds are perhaps best thought of as circuit boards - with a coherent, consistent, accurate worldview represented by a set of small lights that all stay illuminated in conjunction with truth and facts. Information, in the form of propositions, enlarges the circuit board, increasing the assembly of data circuits and the copper that delivers electricity to illuminate the lights.

The size of the light display depends on the dimensions of the worldview, which itself primarily depends on the time, effort and intellectual rigour put into the process of building a larger and larger circuit board that can facilitate an ever-increasing light display. A genius polymath may have a light display about the size of a football field; a highly intelligent polymath may have one about the size of a tennis court; and an average person may have one about the size of a small back garden.

Now here's the key thing. You can increase the dimensions of the light display by learning more, and by increasing the connectivity of your mental artillery - and to some extent, most people do this on their life journey. But what they don't do enough is stand back and check the light display to see how many of the bulbs have gone out. Nor do they step back and observe clusters of light patterns that have gaps because the additional bulbs required to illuminate the pattern more comprehensively are missing.

A large light display with all the bulbs illuminated is extremely rare - but it ought to be the primary objective for anyone who strives for truth, facts, proficiency of reasoning and excellence of mind. The beauty of the light display is that its consistency of illumination is testament to the fact that one's ideas, thoughts, views and propositions fit nicely into the rest of the circuit board of experience. If you keep getting things right, the new beliefs operate consistently with the structural workings of the circuit board, not disrupting the electricity to other items on the board. But if you get things wrong, and introduce faulty viewpoints into the equation, you disrupt the electricity flow, both to local illumination clusters and to isolated bulbs elsewhere in the display.

Naturally, given the complexity of the mind, and the complexity of everything there is to know, and the near infinite ways to perceive reality, this is a really epistemologically intractable model of analysis. But it isn't that difficult to identify practical examples of how the malfunctioning of the circuit board may occur, as most people host mutually contradictory or incongruent ideas, especially due to identity-based dispositions, cognitive biases, emotional self-preservation and propensities for over-simplicity.

For example, suppose you're a Christian with a fairly comprehensive understanding of scripture, but because of your upbringing you've been infected with the cultural virus of young earth creationism, with limited recourse for correctives. The cluster of lights pertaining to Biblical exegesis and hermeneutics is going to be affected, as the connection between the conductors supplying the electrical power will be shorted. The high current flow of falsity will put out some of the theological lights, and prevent other bulbs from being added to the cluster. This will mean others see you as a Christian with inadequacies in several areas of discourse - especially when it comes to interpretation of the Biblical text and the related areas of science - which will have a corollary effect on the consistency of your worldview, and on the impression and influence you have on the world as a Christian.

Here's another example. Suppose you're quite economically and politically astute, but you become duped into believing that there is a systematically unfair gender pay gap in the UK, or that price controls on housing might alleviate the shortage. As with the first example, your circuit board will be negatively impacted, lights will go out in various areas across the display, and there will be patches in the clusters that never get the bulbs required for a full illumination of the pattern. You might start believing that price controls in other areas of the economy will do some further good; or you might start exaggerating the extent to which climate change alarmism is fruitful; or you might take your eye off the principles behind the law of least effort; or you might develop too much of a victim mentality, and so forth.

I could offer loads more examples, but I think the gist of this is crystal clear: there is always a price to pay for bad ideas, false beliefs and inadequate reasoning - and these things infect; first at the individual level, then at the familial level, then at the community level, and then more widely across societies and even nations as falsehoods spread memetically.

By equal measure, though, there are always rewards for building a prodigious circuit board that provides the power to a fully illuminated light display that consistently, coherently, factually and truthfully supports the ideas, views and beliefs associated with all the major and minor subjects, and the interconnectedness between them all. In fact, I'll take it further: there is no better way to live, and no more rewarding, necessary, or morally and intellectually compelling pursuit for any human being.

Wednesday, 7 November 2018

Good Cop, Bad Cop Economics: Bad Cop

For some of you this will sting a bit, but it has needed saying for a while now. It is you, the left, who stand accused of being the principal cause of most of the economic problems in your home country. You are largely to blame for the state of the NHS, for the problems with social care, for the high unemployment levels in the under-25s, for the fact that too many people are doing university degrees, for the housing shortage, for much of the decline of UK industry, even for the rise in inequality in this country. Moreover, you are also largely to blame for Donald Trump, for Brexit, and for the rise of far-right groups in the UK and across Europe.

Here is why you are to blame: your combination of wilful naiveté and strong opinions about the things mentioned above encourages the people who govern us to adopt all manner of foolishness, on the basis that they think you will throw them out of office if they don't acquiesce. For decades you have been complicit in creating a political climate based on nonsensical arguments, shoddy counterfactuals and lazy myopia - fervently endorsing ideas and policies that are based on factual misinformation or poor reasoning, or that idly ignore or overlook large swathes of the population who feel the costs of your decisions.

Take the most obvious case in point - the NHS. You have for years treated it like a sacred religion, and forced politicians to live with the lie that it can be sustained in the same way it was for the first four decades of its existence. Your belligerent demands for it to be safeguarded from proper market-based resource allocation have pressured politicians to ignore the supply and demand crisis, the costing crisis, and the fact that longer lifespans, an ageing population, the increasing number of diagnoses and the increasing technological scope for medical advancement mean it is no longer operating under the framework of a few decades ago.

You have created this problem by making politicians so terrified to respond to these realities that all they can do is resort to pathetic party-political tit-for-tat squabbling about who puts more money into the NHS and under whose leadership it would fare less badly. You have left them feeling like they have no option but to behave this way.

There are plenty of other cases where you have done something similar. For decades you have applied social duress on our politicians and made them believe that the only chance they have of carving out a political career is to publicly endorse a whole menu of economic foolishness that satisfies your beliefs. Where there is inequality, and people struggling to make ends meet, people struggling to find work, people in developing countries struggling to enter the competitive global marketplace, and people struggling to pay their rent or to afford to live in big cities - you must take a lot of responsibility for these things, because the truth is, the government you have been complicit in fattening up is responsible for pretty much all of these problems (and that is no exaggeration).

You have demanded that the political institutions that put up barriers to free trade become ever bigger and more powerful; you have pressured politicians to perpetuate facile price floors like the minimum wage; you have made ever more voracious demands on the earnings of the nation's most prodigious innovators and job creators; you have intransigently peddled the long-standing fairytale that the answer to most of the nation's problems is higher taxation and more public sector involvement in our industries; and you have repeatedly promoted the economically toxic policy of domestic subsidies and the bailing out of financially deteriorating businesses and industries.

In short, you have pressured the politicians of all parties into normalising bad policies - policies that increase unemployment and make it harder for the unemployed to obtain work; policies that stunt job creation; policies that make the cost of living higher for ordinary working people; policies that stifle growth and dissuade outside investors; policies that keep regions in industrial atrophy; and policies that make vital public services like health far more vulnerable to financial crises than they need to be.

For decades you have insisted on injudicious political ideas, and then demanded that the only people fit for governance are people who will enforce these ideas. And an equally bad (arguably worse) knock-on effect of this is that most of you let politicians get away with not having to proficiently justify their policies at a level beyond the superficially inane. You will almost never see a politician under even the slightest bit of pressure to admit the costs of a policy as well as the benefits, nor even acknowledge that most policies profit a small proportion of the population at the expense of a larger (unconsidered) group.

How preposterous it is that enshrined in our cultural climate is the habitual dismissal of majority groups affected by a policy, and the normalisation of anaemic economic arguments. And how sad that the politicians that govern us have to survive on spin, on popularity-mongering and by forever being afraid to admit their mistakes, or of having a public change of mind, or of introducing a prudent policy if it's unpopular.

You have played a big part in creating this monster and the concomitant lefty social commentators and politicians that feed on its body lice. And this needs to be at the forefront of your mind every time you open a paper or turn on the TV and see what you think is an injustice, or a group of people struggling to find work or live as comfortably as they could be. Most of the things you complain about and strive to fight against are creations of your own making. You are like Geppetto smashing up Pinocchio with a hammer, or like Victor Frankenstein taking an axe to the monster you've spent decades creating.

Fear not, though - the next Blog post, following this Bad Cop offering, will take the Good Cop approach to remedying all that's wrong as per the above criticisms. I will offer a solution showing how all this can be put right, with a pretty radical but effective overhaul of our current framework.

Tuesday, 23 October 2018

On Evolution & Random Walk

In one of my most popular papers, I wrote about how the universe is governed by a biased random walk, giving us a Divinely choreographed mathematical skew that can create enough order to facilitate stars, evolution and life. Many of you have asked whether, in that case, biological evolution is also governed by a biased random walk. Actually, nobody has asked that - but it's one of the most interesting and intelligent questions a reader could have asked, because it's exactly the right sort of question to be asking.

As a reminder, a random walk describes a path derived from a series of random steps in a mathematical search space - so, for example, whether a drunk man staggers left or right with each step of the walk is entirely random (a 50% chance of each), as is his final destination after N steps. Using this model, if you mapped the drunkard at point A and tried to predict his position after, say, 100 steps, you would not be able to deterministically predict his final location.
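To make the drunkard's walk concrete, here is a minimal sketch in Python (the function name and parameters are my own, purely for illustration):

```python
import random

def random_walk(steps, seed=None):
    """A drunkard's 1-D walk: each step goes left (-1) or right (+1)
    with equal probability; return the final displacement from the start."""
    rng = random.Random(seed)
    return sum(rng.choice([-1, 1]) for _ in range(steps))

# Five walkers starting from the same point end up scattered -
# no final position can be predicted in advance, only the statistics.
finishes = [random_walk(100, seed=s) for s in range(5)]
```

Run it a few times and the finishing positions scatter unpredictably around the starting point - exactly the indeterminacy described above.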
So let's ask the question, then: is biological evolution governed by a random walk process? The answer is yes and no, but mostly yes. If the constituent parts of evolution's mathematical space (which I've called 'morphospace') were all randomly walking through evolution with their distinct genetic drift and mutations, contingency says that, like a group of drunkards walking independently, we would expect them to have arrived randomly at different evolutionary endpoints. However, evolution is not a purely random walk process - and there are two reasons why: one is fairly simple, and the other is pretty complex. Let's start with the simple one, as discussed by Richard Dawkins in his book The Blind Watchmaker.

“I don't know who it was first pointed out that, given enough time, a monkey bashing away at random on a typewriter could produce all the works of Shakespeare. The operative phrase is, of course, given enough time. Let us limit the task facing our monkey somewhat. Suppose that he has to produce, not the complete works of Shakespeare but just the short sentence 'Methinks it is like a weasel', and we shall make it relatively easy by giving him a typewriter with a restricted keyboard, one with just the 26 (capital) letters, and a space bar. How long will he take to write this one little sentence?”

That describes what evolution would be like if it was a random walk process. But of course, it isn’t, as Dawkins is happy to acknowledge:

“We again use our computer monkey, but with a crucial difference in its program. It again begins by choosing a random sequence of 28 letters, just as before ... it duplicates it repeatedly, but with a certain chance of random error – 'mutation' – in the copying. The computer examines the mutant nonsense phrases, the 'progeny' of the original phrase, and chooses the one which, however slightly, most resembles the target phrase METHINKS IT IS LIKE A WEASEL.”

The sequences progress through each generation, resembling the target phrase more and more closely until it is finally reached.

What this describes is a ratchet effect (cumulative selection), where beneficial traits lock into place - rather like card players getting to keep their favoured cards after each shuffling of the deck. To expound on this, evolution requires four fundamental things to underpin the system:

1) Variation: there is variation in traits.

2) Inheritance: these variations can be passed on to offspring.

3) Differential survival(/reproduction): given the reproductive potential of most organisms, a population should be able to grow (though this is not always what happens, of course).

4) Natural selection: those with heritable traits that make them more likely to survive tend to pass on more of their genetic material.

The above example of METHINKS is not precisely illustrative of how natural selection works; rather, it is illustrative of how cumulative selection can lead to rapid change over a relatively short period of time. The analogy was used to answer the criticism that there has not been sufficient time for particular structures to evolve by "random chance". The analogy shows that random variation can lead to rapid organisation of structure, provided that there is selection for the structure. The analogy defends the 'rapid' part, not the 'selection' part.
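To see cumulative selection in action, here is a minimal Python reconstruction along the lines Dawkins describes - my own sketch, not his original program, and the population size and mutation rate are illustrative assumptions:

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "  # 26 capital letters plus a space bar

def weasel(pop_size=100, mutation_rate=0.05, seed=0):
    """Cumulative selection: copy the current phrase with occasional random
    errors, keep whichever copy best matches the target, and repeat.
    Returns the number of generations taken to reach the target."""
    rng = random.Random(seed)
    parent = "".join(rng.choice(ALPHABET) for _ in TARGET)
    generations = 0
    while parent != TARGET:
        offspring = ["".join(c if rng.random() > mutation_rate
                             else rng.choice(ALPHABET) for c in parent)
                     for _ in range(pop_size)]
        # 'Fitness' is simply the number of characters matching the target
        parent = max(offspring,
                     key=lambda s: sum(a == b for a, b in zip(s, TARGET)))
        generations += 1
    return generations
```

Single-step chance would need on the order of 27^28 attempts to hit the 28-character phrase; cumulative selection typically reaches it within a few hundred generations, which is the whole point of the analogy.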

Suppose a specific complex sequence, such as the letters that make up the word 'METHINKS', corresponds to something complicated, like a human eye. The chance of hitting a sequence like 'METHINKS' by fortuity alone is very small - about 1 in 209 billion (that is, 1 in 26 to the power of 8, for this 8-letter sequence drawn from an alphabet of 26 letters). Similarly, conjuring up a human eye out of nothing has a vanishingly small probability - it might as well be zero. But, as I said, this is a poor analogy for evolution, because evolution acts as a 'ratchet': when a correct letter clicks into place, it stays there (indicated below by capital letters), so the target phrase can be reached in far fewer attempts, say 40:

1) 'sgfcsgot' ...

10) 'fETopcrS' ...

20) 'xETrINsS' ...

30) 'METoINKS' ...


Now the question is, doesn't there have to be an intelligence to compare the target sequence 'METHINKS' against the sequence that evolution is trying out, or in real terms, the comparing of the 'proto-eye' to the target eye that is evolving? Well, in evolution, intermediates give advantages, and when those advantages accumulate, like in poker when you keep the cards you need for a good hand and toss out the bad ones, more sophisticated survival parts are created.
In the model above, the first attempt corresponds to being totally blind. The 10th try might correspond to a patch of photosensitive cells, so the organism can know if it is light or dark. The 20th try might correspond to ridges forming around these cells, so they sit in an indentation, and the shadows of the ridges could give some information about which direction the light is coming from. The 30th try could correspond to the ridges starting to close up, so the light comes only through a small hole, giving the organism much better information about the direction of the light, like a pin-hole camera. The last, 40th try could correspond to a lens forming over this hole, which focuses light onto the photosensitive cells, resulting in a high-quality image.

The point is that 1% of an eye is better than no eye, and 50% of an eye is better than 20% of an eye, and so on. At all stages, this extra light information available to the organism improves its survival value, and so the genes for making 1%, or 20%, or 80% of an eye are preferentially passed on to future generations. It's not that an intelligence compares 20% of an eye to a complete human eye and says 'ah, this is better than its cousin's 15% of an eye - I will let it pass on its genes'. It is simply that when a predator comes along, the individual with 20% of an eye will see it before its cousin does, so the cousin gets eaten and does not pass on its genes for making the 'inferior' 15% of an eye, while the 20% individual does pass on its genes. Of the offspring, some might have 19% of an eye, others 21%. The 21% individuals will be more likely to survive, and their offspring might have 22% of an eye, and so on, all the way from humble beginnings until a complete, complicated and accurate eye is formed. The target sequence above merely corresponds to something that aids differential reproduction.
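The 'n% of an eye' logic can be simulated directly. This is a toy model of my own devising - the population size, variation and survival rule are illustrative assumptions, not biology:

```python
import random

def evolve_eye(generations=200, pop_size=50, seed=0):
    """Each individual carries a 'percentage of an eye' trait. Offspring
    inherit the parent's value plus small random variation; the
    better-sighted half survive to reproduce. No designer compares
    anything to a finished eye - the average simply ratchets upward."""
    rng = random.Random(seed)
    population = [1.0] * pop_size        # everyone starts with 1% of an eye
    for _ in range(generations):
        # differential survival: the better-sighted half live to breed
        survivors = sorted(population, reverse=True)[:pop_size // 2]
        # inheritance with variation: each survivor leaves two offspring
        population = [min(100.0, max(0.0, p + rng.gauss(0, 0.5)))
                      for p in survivors for _ in range(2)]
    return sum(population) / len(population)
```

Run for a couple of hundred generations, the population average climbs from 1% towards the 100% cap, even though every individual step is only a small random variation.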

Having established all that, here is where we get to the more complex considerations, because underneath all this is a highly complex mathematical picture, which gives us another way to consider random walk. What's happening in biological evolution underneath that layer is that there is a huge biochemical morphospace with a connected structure through which evolution's reducible complexity can traverse. Take, for example, irreducible complexity and reducible complexity - they refer to the arrangement of stable organic structures in evolution's 'morphospace', but they cannot primarily be understood at the level of the organism, because morphospace is not an adaptive landscape where we visualise the relationship between genotypes (or phenotypes) and reproductive success and model fitness on the height of the landscape. Morphospace is a mathematical model of the form of connectivity between patterns - so a reducibly complex morphospace means that the biological structures that populate the evolutionary landscape form a connected group.

You may think of the system as being like a gigantic sponge made up of very tiny fibrils that connect the evolutionary structure together. If the connection has no broken-off parts, then the random walk of evolution can move across the whole structure. In fact, this is a particularly good illustration, because sponges are composed entirely of mobile cells which can move about between different layers of tissue and reallocate themselves to take on different tasks. Sponges have totipotency, which, as you may know, is the ability of a single cell to divide and produce all the differentiated cells in an organism. This allows any fragment of a sponge to regenerate into a self-sustaining organism.

So biological evolution is a random walk, but, as I said at the start, not an entirely random one. Firstly, the ratchet effect locks beneficial mutations in place. Secondly, although the biochemical engine of evolution is underwritten by probability envelopes concerning whether a particular genetic trait will be passed on to subsequent organisms - and that, at least in terms of the mutations themselves, does approximate a random walk - there are sufficient constraints on the system to bias the model in favour of order.

If we take an evolutionary starting point and then generate a random walk on the organism, the probability favours random walk statistics (in formal terms, a Gaussian probability distribution across the search space) - a bell-shaped probability curve with no evolutionary bias. In the actual evolutionary landscape, though, what happens is random walk plus incremental variants in the search space; that is, we see a bias in the system that conforms to the ratchet mechanism of natural selection's operation for fitness.
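To make the contrast concrete, here is a small sketch (my own illustration, with assumed parameters) comparing a free random walk with the same walk under a selection-like ratchet that rejects unfavourable steps:

```python
import random

def walk(steps, ratchet=False, seed=0):
    """Propose a random +1/-1 step each iteration. In the free walk every
    proposal is accepted (an unbiased, Gaussian-like spread of outcomes).
    With the ratchet, unfavourable (-1) proposals are rejected, the way
    selection discards harmful variants - the same random proposals now
    produce a directional trend."""
    rng = random.Random(seed)
    position = 0
    for _ in range(steps):
        step = rng.choice([-1, 1])
        if ratchet and step < 0:
            continue  # selection filters out the step that loses fitness
        position += step
    return position

free = walk(1000)                  # hovers near zero
locked = walk(1000, ratchet=True)  # climbs steadily
```

Same seed, same sequence of random proposals - only the filter differs, and that filter alone turns aimless drift into directed movement.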

What you have to remember is that by the time we get to the level of order in the universe that contains our planet’s chemical substrate, the majority of the cosmological groundwork has already been done. It’s a bit like showing a movie at the cinema: your viewing pleasure is the tiny end part that succeeds all the planning: the screenplay, the casting, the rehearsals, the production, and the filming that went into making the movie. In mathematical terms, biological evolution is like showing an ingenious movie at the cinema that God has already written and produced - what we see is the most accessible elements of a complex creation process that involved ingenious twiddling of the mathematical laws to eventuate in a biochemical random walk substrate on which biological life can flourish.

In terms of evolutionary genetics and inheritance, our movie-watching lens of analysis looks like a probabilistic search space of numerous configurational possibilities, one which generates successful survival machines, with genes using bodies as vehicles for propagation, many of them outliving their hosts by millions of years. Evolution, then, has an isotropic random walk directionality, but a relatively constrained search space in terms of the four fundamental underpinnings I mentioned earlier (variation, inheritance, differential survival and natural selection).

So, in simpler terms, going back to our group of drunkards on a random walk: if they all live in the same apartment block, if the neighbourhood in which they are walking has enough limitations on its road and path structure, and if they meet enough friends along the way to gently nudge them onto the right course, then they have a much better chance of all arriving back at the same place.
The kind of biological evolution we see from those cinema seats can only work if the randomness of the mutations plays out within a very small probabilistic search space - and the groundwork for this was already done when the blueprint for the universe was written into the laws of physics. By the time the second law of thermodynamics gets scrambled into action we have an intricately directed form of entropy: where biology is organised under the constraints of the information implicit in its machinery, and at the same time still remains within the ordinances of the second law of thermodynamics.

Wednesday, 17 October 2018

The Mathematical Bias Theory Redux: Why There Probably ‘IS’ a God – in 20 Steps

This was written about 8 years ago, and published as a Blog post about 6 years ago. It is the summarising thesis that makes up a whole book of material I've written on God, Philosophy and Mathematics. Today's Blog post is the newer, redux version - published today because it contains a few additional analogies and clarification points that should supplement the original work and amplify its key tenets.

Look around you and you’ll see a plethora of dubious theology and pseudo-science centred on Creationist ideas and Intelligent Design movements. I reject Creationism and Intelligent Design as being fabrications of the truth. What’s often missed, though, is that real, authentic science gives (I think) some kind of exhibition to the Divine Cosmic Mathematician behind the law and disorder of the cosmos. So here are a few ‘back of the envelope’ style jottings on why I think there almost certainly is a God. I’ve decided to call it The Mathematical Bias Theory: Why There Almost Certainly ‘IS’ a God – in 20 Steps.

In science we don’t start by assuming we have all the answers on a plate ready for easy consumption – we spend our time bringing together information and ideas on how to assess variable and diverse protocols, and we work hard to bring them into exquisite theoretical descriptions. To this end, and through a particular lens, science is descriptive inasmuch as it is about deciphering mathematical patterns that are imposed upon the substance of the cosmic order. There are many facets to this deciphering that remain too complex or too multi-dimensional for a full cognitive purchase, particularly when we talk of the deeper scientific questions. Even the patterns generated through our observations in, say, quantum mechanics are so complex that they only permit statistical descriptions. So, naturally, statistical descriptions are human constructions that approximate a reality ‘out there’ – and they can only be considered accurate to the extent that we can formally conjecture about them and create models and labels to communicate them.

Various proposals have been put forward by physicists about descriptions of nature; there are speculations about string theory, M theory and other conjectures about multi-dimensionality; conjectures that at sub-Planckian levels the universe has no dimensions at all and is just an arrow line. We’ve had different conjectures about what spacetime is, the nature of gravity, non-linearity effects in spacetime, a geometric duality that reverses linear dimensions and undermines spacetime, theories about the true nature of particles and waves, or differing kinds of energy and mass relative to differing speeds – the list goes on. All those examinations have one thing in common – they won’t tell us if there is a Divine Cosmic Mathematician underpinning it all, because they are heuristics that deal only with the descriptive aspects of nature’s law and disorder. As the last few thousand years of science has shown, our heuristics are almost always subject to augmentation with further knowledge and technology. Most importantly, though, descriptive science cannot eliminate the burden of contingency related to the ‘Why is there Something?’ question.

I’ve said all of the above for one good reason: grand theories that explain reality will not take the form of descriptive physics – they will take the form of a conflation of mathematics and philosophy, because both those subjects bootstrap our physical descriptions - and physical descriptions of reality are not complete, as they only simulate possibility.

What I’m going to say isn’t one bullet-pointed proof of God – it is a picture of a worldview that suggests to me that the cosmos is designed by a Cosmic Mathematician.

1) If mathematics is the language to describe the signature of God as some sort of Cosmic Mathematician, then nature can be modelled by some sort of mathematical template or blueprint that deals purely with constitution in numbers. This is because when seen through the mind of omniscience, nature as we know and engage with it must be amenable to statistical description if concepts like laws, patterns and information are to mean anything. For this reason, given that at a simple human level mathematics is the language we use to embed conceptual reality into patterns of description, I can conceive that a complete and totalising description of reality in the mind of omniscience will be (at least in one form) a complete set of information that consists of mathematical pattern storage.

2) As nature is reducible to bits of information, then to an omniscient mind (with no gaps in knowledge) the whole cosmic spectrum of law and disorder is computable - so even though omniscience has other forms of complex conception that we cannot grasp, we know of at least one way nature can be described to such a mind – as pure pattern storage.

3) From points 1 and 2, a fairly obvious corollary follows. Nature provides us with a form that is descriptively compressible. But descriptive science can only compress so far - we reach a point at which our road to compression hits a conceptual brick wall. Further, even compression doesn't tell the full story, because each physical compression requires an algorithm, so the ultimate compression of our cosmos must involve algorithmic precursors to enable that compression - whichever way we look at it, we seem to be faced with a reality that ‘just is’ – and that seems like a miracle.

Note: Mathematical compression is not to be confused with the reducible complexity found in the material substrate.

4) Given that information is measured using probability, and probability doesn’t have a negative value, the formal structures of data compressing equations and algorithms must always return ‘something’ not ‘nothing’ – so we can’t reach the point of reducing or compressing the universe out of existence or ‘to nothing’ any more than we could compress one of Hooke’s springs to zero.
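
The ‘cannot compress to nothing’ point can be illustrated with an off-the-shelf compressor. This is a rough sketch only - zlib is standing in for the idealised compression the argument describes, and the sample data is my own choice:

```python
import random
import zlib

ordered = b"AB" * 50_000  # a highly patterned 100 kB message
rng = random.Random(0)
disordered = bytes(rng.getrandbits(8) for _ in range(100_000))

c_ordered = zlib.compress(ordered, 9)
c_disordered = zlib.compress(disordered, 9)

# The patterned data shrinks dramatically; the random data hardly at
# all - and neither shrinks to zero bytes, any more than Hooke's
# spring compresses to zero length.
print(len(c_ordered), len(c_disordered))
```

However patterned the input, the output is always a positive ‘something’: compression redescribes, it never annihilates.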

Note: Most complex forms can only be converged upon algorithmically if either the algorithm is executed for a period of time far beyond the capacity of any human or computational machine or if we had access to the initial precursory conditions.  There’s no way of escaping it - given what we know of our universe, those precursory algorithms would have to be alarmingly complex if they underwrite our cosmos, because we know how alarmingly complex our cosmos is, even in its most elemental statistical descriptions.

5) When it comes to ultimate explanations, complexity only comes from precursors that are also at the upper end of the complexity spectrum. That's not to say, in the simple physics of our universe, that with a long execution time simplicity cannot produce complexity, because it can - but that's not a satisfactory 'ultimate' explanation, because it fails to eliminate the burden of contingency, and it doesn’t leave us with a plausible ‘just is’ closure – it only relates to mathematical patterns ‘within’ physics. An algorithm posited as an ultimate explanation must be scrutinised to provide a reason why it exploits a principle that is algorithmically ordered at all - and so, we are left with an endless regress of 'why?'

6) At the heart of “something” the mathematical configurations must be complex, because at every instance we are always left with complex brute fact algorithms. At the very least we know that any bootstrapping algorithms must have complex blueprints because we know for a fact that this universe has an incredibly complex blueprint. In fact, the algorithm that underwrites the cosmos may well be as long as the cosmic data itself, and that won’t just pop up out of nothing.

Note: Consider the patterning view of randomness - it is a dynamic that produces a sequence, which could be anything from a book of random numbers, through a computer printout, to the heads/tails sequence of coin tossing. If we have, say, a coin that we continue to toss, then as far as the patterning notion of randomness is concerned, the eventual sequence of heads and tails stretches out to produce a pattern drawn from a denumerable (in other words, ‘countable’) set of possibilities. So we know that the sequence generated by the coin tossing will assume one value taken from a countable set of possibilities - it’s just that we don’t know which one! The patterning view of randomness sees the ‘to-be-actualised’ possibility as simply an unknown pattern stretching out into the future before us. This is configurational randomness; it is a rigorous mathematical description of what our intuition tells us are ‘untidy’ and complex sequences of 1s and 0s. So, a configurationally random sequence is a particular class of pattern.
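
The coin-tossing point is easy to make concrete with a short sketch - ten tosses, so the countable set of patterns is small enough to enumerate in full:

```python
import random
from itertools import product

n = 10
# The full, countable set of possible heads/tails patterns of length n.
all_patterns = set(product("HT", repeat=n))
print(len(all_patterns))  # 2**10 = 1024 possibilities

rng = random.Random(7)
tossed = tuple(rng.choice("HT") for _ in range(n))

# Whatever the randomness produced, the result is one definite member
# of that pre-existing set of patterns - we just didn't know which one.
print("".join(tossed), tossed in all_patterns)
```

The set of patterns exists before a single coin is tossed; the tossing merely selects an unknown member of it.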

7) Given the foregoing, the universe and all its laws are bootstrapped by complex algorithms, and as complex algorithms of that order will not just pop up, nor are they intelligible at all unless they are reified in an up-and-running sentience, there seems to be a senselessness without a mind to reify them, because patterns are meaningless without a mind to interface with them. What this hints at is that the universe is endowed with a network of computation that is itself only intelligible if 'mind' is at the core of that intelligibility. It appears very plausible that complex sentience bootstraps the kind of universe we find ourselves in, and in an extraordinary way, our minds make everything intelligible by reifying those concepts. To that end, the relationship between mind and mathematics can be regarded as being extraordinarily ‘hand in glove’.

8) In our universe of compact and neat physical laws we can conceive of a type of data compression, because it is the ordered patterns that make it amenable to compression.  However, like all data compression, there comes a point when no further compression can take place, so we are left with this problem of what I call an ‘is-ness’ that just won’t go away and cannot be removed from the burden of contingency. That is to say, I’ve said that those compact and neat physical laws in our universe cannot be compressed to a mathematical zero, but what of the patterns that provide the compression for those laws? 

9) In the morass of disorder those highly compressed compact and neat physical laws would be highly unrepresentative patterns in that configurational system. Pictured mathematically, what we have is a mathematical generating system that generated mathematical configurations which tended towards maximum disorder, yet embedded a constraint on itself to produce the order of stars, planets, life, and minds that would go on to understand concepts (including those of God) with high level self-awareness. It would not be unreasonable to construe this as giving exhibition to conscious sentience creating and sustaining the cosmos in its vast mind.

Note: In trivial and simplistic form, most of these algorithms are counting operations which involve systematically sifting through a search space of all possible permutations of characters. Whichever way we look at it, whether from a theistic or naturalistic perspective, knowledge of the entire cosmos would bring with it a system with a permutation of characters that effectively holds the data describing our cosmos – and this will consist of a finite map of information, with a scale of order and disorder, and this is what we are looking at here.

10) Logical incompressibility has to do with equations and algorithms used for data compression. Although in mathematical terms it is true that physics effectively defines a set of stable structures in morphospace (where morphospace = the richly ordered mathematical configurations that facilitate biochemical life), with the random walk of morphospace and physics, combinatorial space has a huge class of possible configurations which simulate many other alternative possibilities embedded in the mathematical potential.

Note: Combinatorial space is the level at which something is computationally complex – and hereby refers to the space of possibilities that are unconstrained by an evident set of mathematical laws and constants.  

11) Clearly, amongst the class of every cosmic possibility, the overwhelming number of configurations in the cosmos tends towards disorder rather than order. Inside that class, the complex ordered configurations of life have a vanishingly negligible representation (1 in 10 to the power of many trillions).
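
The scale of that under-representation can be felt even in a toy configurational system. The numbers here are my own for illustration - a 100-cell binary ‘universe’, where ‘ordered’ means at most five cells deviate from a uniform state:

```python
from math import comb

n = 100           # a toy 'universe' of 100 binary cells
total = 2 ** n    # every possible configuration of the system

# Count the 'highly ordered' configurations: those with at most five
# cells deviating from the uniform all-zeros state.
ordered = sum(comb(n, k) for k in range(6))

fraction = ordered / total
print(f"ordered fraction ~ {fraction:.3e}")  # vanishingly small
```

Even with such a generous definition of order, the ordered class is a vanishing sliver of the whole space - and a real cosmos has unimaginably more than 100 binary degrees of freedom.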

12) Even if we are the only life in the entire cosmos (that is doubtful), and our history does appear somewhat cosmically fortuitous, this outcome (when compared to the vast number of possible configurations that tend towards disorder) is a vast over-representation of an otherwise very unrepresentative class of configuration in probability terms. So instead of wondering why the universe seems so life-unfriendly, the question is rather: why does the cosmos have this extraordinary mathematical bias that allows it to facilitate any kind of order at all?

13) We know that the universe is expanding. Expansion is part of a universal principle running from the maximum order of the big bang (zero entropy) to greater and greater trends towards disorder as spatial expansion increases; with increased velocity space, maximum entropy is exactly what we should expect from an expanding random walk universe. Yet against all odds we have a biased random walk universe capable of creating living things with low entropy. And the astonishing thing is this: the only time in the history of an unbiased random walk universe that we should expect to see the sort of low entropy we see in systems like life, or even planets and stars, is at the point of the big bang. After the initial expansion of space and increase in velocity space in an unbiased random walk universe, we should see less and less chance of ordered microstates like stars and planets with every passing moment. But the mathematical equations that govern our universe are not the same equations that govern random walk; and that is the mathematical bias that shows us that Cosmic Sentience is behind the equations.

An analogy: Suppose by way of analogy that the most ordered state of the universe, the point of the big bang, represents a book in its most ordered state - let's say Shakespeare's play Hamlet. In this state the book is at its most ordered in terms of grammar, text and thematics. Now imagine all the text of Hamlet gets plugged into a computer program that breaks it all apart into a less and less ordered state, and does so at an accelerated rate too. So after the first few minutes, acts and paragraphs have been separated, sentences have been broken into pieces and the thematics have become more and more fragmented. Then the disordering process accelerates so that after a while longer mere words are broken up into letters, and the book becomes a random configuration of letters, maxing out at a level of disorder that contains no structure of words whatsoever.

This is essentially what the universe is set up to do after the big bang - as matter stretches out, and the combinatorial search space of possibilities for order becomes more and more diffuse, the properties of hydrogen and helium become less and less like acts and paragraphs, and more and more like fragments of words. Like the text of Hamlet subjected to the computer program that increases disorder, there is less and less likelihood of the kind of order you see in stars, planets and life. The second law of thermodynamics allows for local decreases in entropy as long as those decreases are balanced out by an increase in entropy somewhere else: but the remarkable thing is that the otherwise unbiased random walk turns out, in actual fact, to be a biased random walk. In fact, if you don't understand how remarkable it is, you don't understand it at all. It would be a bit like running the computer program with the Hamlet text, increasing the disorder of the linguistic configurations to blow apart all the acts, paragraphs and sentences retained in the configuration, and then finding somewhere down the line that the program had thrown up whole paragraphs of Macbeth, Othello and Romeo and Juliet. The universe's biased random walk behaves like a computer program that started with Hamlet, began to break up the text into greater and greater disorder, but has throughout the execution time been manipulated by instructions in the program to obtain formations of entirely new Shakespeare plays later on in the process. That is the bias in the universe.
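
The disordering half of the analogy can be simulated directly. This is a sketch with a short stand-in passage rather than the full play, and it uses compressed size as a convenient proxy for disorder - ordered text compresses well, scrambled text does not:

```python
import random
import zlib

# A stand-in for an ordered text (any repeated passage would do).
text = ("To be, or not to be, that is the question: "
        "Whether 'tis nobler in the mind to suffer. ") * 300

def compressed_size(chars):
    # Compressed length is a rough stand-in for disorder: the more
    # structure a string has, the smaller it compresses.
    return len(zlib.compress("".join(chars).encode(), 9))

rng = random.Random(1)
chars = list(text)
ordered_size = compressed_size(chars)

rng.shuffle(chars)  # scramble the letters entirely
shuffled_size = compressed_size(chars)

# The intact text compresses enormously; the scrambled version, having
# lost its word and sentence structure, resists compression.
print(ordered_size, shuffled_size)
```

What the program will never do, of course, is spontaneously reassemble the scrambled letters into Macbeth - which is the point of the analogy: that kind of outcome needs a bias written into the process itself.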

Note: I must bring to attention one common misconception about how nature behaves with regard to thermodynamics. The second law of thermodynamics says that in a closed system disorder increases with time. Some people would likely disregard the huge mathematical constraints I am talking about by pointing out that, amidst the trend towards disorder, when one bit of the system becomes quite ordered there will be an exhaust of disorder elsewhere to offset the decrease in entropy, and that the overall effect still produces higher disorder - as though the fact that there is an overall increase in disorder to compensate for the pockets of order somehow relegates the postulation of a mathematical bias down to the realms of pure speculation. In a moment we will see why this isn’t true.

14) Why do we have a universe with laws that ought to tend heavily towards maximum disorder (and do in most cases)? The reason for this is fairly straightforward: at an atomic level thermal energy has a diffusion that arranges mass in random motions, causing an increase in disorder. But that doesn’t mean that increasing entropy necessarily corresponds to increased complexity due to the random arrangements of mass and energy – in fact, it is a mistake to equate simplicity with disorder, because there is a vast degree of complexity in highly disordered random systems: their complexity is such that they contain vast numbers of cases that are not amenable to a simple mathematical description.

Note: When it comes to means and frequencies in mathematics, even a highly disordered system is configurationally complex in that it contains a lot of complex data. Even in the evolution of life, phenotypical organisms are, in the mathematical sense, configurationally neither maximally ordered nor maximally disordered - which means that high order doesn’t necessarily entail high complexity.

15) We have a universe with laws that ought to tend heavily towards maximum random walk disorder but that also contains an astonishing mathematical skew in its emergent order towards stars, galaxies and planets, and the eventual facilitating of genetic algorithms, conservation of sequence and function in biology, and maximisation of fitness of those organisms. This alone tells us that there is such a constraint provided by the physical regime of laws.

Note: Obviously the distribution of energy in the universe doesn’t tend towards maximum disorder – if it did there would be no thermal energy and chemical energy to produce stars, planets and life. Once we get to the stage of the emergence of biochemical life, and the point at which organisms begin to evolve and eventually pass on their genetic material, we can say that the active information in the laws of physics has leaped over a significant hurdle, because bit by bit evolution achieves progression through the system of cumulative ratchet probabilities.

16) We can see that in less localised terms the laws of physics impose order on the universe, in that the physical model has many possible states but regulating laws limit those possibilities. In actual fact, the question of why we have any order at all in the universe is a very worthwhile one, so the fact that we have order of the magnitude that produces stars and planets is quite astounding. The problem is that most people think of stellar explosions and the seemingly happenstance formation of the planets in our solar system, see a pretty chaotic and disordered mess, and thank their lucky stars (literally) that we ever got here at all. On one level this is acceptable, but in truth the sort of mathematical skew I am talking about makes even that seemingly fluky activity incredible. Here’s why. Given that the universe should be heavily tending towards maximum disorder, even something like the emergence of stars requires an extraordinary restriction on the laws - a cosmic bottleneck that eliminates every other possible state so that such a facilitation can occur. Without this cosmic frontloading, order of this kind should under no circumstances ever occur, because a universe without a mathematical bias would run down to maximum disorder very freely.

Note: As I’ve said, once we get to the earth’s biochemistry the appearance of another bottleneck is easier to reconcile, because with biochemistry the severe constraint on the space of possibilities has already narrowed the field and produced an increased ratchet probability, where the statistical weighting favours the probability of life and not maximum disorder.

17) As an illustration concerning combinatorial space, consider that quantum mechanics is all about measuring probabilities. In quantum mechanics the wavefunction is a single-valued function of position and time, and it is a model we use to support a value of probability of finding a particle at a particular position and time. Even concerning one particle we have a complex conjugate, because specifying the real physical probability of finding the particle in a particular state involves a fairly broad search space. Search space is best seen as a metaphor for our representing the probability amplitude for finding a particle at a given point in space at a given time. Now imagine the complex permutational variables in a cosmos that has been expanding for nigh-on 14 billion years. That is a lot of information and an incredibly vast search space of possibilities. For the conditions of any order to be met, the laws of nature must preclude so many degrees of permutation (trillions of trillions) that, far from physical probability being evenly diffused throughout nature, it must be heavily constrained in favour of non-maximal disorder (let alone biochemical life) - to the following extent: whether one believes in a personal God or not, the fact that the cosmos looks blueprinted for life is impossible to deny.
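
The wavefunction point can be made concrete with the textbook particle-in-a-box. This is an illustrative sketch of the ground state of the infinite square well, with |ψ|² integrated numerically to give the probability of finding the particle in a region:

```python
from math import pi, sin, sqrt

L = 1.0  # width of an idealised one-dimensional box

def psi(x, n=1):
    # Stationary-state wavefunction of the infinite square well:
    # psi_n(x) = sqrt(2/L) * sin(n * pi * x / L).
    return sqrt(2 / L) * sin(n * pi * x / L)

def probability(a, b, steps=10_000):
    # Midpoint-rule integration of |psi|^2 over [a, b] - the
    # probability of finding the particle in that region.
    h = (b - a) / steps
    return sum(psi(a + (i + 0.5) * h) ** 2 * h for i in range(steps))

# Probabilities over the whole box must sum to 1 (the wavefunction is
# single-valued and normalised), yet any given position is only ever
# probabilistic - a search space, not a fact, until measured.
print(round(probability(0, L), 6))      # ~1.0
print(round(probability(0, L / 2), 6))  # ~0.5
```

Even this single, maximally simple particle carries a whole probability distribution; the search space for a 14-billion-year cosmos of such particles is vast beyond description.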

18) As a second illustration, consider in biology one of those tiny gradual steps up evolution’s Mount Improbable. Yes, it’s an accumulation of bit by bit selection, but that doesn’t tell the whole story - for even one very simple beneficial mutation, which is just one small step in a long evolutionary history, is itself woven into a huge fabric of other possibilities, and is just one tiny part of an incredible bias that drives the laws of physics towards life – a bias already embedded in nature, and one required to severely reduce the size of the cosmological search space by providing what seem to all intents and purposes carefully blueprinted generating algorithms that produced an information-rich universe set up for life.

Note: What the second law of thermodynamics does is produce a random walk across all possible states, settling on the state that the statistical weight skews it towards. In other words, the second law facilitates a migration based on huge statistical weighting, whereby the skew directs a system towards its most probable state. As a simple illustration, if I turn on a gas flame, the statistical weighting does not tend towards the heat all staying in the same region; it tends towards diffusion into colder regions away from the flame’s output. The reason is that the colder regions provide far more microstates (that is, possible combinatorial search spaces) for the diffusion to arrange itself in than the hottest regions, so the heat tends towards regions with the greatest possible search space.
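
The flame example can be sketched with a toy microstate count. The numbers are my own for illustration - ten ‘hot’ oscillators next to ninety ‘cold’ ones sharing twenty quanta of energy, counted Einstein-solid style:

```python
from math import comb

def microstates(quanta, oscillators):
    # Ways to distribute indistinguishable energy quanta among
    # distinguishable oscillators: C(q + N - 1, q).
    return comb(quanta + oscillators - 1, quanta)

hot, cold = 10, 90  # a small hot region next to a large cold one
total = 20          # quanta of thermal energy to share out

# Statistical weight of each possible split of the energy: the joint
# microstate count for q quanta in the hot region, the rest in the cold.
weights = {q: microstates(q, hot) * microstates(total - q, cold)
           for q in range(total + 1)}

favoured = max(weights, key=weights.get)
# The overwhelmingly favoured splits leave almost all the heat in the
# large cold region - diffusion towards the greatest search space.
print(favoured, weights[favoured] / weights[total])
```

Nothing pushes the heat outwards; the cold region simply offers astronomically more microstates, so the statistical weighting alone directs the system there.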

19) On a greater scale, what we are seeing is thermodynamically optimum diffusions under the constraints of the laws of physics, and as a mathematical pattern we are seeing this right through nature’s blueprint. Many make the mistake of saying ‘Well, in a universe the size and age of ours we are bound to have the occasional cosmic fluke that then goes on to produce stars and (if we’re ‘really’ lucky) planets and life’. But such a claim shows that the person has a misjudged understanding of the subject at hand, because the very tiny number of ways of locking in to order is not to do with serendipitous moments of cosmic fortuity that just happen to throw up the odd fluke; it is to do with the enormous unlikelihood of having laws that constrain a system enough to produce anything other than maximum disorder - that is what is so remarkable.

20) I fancy that a universe without a designer would be nothing like the universe we see – it would be maximally disordered and we would not have ever been born to talk about it, because a physical regime where disorder is unconstrained by a mathematical bias wouldn’t produce any biological evolution at all.