On The Origin Of Zero


Something, to deliberately show nothing – that is the odd irony at the heart of zero, and it is almost unbelievably useful. It makes algebra, computing and Pixar films possible, or at least easier than they would otherwise be. Think of the number 2013: the zero tells us where the other digits sit, and therefore that the 2 means 2 thousands, not 2 hundreds or 2 millions. It sounds simple, but it was such an elusive idea that European mathematics never invented it; zero was imported.

Egyptian and Greek mathematics was all about measurement and calculation of what existed. Volumes of spheres, areas of land, heights of buildings and pointiness of pyramids. Geometry and measurement ruled. With such a tight link between maths and reality, the question of ‘nothing’ never came into it.

Over 1,000 years of mathematics, stretching back from medieval Europe to the Egyptians, never uncovered it, but zero came all the same. Once it was introduced, the Christian Church and European governments fought to keep zero and the new numbers in check for over 100 years. Zero arrived with the numbers and decimal system we use today, the ‘Arabic’ numerals. They came from India.

The Home Of ‘Nothing’

Heere be Bodhisattva

India soon mastered multiplication.

Mathematics and religion are entwined in Indian religious life. Hinduism, Jainism and other ancient Indian religions treat numbers as part of the faith, and so needed versatile numbers. How many stars are in the sky? How many grains of sand on a beach? Astronomy has a long history in India, and it was just one of the reasons for needing large numbers. In response, the decimal system emerged, and large numbers were described in almost excruciating detail.

There are long religious passages in which the Buddha describes numbers up to 10⁴²¹, far larger than the total estimated number of atoms in the universe (about 10⁸⁰). It is a number so large that it cannot be applied to anything that exists. Jainism, a sister religion to Hinduism, has its own unit called a palya: the time it takes to empty a cube 10 km across, filled with wool, if one strand is removed every century. And there was more than just grasping at colossal numbers; nothingness had its own focus as well.

Jainism holds that the ultimate aim of being is to remove all wants, desires and urges to alter the workings of the Universe. To not affect; to let it be. The aim is to become nothing. Similar ideas of restraint and asceticism define many Indian religions.

With this focus on nothing and on numbers, zero appeared almost inevitably. In AD 628 the rules for dealing with zero were laid down by the astronomer Brahmagupta, and we still use them to this day.

Zero Rules

You probably know how to deal with zero. “There’s nothing to it!” you may say, and you would be correct. Brahmagupta’s rules are still used today, and are very simple. Here they are, written so that ‘a’ represents any number:

a – a = 0

a + 0 = a

a – 0 = a

a * 0 = 0

-a * 0 = 0

0 * 0 = 0
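Brahmagupta's rules can be checked mechanically. Here is a minimal Python sketch (the helper name `check_rules` is my own) verifying each rule for any sample number:

```python
def check_rules(a):
    """Verify Brahmagupta's rules for zero against a sample number a."""
    assert a - a == 0    # a - a = 0
    assert a + 0 == a    # a + 0 = a
    assert a - 0 == a    # a - 0 = a
    assert a * 0 == 0    # a * 0 = 0
    assert -a * 0 == 0   # -a * 0 = 0
    assert 0 * 0 == 0    # 0 * 0 = 0
    return True

# The rules hold whatever number we feed in.
for n in (1, 7, -3, 2013):
    check_rules(n)
```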

With these rules zero became a true number, and opened a world of possibility. The number line crept past zero and plunged into negative numbers, filled in with fractions and irrationals, and a new world order was established. There was a mistake, though, one vital misstep that is with us still today. Brahmagupta tried to divide by zero. You never divide by zero.

Dividing By Zero, Not Even Once

Brahmagupta was wrong about dividing by zero. 10 ÷ 5 = 2, 10 ÷ 2 = 5, 10 ÷ 1 = 10. From those sums you can see that the smaller the number you divide by, the larger the answer becomes. So divide a number by zero, and the answer must surely be infinity. But this can’t happen, because infinity is a concept, not a number; it isn’t the result of anything that can be counted. Then there is the fact that dividing by zero is simply an idea that doesn’t make sense. Like trying to make oatmeal cry.

Division is really a form of repeated subtraction. 42 ÷ 2 = 21; here you are just asking ‘how many times can I subtract 2 from 42?’ When it comes to zero, though, this becomes nonsense. How many times can you take nothing from something? That is a question with no proper answer. Try it on a calculator and you get an error, because it can’t be done. Trying to split something into units of nothing is a big mistake. If you let yourself divide by zero, maths breaks. Let me show you.
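First, the repeated-subtraction view of division is easy to sketch in code. This is a quick illustration (the function name is my own, and it assumes whole, non-negative numbers), showing exactly why a divisor of zero has to be refused:

```python
def divide_by_subtraction(dividend, divisor):
    """Divide by counting how many times divisor can be subtracted."""
    if divisor == 0:
        # Subtracting nothing never shrinks the dividend, so the loop
        # below would run forever. We refuse up front.
        raise ZeroDivisionError("you cannot take nothing from something")
    count = 0
    remainder = dividend
    while remainder >= divisor:
        remainder -= divisor
        count += 1
    return count

print(divide_by_subtraction(42, 2))   # 21
```

Feed it a zero divisor and, like your calculator, it gives up with an error rather than loop forever.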

Proof that 1 = 2 (for all finite numbers)

(a² – b²) = (a – b)(a + b)

Let’s make a = b; baby steps.

(a² – a²) = (a – a)(a + a)

To simplify, we rewrite “(a² – a²)” as “a(a – a)”

a(a – a) = (a – a)(a + a)

The rules say “(a – a) = 0,” so let’s replace the confusing letters.

a(0) = (0)(a + a)

Now we have “0” on both sides, so we can DIVIDE BY ZERO

a = a + a

a = 2(a)

so let’s now divide both sides of the equation by ‘a’

1 = 2

lovely
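Python spots the illegal step straight away. A quick sketch of where the “proof” falls apart: both sides really are equal, right up until we try to cancel (a – a).

```python
a = 3

# Both sides of a(a - a) = (a - a)(a + a) are genuinely equal...
left = a * (a - a)
right = (a - a) * (a + a)
assert left == right == 0   # ...because both sides are just 0.

# The "proof" then cancels (a - a), which means dividing by zero:
try:
    left / (a - a)
except ZeroDivisionError:
    print("Python refuses, and so should we.")
```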

Dividing by zero doesn’t make sense, and pretending it does breaks mathematics. Risks and confusions included, it was still a better choice than the reigning system in medieval Europe: Roman numerals.

Rotten Roman Numerals

Until the 13th century, Europe was still using repetitive, inefficient Roman numerals. To write 3333 you needed MMMCCCXXXIII. Multiplication was a pain and division a nightmare, as writing out the steps of a calculation became unintuitive and clumsy.
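The repetitiveness is easy to demonstrate. Here is a small Python sketch of a standard Roman numeral converter (my own illustration, not period-accurate arithmetic), greedily spending the largest symbols first:

```python
# Symbol values in descending order, including subtractive pairs (CM, IX, ...).
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n):
    """Convert a positive integer (1-3999) to standard Roman numerals."""
    out = []
    for value, symbol in ROMAN:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

print(to_roman(3333))   # MMMCCCXXXIII: twelve symbols for a four-digit number
```

Four decimal digits balloon into twelve symbols, and there is no place value to lean on when multiplying.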

The new numbers, including zero, could write any figure without modification, whereas standard Roman numerals could only reach 3999 before extra marks had to be added to the symbols. The power of the decimal system was that the value of a digit depends on its position.

For 209, the notation tells us we have 2 hundreds, 0 tens and 9 units. Zero gives us the difference between 29, 209 and 20900, fixing the place and value of each digit. Zero is the reference point, showing us the true size of things. Without it, you would just have to guess whether ‘1’ meant 1, 1,000 or 1,000,000. When it reached Europe, though, things didn’t go well for zero. There was a dispute.
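The place-value idea can be spelled out in a few lines of Python (the helper name `place_values` is my own): each digit is paired with the power of ten its position gives it.

```python
def place_values(number):
    """Break a number into (digit, place value) pairs."""
    digits = str(number)
    return [(int(d), 10 ** (len(digits) - i - 1))
            for i, d in enumerate(digits)]

print(place_values(209))    # [(2, 100), (0, 10), (9, 1)]
```

The zero contributes nothing to the total, but without it the 2 would slide over and mean 2 tens instead of 2 hundreds.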

Much Ado About Nothing

A young Fibonacci [Wikipedia]

In 1202, the new maths entered Europe on the shoulders of a great book. Liber Abaci (The Book of Calculation) was the first Latin text based on Arabic mathematics to explain the new system. It was the work of Leonardo Fibonacci, a young merchant raised in modern-day Algeria. It showed how to write any figure in the new symbols, and made multiplication, among other things, far easier. The Christian Church was opposed.

A fourth crusade was being prepared against the holy city of Jerusalem, and anything arriving from Islamic lands was feared and reviled. These new numbers, with their suspicious convenience and odd zero, could be the work of the devil. They were slippery as well, the Church argued: a 1 could look like a 7, and a 0 could be misread as a 6 or a 9. In 1299 the government of Florence banned the use of the new numerals, yet still they spread.

Merchants and bankers secretly began to use the new numbers. Zero marked the point of breaking even, when outgoings matched incomings, something difficult to express in Roman numerals. Though banned, the numerals were used in coded messages, and they have left their mark on the English language.

From the original word ‘cifra’, English gained not only ‘zero’ but also ‘cipher’, meaning code: a linguistic relic of the secret movement of mathematics across Europe. So things remained until the 15th century and the marvel of movable type.

Pressing Issues

Arithmetica, spirit of arithmetic, shows her preference for the new numbers with an adoring gaze. [Wikipedia]

The printing press arrived around 1450, and with it an immediate decision: which numbers to use? Hindu-Arabic numerals won, largely on efficiency. To write most numbers, Roman numerals required more symbols, more time and more ink. The new numbers spread, and zero multiplied, finally gaining the lovely round shape we are familiar with today.

By 1550 the new numerals were the dominant number system across Europe, and mathematics was growing once again. Freed from tight links to physical reality, Europe went on to create graphs, calculus and quantum physics; and, eventually, Finding Nemo.

Today zero is everywhere: the 0s and 1s in our computers, the origins of our graphs, the calculations for building bridges, the algebra students have to solve in class. We live in a zero world now, but we are still unsure around the number.

Is it positive or negative? Can we divide by it? Is it odd or even? Much confusion surrounds zero; just don’t divide by it, and we’ll all be fine.

Sources:

Images listed without sources are the product of the author.

Posted by Alexandre Coates in Macro Oddities.
