"Beauty in mathematics is seeing the truth without effort."

-- George Polya

If you think that mathematicians are emotionless automatons, humorlessly crunching numbers all day, well then you've obviously never met one. On average, they are some of the most poetic people on the planet. They use words like beauty and elegance with the same hallowed tones others save for prayer. Conversely, they revile the ugly.

A well-crafted proof is considered a paradigm of beauty in their world. Sometimes, a beautiful proof can even overshadow an ugly result. It's all very romantic and difficult to explain if you aren't one of them. I exist somewhere on the fringe, which is what lets me write this post. You see, I fear that the constant progress of computer science threatens to crush the poetry of the proof. As the machines grow more powerful, the ability to attack problems with brute force will only grow with them. Mathematicians aren't going to like this.

For example, the famous Four-Color Theorem is commonly known as the first major proof to be done almost entirely by computer. Imagine a map, say of the United States. How many colors does it take to color in all the states so that no two adjacent ones share a color? Now imagine an infinity of maps. How many colors will it take? The answer, as it turns out, is four. Essentially, the infinite number of possible maps can be reduced to a finite, manageable set of basic configurations, because most of them are just variations on one another. Those basic map types were then fed into a computer with a coloring algorithm, and it went to work. It reported that it could color every one of them with only four colors. Case closed.
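To get a feel for what the computer was actually doing, here is a minimal sketch of map coloring by backtracking search. This is not the technique used in the actual proof (which checked a carefully chosen set of configurations); it is just an illustration, and the mini-map of states below is a hypothetical example whose adjacencies are simplified.

```python
# A minimal sketch (not the actual proof technique): backtracking search
# that assigns one of four colors to each region of a small map,
# given as an adjacency list, so that no two neighbors match.
def four_color(adjacency):
    regions = list(adjacency)
    colors = {}

    def solve(i):
        if i == len(regions):
            return True
        region = regions[i]
        for color in range(4):
            # A color is legal if no already-colored neighbor uses it.
            if all(colors.get(nb) != color for nb in adjacency[region]):
                colors[region] = color
                if solve(i + 1):
                    return True
                del colors[region]  # backtrack and try the next color
        return False

    return colors if solve(0) else None

# Hypothetical mini-map with simplified adjacencies, for illustration only.
states = {
    "UT": ["CO", "AZ", "NM", "WY"],
    "CO": ["UT", "AZ", "NM", "WY"],
    "AZ": ["UT", "CO", "NM"],
    "NM": ["UT", "CO", "AZ"],
    "WY": ["UT", "CO"],
}
coloring = four_color(states)
```

The proof's insight was that a search like this only has to succeed on finitely many basic configurations for the result to hold for every map.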

Only it isn't, for a lot of people. For one thing, humans do not live long enough to check all those maps by hand. We have to trust that the computer executed its algorithm correctly, and that it was given a correct algorithm in the first place. But mostly the proof is disliked because it is ugly. Now I can get to the crux of the matter.

Computer technology is going quantum. Most everyone is familiar with the binary nature of digital computers. The word bit stands for binary digit. Everything a computer does, it does by simplifying things to ones and zeros, to yes or no. In binary logic, the gate is either open or closed. In the immortal words of Yoda, "Do or do not, there is no try."

With quantum effects, things get a bit muddier. Without going into too much detail, the weirdness that is the quantum world breaks the law of the excluded middle. Superpositions are allowed: gates which are both open and closed. Quantum computers take advantage of this phenomenon to perform many calculations at the same time, rather than in rapid succession like current supercomputers. In fact, a company has recently announced the release of the first practical quantum computer. It is alleged to have 16 quantum bits, or qubits, which would allow it to handle 2^16 pieces of information simultaneously. There is a lot of suspicion around this one, though. It is at least fifty years ahead of the estimated state of the technology, and the developers are being very secretive. Time will tell. But even if this one's a hoax, it will happen eventually.
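The arithmetic behind that 2^16 claim can be sketched in a few lines. This assumes idealized, noise-free qubits, which is part of why the announcement drew suspicion: a classical simulator has to store one complex amplitude for every basis state, and that count doubles with each added qubit.

```python
import math

# Why n qubits are said to track 2**n pieces of information at once:
# a classical simulation must store one amplitude per basis state.
n_qubits = 16
n_states = 2 ** n_qubits  # 65,536 basis states

# In an equal superposition, every basis state gets the same amplitude,
# and the squared amplitudes (the probabilities) must sum to 1.
amplitude = 1 / math.sqrt(n_states)
total_probability = n_states * amplitude ** 2
```

Doubling per qubit is the whole game: 17 qubits would mean 131,072 states, and a few hundred qubits would outstrip any conceivable classical memory.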

How will that computing power affect things? Suddenly, uncrackable theorems will fall left and right, if we are willing to trust the computers. Public key encryption that uses the RSA method, which rests on the difficulty of factoring the product of two large prime numbers, will be broken in seconds. The P vs. NP issue will be moot. The Traveling Salesman will have his answer as he turns the ignition. (For a detailed explanation of what I'm talking about, check out Rebecca's blog.)
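To see why factoring is the weak point, here is a toy sketch of textbook RSA with tiny primes, for illustration only; real keys use primes hundreds of digits long, and the final "attack" line is exactly the step a quantum computer would make fast.

```python
# Toy textbook RSA (illustration only -- real primes are vastly larger).
p, q = 61, 53          # the secret primes
n = p * q              # public modulus: 3233
phi = (p - 1) * (q - 1)
e = 17                 # public encryption exponent
d = pow(e, -1, phi)    # private exponent: modular inverse (Python 3.8+)

message = 42
cipher = pow(message, e, n)   # anyone can encrypt with (e, n)
plain = pow(cipher, d, n)     # only the holder of d can decrypt

# "Breaking" RSA is just factoring n -- trivial here, infeasible at
# real key sizes on classical hardware, fast on a quantum computer.
recovered_factor = next(f for f in range(2, n) if n % f == 0)
```

Once you factor the public modulus, you can recompute the private exponent yourself, and the whole scheme unravels.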

Will mathematicians revolt? What will happen to the elegance? I guess we'll all just have to wait and see.
