Science in Society Archive

Book Briefs

Quantum Computer? Is It Alive?

The Feynman Processor: Quantum Entanglement and the Computing Revolution, by Gerard J. Milburn, Perseus Books, Cambridge, Mass., 1998, ISBN 0-7382-0173-1. Dr. Mae-Wan Ho reviews.

My father has become an internet user at age 81. He sends me messages containing the latest digital family photographs, so I can see how he is putting on weight and regaining his health after a recent illness, and I can reply to tell him so immediately. He likes e-mail because it is so much faster than airmail and he can reach me when he doesn't know where I am. He would be astonished to hear that scientists are still trying to make computers go even faster as well as smaller. And that we might be able to communicate faster than light using a quantum computer.

At the end of 1996, Intel Corporation, in collaboration with the United States Department of Energy (DoE), announced the first 'ultra-computer' to reach one trillion (10^12) operations per second, or one 'teraflop'. It cost $55 million. The full system consists of 76 large cabinets with 9072 Pentium Pro processors and nearly 6 billion bytes of memory. The ultra-computer is the product of the DoE's Accelerated Strategic Computing Initiative (ASCI), and would ultimately reach a peak performance of 1.8 teraflops. ASCI is a ten-year program to move nuclear weapons design and maintenance from being test-based to simulation-based. Were it not for the ultra-computer, Clinton could not have signed the Comprehensive Test Ban Treaty on 24 September 1996.

But could a computer simulate reality perfectly? Is it possible that the DoE's confidence in the ultra-computer is misplaced? The quest for yet faster computers did not stop there.

IBM's computer, Deep Blue, defeated world chess champion Garry Kasparov in 1997, by sheer force of speed in checking the possible moves ahead. It can calculate 50 to 100 billion positions in three minutes, the time allowed for moves in major tournaments. It was a landmark victory, but Deep Blue won not because it is cleverer than Kasparov. A computer just isn't ever clever, at least, not clever enough. There is plenty it can't do. It doesn't know how to laugh at jokes, or feel sad. It can't even walk into a McDonald's on the high street to order a hamburger.

And it will be 2005 before a computer will attempt to simulate a protein molecule folding into shape. A new supercomputer, Blue Gene, costing $100 million, will be equipped with SMASH (simple, many and self-healing), which will dramatically simplify the instructions carried out by each processor. Instead of a single microprocessor on a chip, each of Blue Gene's chips will hold 32 processors, and the machine will contain about a million processors in all, so it will perform one quadrillion (10^15) operations per second, or a 'petaflop'. Even then, it will take a year to simulate an average protein folding, a process that is completed in a split second in the body.

So speed doesn't make up for the fact that reality may be quite different, and may work on different principles.

It was Alan Turing, inventor of the universal Turing machine, the direct precursor of the modern computer, who first cast doubt on the computer's ability to simulate reality. Turing proved, devastatingly, that a Turing machine cannot tell, even in principle, whether it will produce an answer to a problem. Given a problem, the machine could run for a while and come to a halt, having produced an answer, or else it could run forever.

Turing proved a theorem which says there is no general algorithm (a logical, step-by-step procedure) that can determine whether a Turing machine working on an arbitrary input will eventually halt or run forever. This is the famous halting problem.

The Turing machine is a classical clockwork machine. What if there is another kind of machine? Enter the quantum computer. David Deutsch, theoretical physicist at Oxford University, thinks reality can in principle be simulated, provided the universal machine is a quantum computer. And so does Gerard Milburn, Professor of Theoretical Physics at the University of Queensland, Australia, a key scientist in the effort to make a quantum computer, who has written an excellent book to tell us why.

A quantum computer can do things a classical computer cannot do. To simulate a system of N particles moving randomly, a classical computer needs a time that scales as N^N, ie, exponentially in the size of the system. For 10 particles, the ultra-computer working at 1 teraflop would take about three years just to compute the first time step. A quantum computer, on the other hand, will produce an arbitrarily accurate simulation of a quantum physical system. Similarly, the fastest classical computer would need billions of years to find the prime numbers that, multiplied together, give a number containing 400 digits, whereas a quantum computer could finish the job in a year.

To see how quantum computing differs from classical computing, we need to understand the fundamental difference between the randomness of an ordinary coin-toss and that of a quantum coin-toss. And here is where Milburn's exposition is admirable. This book really rewards the diligent reader with critical understanding, unlike too many popular science books that obfuscate with over-simplification.

The classical probability of coming up head (H) or tail (T) in a single coin-toss is 0.5. If you toss the coin twice, there are four possible outcomes: HH, TT, HT and TH, the probability of each result being 0.5 x 0.5, or 0.25.
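To make the bookkeeping explicit, here is a minimal Python sketch (my own illustration, not from the book): the probabilities of independent tosses simply multiply.

```python
from itertools import product

# Two independent classical coin-tosses: enumerate the four outcomes.
# Independent probabilities multiply, so each sequence has probability 0.5 * 0.5.
for seq in product("HT", repeat=2):
    print("".join(seq), 0.5 * 0.5)   # HH, HT, TH, TT -> 0.25 each
```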

The quantum equivalent of a coin-toss is a light beam striking a half-silvered mirror, or beam splitter, where half of it is transmitted and half reflected. When the intensity of the light is reduced sufficiently, single photons (irreducible quanta of light) strike the beam splitter one at a time. Photon detectors placed in the paths of the transmitted and the reflected photons will show that approximately half of the photons are transmitted (T) and half reflected (R). If, instead of the photon detectors, fully reflecting mirrors are placed in the paths of the reflected and transmitted light, the two beams can be redirected onto a second beam splitter. This is equivalent to a second coin-toss. So, in analogy to the classical coin-toss, there are four possible paths for a photon: TT, RR, TR and RT. In this arrangement, paths TT and RR end up in the upper (U) detector, whereas TR and RT end up in the lower (L) detector. So, just as in the classical coin-toss, the U and L detectors will each detect half of the photons.

However, if we have certain knowledge that the photon is reflected or transmitted after the first beam splitter, ie, if we observe, then the number of photons registered by the U or L detectors will no longer be half. That is the first sign of quantum strangeness.

So far, we have been treating the light beam as if it were a stream of particles, which it is not. Light is simultaneously both wave and particle. This becomes evident when the relative path lengths in the upper and lower arms of the apparatus are altered, so that the light waves can interfere destructively with each other. It can be arranged that no light reaches the U detector, or the paths can be adjusted so that U receives, say, 20% of the light and L 80%. But when the intensity of the light beam is reduced so that only single photons strike the beam splitter at a time, no photons will be registered by the U detector, or the U and L detectors will register 20% and 80% of the photons respectively. It is as if the individual photon can still interfere with itself, as though it were a wave.

This strange behaviour of light can be perfectly described by considering probability amplitudes instead of probabilities, and probability amplitudes change when unobserved, indistinguishable alternatives become distinguishable. And this can lead to paradoxical situations such as quantum 'seeing in the dark', or getting information about something without light ever reaching it.

Probability amplitudes give probabilities when squared, and the rule for combining them was discovered by quantum physicist Richard Feynman. Feynman's rule says that if an event can happen in two or more indistinguishable ways, the probability amplitude for that event is the 'sum' of the probability amplitudes for each way considered separately. The final probability of the event is then obtained as the sum of the squares of the two numbers describing the resultant probability amplitude.
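To make the amplitude arithmetic concrete, here is a minimal Python sketch of the two-beam-splitter arrangement described above. It is my own illustration, not Milburn's: the 1/sqrt(2) amplitudes, the factor of i picked up on reflection, and the function name detector_probabilities are conventions and names assumed purely for this sketch.

```python
import numpy as np

# Each 50/50 beam splitter: amplitude 1/sqrt(2) for transmission and
# i/sqrt(2) for reflection (one common convention, assumed here).
bs = np.array([[1, 1j],
               [1j, 1]]) / np.sqrt(2)

def detector_probabilities(phase):
    """Probabilities at the U and L detectors when an extra phase is
    inserted in one arm between the two beam splitters."""
    photon_in = np.array([1, 0])                 # photon enters one input port
    after_first = bs @ photon_in                 # two indistinguishable paths
    arm_phase = np.array([np.exp(1j * phase), 1])
    amplitudes = bs @ (arm_phase * after_first)  # Feynman's rule: add amplitudes...
    return np.abs(amplitudes) ** 2               # ...then square to get probabilities

print(detector_probabilities(0.0))                          # ~[0, 1]: no photons reach U
print(detector_probabilities(np.pi))                        # ~[1, 0]: all photons reach U
print(detector_probabilities(2 * np.arcsin(np.sqrt(0.2))))  # approx [0.2, 0.8]
```

Adjusting the path difference (the phase) sweeps the U detector's share smoothly between 0% and 100%, reproducing the 20%/80% split mentioned above.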

Feynman's rule is Pythagoras' theorem: a^2 + b^2 = c^2, which tells us how to obtain the length of the hypotenuse of a right-angled triangle from the lengths of the two sides. We learned that in elementary Euclidean geometry at school. It seems that Euclidean geometry enters fundamentally into quantum reality. But why should that be? "Nobody knows how it can be like that," said Feynman.

An ordinary coin-toss provides one bit of information, yes or no. Its quantum counterpart, however, provides anything from one bit to infinity, depending on how many indistinguishable possibilities are generated by beam splitters placed in the path of the photon. To capture this difference, Ben Schumacher coined a new word, qubit. "A qubit is infinity in a coin toss."

Now add quantum entanglement, the correlation between subsystems in a state of quantum superposition, to Feynman's rule, and one comes up with still stranger stuff: the e-bit, or information transfer through the entangled state, and the possibilities of quantum cryptography, teleportation (beam me up, Scotty) and quantum computing.

The popular parable of the entangled state is Schrödinger's cat, which is in a superposition of being dead and alive at the same time. In fact, Feynman's rule already describes the superposition of indistinguishable alternatives, ie, the entangled state.

Quantum computing depends, above all, on the coherent entangled state, a pure state that contains the superposition of multiple, even mutually exclusive, alternatives. The more alternatives are entangled, the faster the quantum computing: it is the ability to ask many questions all at once, rather than one question at a time.
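As a rough illustration of what 'asking many questions at once' means (my sketch, not from the book), a few lines of NumPy show how n qubits in an equal superposition carry amplitudes for all 2^n bit patterns simultaneously, and how a Bell state, the standard textbook example of an entangled pair, correlates two qubits.

```python
import numpy as np

n = 3
plus = np.array([1, 1]) / np.sqrt(2)   # one qubit, an equal superposition of 0 and 1
state = plus
for _ in range(n - 1):
    state = np.kron(state, plus)       # join the qubits into a single state vector

print(len(state))                      # 2**n = 8 amplitudes, one per bit pattern
print(np.round(state, 3))              # each pattern carries amplitude 1/sqrt(8)

# A Bell state: neither qubit has a definite state of its own, and measuring
# one immediately fixes the outcome for the other.
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)     # (|00> + |11>) / sqrt(2)
print(np.abs(bell) ** 2)               # only 00 and 11 occur, each with probability 0.5
```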

The theory of quantum computing is well advanced, but no quantum computer has yet been built. The Department of Defence is supporting a lot of work in this area, perhaps as part of star-wars weaponry. Different bits of hardware are used to create entangled states. These include single ions trapped in a strong electric field, atoms trapped in tiny optical cavities, and nuclear magnetic resonance to create superposition of spins of atomic nuclei in organic molecules such as chloroform. One major problem is decoherence, or loss of coherent superposition, which would make the computer stop working.

Are quantum physicists looking in all the right places? I proposed some time ago that quantum coherence is the basis of living organisation (see "Science and Ethics in a New Key", this issue). The coherence of organisms is actively maintained and extends, in the ideal, over all space-time scales. Could the organism be the model of the quantum computer that quantum physicists are trying to build? Could it be that proteins in the body fold to perfection in split seconds because the process involves quantum computing via infinitely many entangled states that encompass the entire body?

Can a quantum computer simulate reality perfectly? Milburn asserts that "the physical world is a quantum world", which makes "a quantum computer not only possible, but inevitable." I agree only in the sense that the organism may already be a kind of quantum computer.

Milburn goes further: it may take decades or perhaps a century, he says, but "a commercially viable quantum computer is a certainty." I am not so sure of that.

Certainly, a quantum computer could solve problems that a classical computer cannot. But Milburn and others believe that the quantum computer will not only be able to simulate reality, it will be part of the fabric of reality. That should send chills up and down our spines.

Will a quantum hyper-computer take over the world? Will it simulate a human being so exactly that it becomes a hyper-intelligent human being? Well, if it starts to laugh at jokes, I'd be worried. And if it can really simulate a human being perfectly, we had better start setting a good example. Otherwise it has every chance of turning out to be a power-hungry despot intent on enslaving the whole world.

Genetic Risk Assessment without Genes

Methods for Genetic Risk Assessment, edited by David J. Brusick, Lewis Publishers/CRC Press, 1994, ISBN 1-56670-039-6. Angela Ryan reviews.

This book is the result of a joint program some years ago between the United Nations Environment Programme (UNEP) and the International Commission for Protection against Environmental Mutagens and Carcinogens (ICPEMC). But in view of the lack of publications in this area since, we must assume that the volume is still taken seriously by regulators.

US Environmental Protection Agency (EPA) scientists dominate the list of contributors. The rest are from Europe, and one is from the Radiation Effects Research Foundation in Hiroshima, Japan.

This highly technical book about genetic risk assessment risks disqualifying itself as such by not mentioning the mutagenic potential and other harmful effects of transgenic nucleic acid (rDNA and rRNA) even once. (Readers who want to know more about this vital topic should consult Slipping Through the Regulatory Net: 'Naked' and 'Free' Nucleic Acids, by Mae-Wan Ho, Angela Ryan, Joe Cummins & Terje Traavik, Third World Network Biotechnology Series, Penang, 2001.) This is a major omission, particularly as the book states from the outset,

"Chemical substances and their by-products are being released into the environment, on a worldwide basis, at increasing levels. It is estimated that up to 90% of these agents have not been adequately evaluated for their mutagenic activity toward somatic or germ lines of mammalian species. Because the pool of genes that will form all future generations of species is held by the existing individuals, it is essential that their exposure to mutagens be minimized."

The environment of the industrialised world now contains innumerable industrial substances that have never been evaluated for genotoxic (ie, mutagenic) effects. Moreover, the contributors suggest, industry and government would have us remain ignorant of the damage these agents cause. And above all, who will pay for the research? Very few investigations have been carried out, and this trend is set to continue.

The book appeals to developing nations not to repeat the same mistakes that now beset the Northern Hemisphere. It recommends that Third World countries expand pre-market testing programs and employ regulatory action in order to reduce exposures to mutagens. UNEP expresses a firm commitment to providing information already available on chemicals, as well as providing guidance for the implementation of programs for testing and regulation.

Unfortunately, too many toxic industries have already migrated to poor countries in the Third World as regulation tightens up in the north. It would have helped for the scientists involved to call for tougher international regulatory regimes to prevent that from happening.

Risk assessment is explained as a complex function of "perception" and "cultural factors", "economic" and "risk-risk" trade-offs, and the "knowledge and education of the people conducting the risk assessment". Furthermore, the book stresses the importance of considering these factors early in the risk-assessment process, not after release when it is too late.

All that said, the book is a valuable asset for regulatory personnel. It is highly technical yet fully accessible, providing clear explanations at every stage. The scientists are meticulous in covering the ground; the references in some of the chapters extend beyond 20 pages.

It is divided into five main sections: hazard identification, assessment of exposures, methods of assessment, risk characterisation and monitoring.

Identification of the full range of hazards is an enormous task, and the book suggests that the most immediate and simple strategy is to prevent the introduction of new hazardous chemicals. Screening new substances plays the most important role in limiting exposure to genotoxic agents, and it motivates industry to withdraw from using them.

Chemicals already released by established industries require a greater burden of proof before they can be recalled, which demands huge investments. Moreover, there are "high political costs" associated with "remedial action", and the book repeatedly expresses hopelessness in this respect. It suggests other strategies: existing hazards in the environment should be identified so as to prevent further releases in other parts of the world, and epidemiological data will also help to prevent the ongoing spread of hazardous agents. But these strategies will not be effective unless the scientists and the institutions concerned also make a firm stand to change the existing burden of proof and to press for remedial action.

Genotoxicity tests are readily available that effectively and efficiently identify the vast majority of mutagens, including single chemicals and complex mixtures. The universality of DNA is the basis on which any agent detected as a mutagen in one organism can be considered a mutagen for all other organisms, including humans. Nevertheless, there are differences between mice and men in size/dose relationships and levels of DNA repair activity, and different organs differ in their susceptibility.

Methods for assessing risk reveal the complexities of mutagenesis in vivo. Chemical mutagens act in a stage-specific way over time and this has important implications for risk assessment. There are many methods including pharmacokinetic modelling, identification of molecular damage and somatic cell mutagenesis assays. Short-term assays provide rapid yes-no answers and testing systems range from free DNA through prokaryotes to eukaryotes and from isolated cells in vitro to intact animals.

The most conclusive data are considered to be those from in vivo mammalian mutagenicity assays, but these are slow and very expensive, and cannot keep pace with the rapidly expanding group of chemicals with potential mutagenic properties. The book therefore suggests that short-term microbial mutagenicity assays are a "forced" compromise. However, it also promotes the use of in vivo mutagenicity assays based on transgenic mice, particularly when a strong burden of proof is needed.

The value of animal experiments is highly questionable in terms of animal welfare as much as scientific validity (see "Royal Society soft selling GM animals" and "Animal experiments worse than useless", ISIS News 9/10, February 2001). Again, there is no substitute for setting the burden of proof in line with the precautionary principle (see "Use and abuse of the precautionary principle", ISIS News 6, September 2000).

The probability of "hazard" is a function of the sources of the genotoxic substance, and of its occurrence, concentration and bioavailability in contact media. The pertinent issues are the presence and formation of genotoxicants in environmental media, and the contact of humans and biota with these media.

The most alarming aspect of "contact media" is the multifactorial nature of exposure. Industrial chemicals are released on a daily basis into indoor and outdoor air, water, soils, foods, medicines and consumer products and the role of transport and transformation processes between all of these amounts to a chemical concoction of nightmarish proportions.

Exposure pathways for humans are ingestion, inhalation and dermal uptake. I was horrified to learn that my bathroom is probably the most dangerous place in my home. The pores of the skin are wide open in a steaming bathroom, where multitudes of potentially genotoxic agents in bathroom products are rapidly absorbed through the skin. This is especially sinister considering that genetic damage to epithelial and endothelial cells is responsible for the most common cancers in humans.

The authors emphasise that the chemical and physical properties of genotoxic agents, in terms of their ability to react with DNA, also give these agents an ability to react with RNA, proteins and other molecules present in the cell. Therefore these agents induce other toxic effects in addition to genetic damage.

There is evidence to suggest that genetic damage contributes to ageing and cardiovascular disease. It also affects fertility. In fact, the range of effects is vast, as all organ systems and biological processes are subject to somatic genetic disorders at any time from conception to old age.

Moreover, many diseases such as diabetes, psychosis and cardiovascular disease are thought to result from interactions between multiple genes and environmental factors in ways that are not yet fully understood.

The book recommends hazard identification should include prioritizing agents according to their prevalence in the environment. Monitoring is a vital part of hazard identification, as complex mixtures of thousands of different chemicals can be found in the environment. These can be separated out using chemical fractionation for testing.

Various methods for quantifying risk are described along with recommendations for comparing and ranking risks. Emissions from urban, rural, industrial or energy-related activities should be ranked in terms of risk to human populations. The ranking of geographical areas and regions should also be done for the purpose of intervention and control of high levels of genotoxicants.

Monitoring strategies for identifying and tracking these agents demand a tremendous amount of work. For example, approximately 2800 compounds have been identified in ambient air, but information on genotoxic activity is available for less than 11% of these.

The reader is reminded that the human species is dependent on the domesticated plants and animals of agriculture and on certain wild animals such as fish. All organisms are mutually interdependent through complex food chains and biogeochemical cycles of the local and global ecosystems in which they live. Any alteration in ecosystem structure, function and stability from genotoxic agents will have adverse effects.

Interestingly, the examples of adverse ecological effects used by the book are "the emergence of pesticide, herbicide or antibiotic resistance and changes in virulence and host range of pathogens". It states, "The genes controlling these phenotypes may be propagated 'horizontally' from organism to organism and across 'species boundaries'." The author here alludes to a 1987 American Association for the Advancement of Science (AAAS) report from a selected symposium on biotechnology, The Infectious Spread of Engineered Genes, as evidence that genes do spread horizontally in the environment.

One major shortcoming of this book is that it leaves the non-chemist reader blind as to the exact identity of the 10% of chemicals that have been evaluated. Three broad classes of chemicals are discussed in detail, each representing hundreds of different molecules.

Nevertheless, the chemicals to avoid are vinyl chloride and vinyl bromide, used mainly for polymer production; 2-nitropropane, a volatile industrial solvent; and ethylene and ethylene oxide, both of which are important industrial chemicals.

Finally, I reiterate my main criticism of this otherwise important book: the permissive stance it adopts towards industry. Instead of insisting on the precautionary principle and requiring industry to prove products and processes safe before they are approved, the scientists chose to emphasise monitoring after the event, almost in the hope of finding body counts large enough to force remedial action.

This is all the more serious considering the risks of rDNA molecules entering an environment rich in genotoxicants that will induce further rounds of mutation in the rDNA molecules as they transfer horizontally and recombine out of control.

Economic & Cultural Exclusion in a Fragile Earth Won't Go

The Little Earth Book by James Bruges, Alastair Sawday Publishing, 2000, ISBN 1-90197023-X. Nick Papadimitriou reviews.

Recent events in New York alert us once again to the incredible fragility of this world of ours. We are living within a closed system that renders economic and cultural exclusion a potentially lethal combination. A book that challenges the politics of division and dominance is reviewed here.

The pocket-sized Little Earth Book is a compressed encyclopaedia of our world, its degradation by the forces mentioned, and what we can do to prevent this. Concision is the key, and Bruges certainly manages that. In less than 150 pages he takes us on a world tour of the complex issues that need to be taken up and challenged if we want this home of ours, and us on it, to survive. The author aims to put the forces of change in the hands of creative individuals rather than leaving global problem-solving to experts. I can imagine a few copies of this book found their way into the united rucksacks of Genoa.

The book is divided up into small, two- or three-page sections, each devoted to an aspect of the interwoven systems that shape our existence. As well as resource issues to do with water, food and the corporate take-over of both, and the human world of economics and trade, Little Earth Book encourages us to reappraise the world-view that lies at the heart of the shambles we are making of this planet. Thus the assumed primacy of economic growth is challenged in a superb section based on the arguments of Frederick Soddy, the Nobel Prize-winning atomic scientist who applied the second law of thermodynamics to economics. Soddy concluded that science in the service of economics would inexorably result in environmental degradation and political conflict. Other sections of the book look at alternative frameworks for thinking about wealth, having, in passing, challenged the notion of GDP as a measure of "wealth". The alternatives include Citizens' Incomes and the issuing of local scrip notes.

Occasionally, Bruges comes across as a little naïve: the CO2 tax imposed on the developed world to counter the enormity of its "ecological footprint", and carbon trading, could be seen as placing less developed countries more firmly in an economically dependent relationship with the former.

Other sections serve as a primer for understanding the function of the World Trade Organisation (WTO), Third World Debt, Free trade (tellingly subtitled "winners and losers"), and the role of banks.

The topical issue of genetic engineering appears - somewhat falteringly drawn from our own Dr Ho - and organic farming, pollution, ecology and armaments are all covered. Bruges is well clued-up on the way biotechnology is increasingly posing as the new "environmental" approach. The gathering up of patents on ancient herbal treatments, the emergent language of bioremediation, and the chilling quote from Monsanto's CEO that they are dealing with "primary needs: food, shelter and clothing" all show how the biotech giants are projecting an image that is the precise opposite of what they are.

Overall, Little Earth Book is worth reading because it dwells on solutions as well as problems. It is a democratic little manual, treating the reader as an intelligent and responsible world citizen, fully equipped to reach her or his own conclusions.

Each section of the book contains facts highlighted in green if they are deemed sufficiently crucial, surprising or shocking (frequently all three). At the tail-end of each section, the reader is directed to further books - and not always ones that accord entirely with Bruges's take on things.

Incidentally, we picked up our copy of Little Earth Book at the local charity shop. In fact, we found about eight of them there. It is curiously appropriate, and yet sadly indicative, that this challenging book should have found its way to one of the few places in the UK where recycling values are visibly active.


Article first published October 2001