The Omnipresent Binary Code
Megabyte, gigabyte, terabyte - many researchers know these terms from storing their experimental data. But what are petabytes, quantum computers, motes and nodes? Possibly the electronic building blocks that could revolutionize science. These electronic helpers, which experts say will change the scientific world view as profoundly as the development of the scientific experiment itself, are already in use - in Norwegian ice fields, in American soil and soon also on the ocean floor. And that is just the beginning. Computer experts whisper that the all-encompassing measurement of the world will soon be possible: 24 hours a day, seven days a week, year in, year out.
This dream is to be realized with billions of small computers that record data with sensors and forward it over wireless links to their numerous accomplices - a network of miniature spies that, for the first time, could send data to the laboratories of the world in real time and on a large scale. Motes, nodes or pods are the names of the tiny devices on which the hopes of many scientists rest. Ecologist Katalin Szlavecz from Johns Hopkins University in Baltimore now also relies on the intelligent sensors. Szlavecz researches biodiversity and nutrient cycles in the soil - a complex undertaking that has so far been hampered by the limited amount of data: only samples collected by hand could be examined.
Wireless sensor networks could revolutionize soil ecology research
(Katalin Szlavecz) Since last September, however, the required data has no longer been measured by students, but by ten electronic sensors that have been installed in the ground on the edge of the university campus. They record the temperature and humidity of the surrounding soil every minute and regularly send the data to Szlavecz's office. The researcher is enthusiastic: "Wireless sensor networks could revolutionize research into soil ecology," she says [1].
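What such a reporting loop might look like in principle can be sketched in a few lines of Python. Everything below - the sensor driver, the network address and the data format - is invented for illustration and has nothing to do with the firmware actually running on Szlavecz's motes:

# Minimal sketch of a soil-sensor node's reporting loop (hypothetical API and
# placeholder address; real motes run vendor firmware, not this script).
import json
import socket
import time

BASE_STATION = ("192.0.2.10", 9000)   # placeholder address of the receiving office PC

def read_soil_sensor():
    """Stand-in for the real hardware driver: returns temperature (C) and moisture (%)."""
    return {"temperature_c": 21.4, "moisture_pct": 33.0}

def run_node(node_id, interval_s=60):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        reading = read_soil_sensor()
        reading.update({"node": node_id, "timestamp": time.time()})
        # One small datagram per minute keeps radio use - and battery drain - low.
        sock.sendto(json.dumps(reading).encode(), BASE_STATION)
        time.sleep(interval_s)

if __name__ == "__main__":
    run_node("campus-edge-01")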
At the moment, however, the development of wireless sensors is still in its infancy. The minicomputers are simply too expensive to be used efficiently and on a large scale, explains Deborah Estrin, Director of the Center for Embedded Networked Sensing in Los Angeles.
The usability of the networks also leaves a lot to be desired: Szlavecz needed a whole team of computer experts and programmers to adapt her sensors to her particular needs. Kris Pister, a pioneer of sensor networks and founder of the company Dust Networks, also sees this as a central shortcoming. Nevertheless, he firmly believes that sensor networks could be the scientific assistants of the future - after all, they create the basis for huge databases that scientists from all over the world could then query.
Data and masses of data
Gaetano Borriello, a computer specialist at the University of Washington in Seattle, is convinced that the collection of such amounts of data could also fundamentally change scientific work [1]. Instead of conducting their own experiments, in the future researchers could simply comb through existing databases. Hypothesis and proof would be mere hours apart.
But how should all this information be stored? According to Alexander Szalay from Johns Hopkins University in Baltimore and Jim Gray from Microsoft Research in San Francisco, California [2], the amount of available data is already doubling every year. So far, even terabytes (a terabyte is 1000 gigabytes) of information have posed major challenges for data processing. But soon, the two scientists predict, projects like the Large Synoptic Survey Telescope, which aims to create three-dimensional maps of space, will produce even larger amounts of data: possibly several petabytes per year. One petabyte is equivalent to the text of approximately one billion books.
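To make the scale concrete, a rough back-of-the-envelope calculation shows where the comparison with a billion books comes from - assuming, as a round figure, about one megabyte of plain text per book:

# Back-of-the-envelope check of the "one petabyte ~ one billion books" comparison,
# assuming roughly one megabyte of plain text per book (a few hundred pages).
PETABYTE = 10**15          # bytes (decimal, matching "a terabyte is 1000 gigabytes")
BYTES_PER_BOOK = 10**6     # ~1 MB of text per book (assumption)

books_per_petabyte = PETABYTE / BYTES_PER_BOOK
print(f"{books_per_petabyte:.0e} books per petabyte")   # -> 1e+09, about a billion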
The big challenge here is not the creation of large databases; those have been part of scientific work for years. The problem is making use of the masses of data. Because the stored data exist only virtually, it is becoming ever harder to grasp connections and recognize structures. Binary code offers nothing to look at, and the computer expert Stephen Muggleton from the Center for Integrative Systems Biology at Imperial College in London is not the only one who fears that the constant growth of data could make understanding it increasingly difficult [3].
Databases: standard without standards
Current databases are not only lacking in tactile and visual user-friendliness; there are also formal obstacles to all-encompassing data access and data exchange. For although more and more experiments rely on computers as an essential component, and research groups increasingly use other experts' data in their own work, there are still no standards for how databases are used.
In the best case, only a few formats in the foreign data have to be converted before they can be combined with one's own material. It becomes harder when differing vocabulary makes the experiments difficult to compare, or when important boundary conditions of an experiment are missing from the database or recorded in a way that is hard to interpret. Szalay and Gray therefore call for fixed standards for how databases are set up [2].
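A toy example illustrates the kind of hand-written conversion that is needed today as long as such standards are missing; all field names, units and values here are invented:

# Toy illustration of the mismatch problem: two groups store the same soil
# measurement under different field names and units (all names invented here).
foreign_record = {"temp_F": 70.5, "soil_moisture": 0.33, "site": "plot-7"}

# A mapping like this has to be written by hand for every foreign database
# as long as there is no agreed standard for field names and units.
def to_local_schema(record):
    return {
        "temperature_c": (record["temp_F"] - 32) * 5 / 9,   # Fahrenheit -> Celsius
        "moisture_fraction": record["soil_moisture"],
        "location": record["site"],
    }

print(to_local_schema(foreign_record))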
But even fixed standards do not protect research from two of the central problems of database use: the poor verifiability of entered data and the aging of the systems. Szalay and Gray point out that it is already difficult to record every component that goes into an experimental setup. Electronic databases add a further problem: the technology is constantly evolving. Tomorrow's systems might no longer be able to read today's data, and today's experiments could no longer be repeated if nobody can operate the outdated devices and programs.
Programmers as scientists of the future
According to Szalay and Gray, such obstacles can only be overcome with strict and formal data management and the appropriate training of scientists and students. Already, says Ian Foster, director of the Computation Institute at the University of Chicago, the computer is assuming the place in science that mathematics once had: that of an underlying foundation [4]. The laboratory of tomorrow must also take this fact into account: with the creation of interdisciplinary institutes and the permanent involvement of programmers and computer specialists in experiment and theory.
From the Limits of Language to the Limits of Binary Code
For some researchers, the formal language of programmers is no longer just a means of storing data; Roger Brent from the Molecular Sciences Institute in Berkeley and his colleague Jehoshua Bruck from the California Institute of Technology, for example, see in the strict regularity of programming languages the possibility of revolutionizing biology [5].
Especially since the decoding of the genome and the study of how it works and how it responds to environmental conditions, they explain, biology has struggled to represent these complex relationships. The formal language of program code could help. It is time, they argue, to leave the limits of descriptive language behind and, as physics has done, to arrive at the regularity of binary code: how life works, expressed in tags and if statements.
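What such a formal description might look like can be hinted at with a deliberately simplified sketch: the textbook logic of the lac operon written as a conditional instead of descriptive prose. The numerical thresholds are invented, and this is an illustration of the general idea rather than anything proposed by Brent and Bruck themselves:

# Deliberately simplified sketch of a regulatory relationship written as code
# rather than prose (thresholds are invented; the lac-operon logic is the
# standard textbook example: transcription needs scarce glucose and present lactose).
def lac_operon_active(glucose, lactose):
    return glucose < 0.1 and lactose > 0.0

print(lac_operon_active(glucose=0.05, lactose=1.2))   # True: operon switched on
print(lac_operon_active(glucose=0.8,  lactose=1.2))   # False: repressed while glucose is plentiful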
While the interplay of zeros and ones is still something new for biology, some areas of physics have already broken new ground. The magic word is: quantum computers. Although the idea of the quantum computer is almost as old as that of today's calculating machines, the abstract discipline has only gained momentum in the last few years. No wonder, since quantum mechanics and quantum theory are among the most complicated areas of physics today.
Unlike binary computers, quantum computers store information not only in the classical values zero and one, but also in so-called qubits. These are quantum-mechanical two-level systems that can behave like classical bits on the one hand, but can also interact with other qubits on the other. Computational steps in a quantum computer therefore do not have to be carried out one after the other; following the complex rules of quantum theory, they can run in parallel and thus considerably faster.
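Why the state of a handful of qubits is so much richer than that of the same number of classical bits can be illustrated with a few lines of ordinary code. This is pure bookkeeping on a classical computer, not a quantum computation: n qubits require 2^n complex amplitudes, which is exactly what makes them so hard to simulate - and so powerful:

# Classical bookkeeping only: n qubits require 2**n complex amplitudes,
# which is why quantum state spaces grow so quickly.
import numpy as np

def uniform_superposition(n_qubits):
    dim = 2 ** n_qubits
    # Equal amplitude on every basis state, e.g. |000>, |001>, ... for three qubits.
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

state = uniform_superposition(3)
print(len(state))                      # 8 amplitudes for 3 qubits
print(np.sum(np.abs(state) ** 2))      # probabilities sum to 1.0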
An operational quantum computer in 2020 seems realistic to me
(Andrew Steane) Quantum computers would have an advantage over today's binary computers, especially for very complex arithmetic operations and large amounts of data; and because of their very construction, they also seem better suited to calculating the quantum nature of molecules and materials. However, quantum computers are not yet ready for use. The technical hurdles have so far been too great. Nevertheless, the quantum physicists are certain of their success: "An operational quantum computer in the year 2020 seems realistic to me," explains Andrew Steane, a member of the quantum computing group at Oxford University [6].
The quantum computer – the computer of the future? Hardly. It is simply too specialized for general use, explains Isaac Chuang from the Massachusetts Institute of Technology in Cambridge. But even if the quantum computer remains a niche product in 2020 and binary code continues to be part of everyday life, one thing already seems certain today: it will no longer be enough for the researcher of tomorrow to be just a biologist or physicist. According to the experts, the ideal researcher of the future combines three professions at once – scientist, technician and programmer. A new animal universale, so to speak.