Amazing How Only a Drop of Information Creates New Life

Newborn baby and woman

Today, we are swamped with massive amounts of information from the internet, news, and social media all around us.

Perhaps we have some idea of what information is but haven't given it much thought. We routinely search for and consume information to solve problems, or perhaps to find a recipe for an amazing breakfast.

Without data and the information extracted from it, we are blindfolded and cannot do much of anything. Access to data is the alpha and omega for everybody today - big or small.

A few decades ago, when I was still in graduate school, my colleague and I had intense and in-depth discussions about the nature of information every day on the way home.

Among other things, we wondered why nature has labeled some information as classified, thereby denying us access.

Men walking and discussing the nature of information

At the time, our interest in information itself was mainly academic. As scientists, we were simply curious about how nature works. We wondered whether nature is fundamentally made of pure information and everything else we see around us derives from it.

However, it didn't escape our minds that knowledge of the nature of information might benefit us somehow.

Maybe we could predict the future, make computer algorithms more efficient, or convert information to energy.

On the flip side, we also discussed whether information has a mind of its own, or whether universal evolutionary processes have found a unique path of evolution that will eventually make humans extinct.

We asked ourselves: will artificial intelligence (AI) eventually take over the world? Those discussions were a long time ago, and we have learned a lot since then.

Later, I ended up studying molecular biology and genetics, and afterward I understood the surprising connections between information and life. I explain these in this article.

Information Rules the World Using 3 Reliable Disguises

Albert Einstein's famous equation of mass-energy equivalence unfortunately led to the invention of the nuclear bomb, but also to the more beneficial and peaceful development of nuclear power plants.

It turns out that the creation of information, i.e., a decrease of entropy (disorder), requires energy. An increase of entropy means that information becomes more dispersed and harder to catch, but luckily we are paid an equivalent amount of energy in return.

Without this energy, we would not even exist.

So, there is an absolute equivalence between three forms of nature: energy, matter, and information (measured as entropy). It is like a holy trinity.

Three forms of nature

We cannot destroy any of them. We can only convert one form into another, and the total amount always remains constant. Somehow the three are all the same.

The equivalence has exciting consequences. Among others, we can measure the mass of information.

The amount of information is equivalent to a change in entropy; thus, increased entropy means decreased information, and similarly, decreased entropy means that we gain information.

One intriguing thought is to lose weight by forgetting something irrelevant, but unfortunately, the amount is so minuscule that it is not a practical way to slim down.

How to Measure and Make Information Tangible?

In the 1940s, when telecommunication technology was still in its infancy, one of the main problems was how to transfer messages over long distances.

The signal had to be amplified every few miles to remain detectable, but this also amplified all the noise, which made it hard or impossible to read the message over long distances.

Claude Shannon, a mathematician working at Bell Labs at the time, solved the problem by first suggesting the use of binary code, i.e., 0's and 1's instead of an analogue signal, to send the messages.

Image of Claude Shannon
Claude Shannon

Today, we use binary code in virtually all communications; thus, to say that Shannon's suggestion was successful is an understatement.

Shannon's second idea, his definition of information entropy, earned him the title "father of information theory." His grand insight was to separate the meaning of a message from the amount of information it carries.

He measured information by how surprised one would be to see a message. For example, if we watch yesterday's news today, we are hardly surprised and thus gain no information.

You can find detailed formulations and corresponding equations in the tutorial “Introduction to Information Theory and Its Applications to DNA and Protein Sequence Alignments.” If equations are your thing, you can also take a peek at the article “How to calculate the weight of information.”
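
To make this concrete, here is a minimal Python sketch of Shannon's idea of information as surprise; the probabilities are invented for illustration:

```python
import math

def surprisal_bits(p: float) -> float:
    """Shannon's measure of surprise for an event with probability p, in bits."""
    return math.log2(1 / p)

print(surprisal_bits(1.0))       # yesterday's news today: 0.0 bits, no surprise
print(surprisal_bits(0.5))       # a fair coin flip: exactly 1.0 bit
print(surprisal_bits(1 / 1024))  # a rare event: 10.0 bits, very surprising
```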

Shannon chose to call the fundamental unit of information the "bit," short for "binary digit."

It is confusing that we also measure the amount of data in computers using the unit “bit,” but this computer bit is not equivalent to the unit of information although the name is the same.

In a computer one ‘byte’ is represented by a string of symbols, say ‘01001101’, and consists of eight bits.

Image of a hand touching a vague wall
Although information is an elusive concept, it is tangible.

So, each zero or one in a computer is a bit but can contain less information than one bit as measured in Shannon's unit. For example, data in a computer may consist of random bits; thus, its entropy is very high, and obtaining information from it is very hard.
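
Here is a rough sketch of that difference in Python: a simple frequency count estimates how many Shannon bits each stored computer bit actually carries (a crude estimate that ignores longer-range patterns):

```python
from collections import Counter
from math import log2

def bits_per_symbol(s: str) -> float:
    """Empirical Shannon entropy of a string, in bits per symbol."""
    n = len(s)
    return -sum(c / n * log2(c / n) for c in Counter(s).values())

print(bits_per_symbol("00000000"))  # 0.0: eight computer bits, zero information
print(bits_per_symbol("00000001"))  # ~0.54 Shannon bits per computer bit
print(bits_per_symbol("01001101"))  # 1.0: every computer bit fully used
```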

Can We Make Energy from Information?

Because information is the change in entropy, an entropy increase means decreasing information.

Therefore, by burning coal, gas, oil, and other fuels, we increase entropy and get useful energy in return, and the increase in entropy means a decreased amount of information. Thus, the simple answer is yes, we can get energy from information. But can we take pure information and convert it to energy?

In other words, can we convert information to energy without using conventional means of transferring energy, no heating or moving molecules, but somehow use information itself as a medium?

Is It Possible for a Computer Algorithm to Make Energy?

The idea of using information itself to generate work dates back to a time when the definition of information was still unknown. In 1867, James Maxwell described a thought experiment in which he could make one of two isolated compartments heat up and the other cool down, just by having information about the speed of the molecules.

Maxwell’s idea seemed to violate the second law of thermodynamics, which dictates that it is not possible to create order without any energy input.

It was like making a freezer and an oven work next to each other using pure information alone. The puzzle wasn't solved until over a hundred years later, by Rolf Landauer.

When solving Maxwell’s Demon puzzle, Landauer also proposed a theoretical limit for energy consumption of computation, named Landauer’s principle.

The remarkable consequence of his principle is that it is theoretically possible to make computers cool down instead of heating up.

Read more about Maxwell’s demon and Landauer’s principle in this article.

Recently, Cina Aghamohammadi and James P. Crutchfield conducted studies of thermodynamics of random number generation algorithms and showed that in theory true random number generators could both generate random numbers and simultaneously convert thermal energy to work [1].

Yes, the laws of physics tell us we cannot get energy from nothing, but as we now know, information is not nothing and can thus be converted to energy.

It was Leó Szilárd who in 1929 established equivalence between information and energy.

Japanese Researchers Make the First Practical Information-to-Energy Engine in the World

Shoichi Toyabe and his colleagues at Chuo University in Tokyo were inspired by Maxwell's demon and put it to work in reality. In their 2010 paper, they demonstrated in practice how to convert pure information to energy [2].

energy staircase
Toyabe and his colleagues' experiment is similar to a staircase where each step up illustrates higher energy. A particle in random motion mostly falls downward but occasionally also up, in which case an observer sets a barrier that prevents the particle from falling again.

Their experiment consisted of small polystyrene beads rotating in an electrical field. Most of the time the beads rotate in the direction of the field, but now and then some of them rotate against the electrical field.

Toyabe and colleagues used a video camera to observe the direction of each bead's rotation, and every time a bead rotated against the electrical field, they inverted the direction of the field. This way, they were able to gradually increase the potential of the bead using information alone.

The setup can be compared to a staircase where particles randomly move up and down, and every time a particle moves up, a barrier prevents it from falling back.
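
The staircase picture is easy to simulate. Below is a toy Python model, not the actual experiment: the step probabilities are invented, and the "demon" simply pins a barrier under the particle after every upward fluctuation:

```python
import random

def ratchet(steps: int, p_up: float = 0.4) -> int:
    """Biased random walk with a demon: a barrier is pinned under the
    particle after every upward step, so gains are never given back."""
    position, barrier = 0, 0
    for _ in range(steps):
        if random.random() < p_up:
            position += 1                          # an upward fluctuation
        else:
            position = max(position - 1, barrier)  # falls are blocked at the barrier
        barrier = max(barrier, position)           # the observer locks in any gain
    return position

random.seed(42)
print(ratchet(10_000))  # climbs to ~4,000 steps even though falling is more likely
```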

With this setup, Toyabe and colleagues were able to achieve 28% information to energy conversion efficiency.

However, the experiment relied on a video camera and other equipment; thus, the 28% conversion factor they achieved is an idealized quantity. Nevertheless, this is likely a start that will lead to interesting practical applications in the future.

Nanoscale and quantum-scale processes are very different from what we are used to observing in our macroscopic world. For example, computer algorithms are used in quantum computers to transfer heat among the qubits (quantum bits) to initialize regular quantum computing processes.

What is the Mass of Pure Information?

We know now that energy, mass, and information are the same in different forms.

Therefore, knowing Landauer's principle, which states the minimum amount of energy that must be consumed when one bit is erased, we can calculate the mass of information.

For this, we need to make two assumptions: first, that each bit in a computer contains the maximal amount of information, thereby being equivalent to one bit of Shannon's information, and second, that creating one bit costs the same amount of energy as erasing it.

According to Landauer's principle, we can then establish that one bit contains about 10⁻²¹ joules of energy.

Using Einstein's famous formula of energy-mass equivalence, E = mc², we can calculate the mass of one bit of information: about 10⁻³⁸ kg. A mind-blowingly minuscule amount.
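
We can check both of these figures with a few lines of Python, assuming erasure at room temperature (about 300 K):

```python
from math import log

k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 300.0                 # assumed room temperature, K
c = 2.998e8               # speed of light, m/s

E_bit = k_B * T * log(2)  # Landauer limit: minimum energy to erase one bit
m_bit = E_bit / c**2      # Einstein's E = mc^2, solved for the mass

print(f"{E_bit:.1e} J")   # ~2.9e-21 J, on the order of 10^-21 joules
print(f"{m_bit:.1e} kg")  # ~3.2e-38 kg, on the order of 10^-38 kg
```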

How Much Information Do We Need to Make Life?

Human in geotank

Craig Venter and colleagues' longstanding pursuit of establishing the simplest possible form of life resulted in the first synthetic cell in 2010.

Six years later, in 2016, the team published a synthetic cell named syn3.0, which is entirely different from anything we can find in nature.

Syn3.0 contains only 473 genes, and the total length of the genome is 531,000 base pairs. That is 531,000 letters of A, T, C, and G.

Although not found in nature, this organism is the smallest known life form. In comparison, the human genome contains about 20,000 genes (by some estimates up to about 30,000).

So what's the information content of this mini life form? Each DNA base pair can carry at most two bits of information; thus, the total amount of information in this minimalist genome is 531,000 × 2 = 1,062,000 bits, or about one megabit, which is in the range of a single decent-size image stored on a computer.

The human genome is about three billion bases long, but we inherit genomes from both our parents, so a complete human has six billion bases altogether. Thus, to conceive a human baby takes at most only 1.2×10⁻¹⁶ milligrams of information, all packaged into the chromosomes we inherit from our parents - not even the weight of a single grain of sand!
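
The bit arithmetic is easy to verify; here is a small Python sketch using the theoretical maximum of two bits per base pair:

```python
def genome_bits(base_pairs: int) -> int:
    """Maximum information content of a genome: 2 bits per base pair."""
    return 2 * base_pairs

print(genome_bits(531_000))        # syn3.0: 1,062,000 bits, about one megabit
print(genome_bits(6_000_000_000))  # human zygote (both parental genomes): 1.2e10 bits
```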

To make a plant requires more information than to make you

Wheat
Close-up photography of a wheat plant. Photo by Karina Vorozheeva on Unsplash

The size of the human genome is about three billion bases, but the bread-wheat genome is about five times that.

On the other hand, the soybean genome is only about 1.1 billion bases long but contains an estimated 64,000 genes.

So, in comparison, it takes about six gigabits of information to code a human and about 32 gigabits to code bread wheat.
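
The same two-bits-per-base arithmetic in Python, using approximate published genome sizes (the exact figures vary by source and are assumptions here):

```python
genomes_bp = {            # approximate genome sizes in base pairs
    "human": 3.1e9,
    "bread wheat": 16e9,  # roughly five times the human genome
    "soybean": 1.1e9,
}
for name, bp in genomes_bp.items():
    print(f"{name}: about {2 * bp / 1e9:.0f} gigabits")
# human: about 6 gigabits; bread wheat: about 32 gigabits; soybean: about 2 gigabits
```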

It is intriguing that many plants need more information than a human, which we generally regard as the more complex organism. Surely they aren't smarter than us?

Perhaps the answer lies in the unique path evolution has taken with humans. Bread wheat has no brain, but humans do, and we can hopefully use that fact to collect more information.

It looks like the great information revolution isn’t human-made, but a path taken and directed by natural evolution. More about this towards the end of this article.

What Is the Mass of Information Making Up All Humankind?

Let's first consider an individual. The complete blueprint of how to make an individual human being is encoded in his or her genome, which is about six billion letters, or base pairs, consisting of A, G, C, and T, totalling about two meters in length.

Each cell in our bodies contains a complete copy of the genome, which we inherited from our parents. It is not possible to count the exact number of cells in a human body, but a reasonable estimate is about 30 trillion, that is, 30,000,000,000,000. Currently a larger number than the US government debt in dollars.

Therefore, we each carry nearly 30 trillion copies of six billion base pairs in our bodies, directing functions from the brain to all the different tissues. That totals about 1.8×10²³ letters, and if stretched out, the total length of DNA could go around the Earth 1.5 million times.
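
A quick sanity check of these totals in Python, taking the estimates above and Earth's circumference of about 40,075 km:

```python
cells = 30e12                    # estimated cells in a human body
bases_per_cell = 6e9             # both parental genome copies
dna_length_per_cell_m = 2.0      # ~2 meters of DNA per cell
earth_circumference_m = 4.0075e7

print(f"{cells * bases_per_cell:.1e} letters")  # ~1.8e+23 letters in one body
laps = cells * dna_length_per_cell_m / earth_circumference_m
print(f"{laps:.1e} times around the Earth")     # ~1.5e+06 laps
```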

The amount of information is genuinely enormous, and the DNA does have a physical weight we could measure on a scale.

However, disregarding the physical dimensions of the DNA, we arrive at about 2×10⁻¹⁰ milligrams of pure information per individual.

Extrapolated to the population of the Earth, estimated at about seven billion, the total weight of the information making up all humankind is only about one milligram, about a grain of sand!
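
Taking the article's per-individual estimate at face value, the extrapolation is a one-liner:

```python
info_mass_per_person_mg = 2e-10  # the article's estimate for one individual
population = 7e9                 # rough world population
print(f"{info_mass_per_person_mg * population:.1f} mg")  # 1.4 mg, about a grain of sand
```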

Thanks to Einstein, we know that mass and energy are equivalent and that mass consists of an enormous amount of energy.

Given the massive amount of energy stored as matter, perhaps we should not be so surprised that the total information weight of all humankind is only about one milligram.

At least if converted to energy it would not destroy an entire large city. Hopefully, we use the information wisely.

Can a nuclear explosion make valuable information disappear?

Nuclear weapons test
A nuclear weapons test in the Pacific.

The atomic bomb named Little Boy, dropped on Hiroshima in 1945, destroyed the whole city yet converted only about 0.8 grams of matter to energy. Later, Fat Man, dropped on Nagasaki, turned only about 1 gram of its mass into energy.

The most massive thermonuclear bomb ever detonated, the Russian Tsar Bomba, with the equivalent of 50 megatons of TNT, converted a whole 2.3 kg of matter to energy and in the process nearly killed the pilot who dropped the bomb.

The Tsar Bomba increased entropy by about 1.528 terabits. However, as we already noted, the explosion converted mass to energy, and thus no information was destroyed in the process.

Sure, the entropy increased a lot, and because information is the change in entropy, the amount of information decreased, but it was not destroyed, only converted to another form. The concept is mind-boggling, but that is how physics works as far as we know today.

Will AI Eventually Make Us Extinct?

Organisms are continually competing for space and evolving into novel niches to survive. We can find bacteria almost everywhere; even an average human carries about 1.5 kg of them.

Bacteria have evolved to feed on diverse energy sources. Cyanobacteria, for example, use sunlight, and other types can consume a wide variety of organic or inorganic compounds, even oil and plastic.

When energy is available, they can divide rapidly and form large colonies. E. coli can double its colony size every 20 minutes; therefore, it can spread fast when an energy source is available.
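
Under idealized assumptions (unlimited nutrients, no crowding), the doubling arithmetic shows why; a quick Python illustration:

```python
doubling_time_min = 20
hours = 10
doublings = (hours * 60) // doubling_time_min  # 30 doublings in 10 hours
print(2 ** doublings)  # one cell -> 1,073,741,824 cells, over a billion
```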

Bacteria are also evolving resistance to many known antibiotics, thereby presenting a direct threat to our health.

Beyond the diversity of bacteria, life on earth spans an astonishing range, from the deepest darkness of the ocean floor up to the sky.

There are extremophiles living half a mile under the ice of Antarctica, and others that are happy to live inside nuclear reactors.

Given the enormous diversity of life that has already evolved, it is hard to imagine yet another new kind. Perhaps we cannot see the forest because the trees are blocking the view?

We saw that plant genomes can be several times larger and contain many more genes than the human genome; thus, at birth, plants contain more information than a human being.

The big difference though is that plants don’t have brains that can collect and organize information. Many animals do have a brain, but humans are the masters of using it, at least for the time being.

It looks like the great information revolution isn’t man-made but a path taken and directed by natural evolution.

Because of the brain, humans can evolve during their lifetime by collecting information and rapidly surpass the amount of information contained in any plant.

The brain, an information storage and processing device, has enabled the tiny amount of information we carry at birth to grow collectively into what we call the information society.

Let's look at the next step in this evolution. Various technologies have provided the means to collect and store data, and the amounts are growing exponentially every second.

It wasn't so long ago that standard machine learning methods were superior to artificial neural networks, but because of the humongous amount of data, big data, the tables have now turned.

In many cases, if not most, artificial neural nets today outperform human-devised approaches to data analysis, thanks to the massive amount of data available today that didn't exist when the older machine learning methods were still king.

Can we see the forest? Let’s follow the money trail, in this case consisting of information.

Within all the remarkable diversity of life on earth, I can see one niche in which evolution has not yet finished her experiments.

We know that data and information extracted from it are essential for the survival of any company and also for nation states. The one who has the right information has the edge.

Artificial intelligence (AI) is already in extensive use and is predicted to take over many jobs soon. However, the lost employment opportunities aren't the point here; the point is that AI is increasingly utilized because it is a necessity for maintaining a competitive business edge; thus, the spread of AI is likely to be unstoppable.

An estimated 2.5 quintillion bytes of data are generated each day, and AI is deemed to be a superior method of analysis.

However, most of the time we have no idea about the details of these analyses, and we are left with a magic black box that, at least for the time being, serves us well.

Woman wearing jacket

On another front, engineering technology is also developing exponentially: robot designs, data storage devices, cars, and computers that may soon be based on quantum technology.

Not surprisingly, we can expect AI to steer all these 'gizmos,' even if we are not yet controlled by AI ourselves. Meanwhile, we are continually feeding AI with an ever-increasing amount of data that it turns into information. A novel source of food.

It certainly looks like natural evolution has found a novel, yet untested, path to try out.

Several prominent people, including Stephen Hawking, Elon Musk, Bill Gates, and Tim Berners-Lee, have expressed concerns about AI going beyond human control.

It is interesting that human beings, made up of such a tiny mass of information, can give rise to information amounting to several times the information weight of all humans on earth.

The next step may be a whole new concept of life – AI.


References

[1] Aghamohammadi C. and Crutchfield J.P. (2017). "Thermodynamics of Random Number Generation." Santa Fe Institute Working Paper 2017-01-02. arXiv:1612.08459.

[2] Toyabe S., Sagawa T., Ueda M., Muneyuki E. and Sano M. (2010). "Experimental demonstration of information-to-energy conversion and validation of the generalized Jarzynski equality." Nature Physics 6, 988–992. doi:10.1038/nphys1821.

[3] Data Never Sleeps 5.0. domo.com.

[4] Shannon C.E. (1948). "A mathematical theory of communication." Bell Syst. Tech. J. 27, 379–423, 623–656.

[5] Szilard L. (1929). "Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen." Z. Phys. 53, 840–856. doi:10.1007/BF01341281.

[6] Fisher R.A. (1935). "The logic of inductive inference." J. R. Stat. Soc. 98, 39–82. doi:10.2307/2342435.

[7] Kullback S. (1959). Information Theory and Statistics. New York, NY: Wiley.

[8] Shannon C.E. and Weaver W. (1962). The Mathematical Theory of Communication. Urbana, IL: The University of Illinois Press.

[9] Jaynes E.T. (2003). Probability Theory: The Logic of Science. Cambridge, UK: Cambridge University Press.

[10] Karnani M., Pääkkönen K. and Annila A. (2009). "The physical character of information." Proc. R. Soc. A 465(2107), 2155–2175. doi:10.1098/rspa.2009.0063.

[11] Knott C.G. (1911). "Quote from undated letter from Maxwell to Tait." In Life and Scientific Work of Peter Guthrie Tait. Cambridge University Press, pp. 213–215.

[12] Landauer R. (1961). "Irreversibility and heat generation in the computing process." IBM Journal of Research and Development 5(3), 183–191. doi:10.1147/rd.53.0183.

[13] Rothschild L.J. and Mancinelli R.L. (2001). "Life in extreme environments." Nature 409, 1092–1101. doi:10.1038/35059215.

[14] Hutchison C.A. III, Chuang R.-Y., Noskov V.N., Assad-Garcia N., Deerinck T.J., Ellisman M.H., Gill J., Kannan K., Karas B.J., Ma L., Pelletier J.F., Qi Z.-Q., Richter R.A., Strychalski E.A., Sun L., Suzuki Y., Tsvetanova B., Wise K.S., Smith H.O., Glass J.I., Merryman C., Gibson D.G. and Venter J.C. (2016). "Design and synthesis of a minimal bacterial genome." Science 351(6280), 1414.

[15] Gibson D.G., Glass J.I., Lartigue C., Noskov V.N., Chuang R.-Y., Algire M.A., Benders G.A., Montague M.G., Ma L., Moodie M.M., Merryman C., Vashee S., Krishnakumar R., Assad-Garcia N., Andrews-Pfannkoch C., Denisova E.A., Young L., Qi Z.-Q., Segall-Shapiro T.H., Calvey C.H., Parmar P.P., Hutchison C.A. III, Smith H.O. and Venter J.C. (2010). "Creation of a bacterial cell controlled by a chemically synthesized genome." Science 329(5987), 52–56. doi:10.1126/science.1190719.

[16] Bérut A., Arakelyan A., Petrosyan A., Ciliberto S., Dillenschneider R. and Lutz E. (2012). "Experimental verification of Landauer's principle linking information and thermodynamics." Nature 483, 187–189.

[17] Boykin P.O., Mor T., Roychowdhury V., Vatan F. and Vrijen R. (2002). "Algorithmic cooling and scalable NMR quantum computers." Proceedings of the National Academy of Sciences 99(6), 3388–3393. doi:10.1073/pnas.241641898.