Countless biological processes underpin the behaviors, physiology, and existence of living organisms, involving intricate communication between cells and other molecular constituents. These molecular entities employ diverse means to transmit information, such as diffusion, electrical depolarization, and the exchange of mechanical waves.
A recent investigation conducted by researchers at Yale University focused on quantifying the energy expended in the transmission of information between cells and molecular elements. Their findings, detailed in a paper published in Physical Review Letters, unveil a novel tool that holds promise for probing cellular networks and gaining deeper insights into their functioning.
Benjamin B. Machta, one of the researchers behind the study, explained, “We’ve contemplated this project in various forms for some time. My initial discussions on the topic, which eventually evolved into this project, date back to a decade ago when I spoke with my Ph.D. advisor, Jim Sethna. However, it took some time to materialize. Sam and I began discussing it while contemplating the energy costs required for biological computations, a theme central to much of his Ph.D. work. He eventually devised the means to carry out these computations.”
Machta and his colleague, Samuel J. Bryant, drew inspiration from earlier research published in the late 1990s, particularly the efforts of Simon Laughlin and his collaborators. Back then, this group had endeavored to empirically determine the energy expenditure of neurons when transmitting information.
“Laughlin and his team found that this energy expenditure ranged from 10⁴ to 10⁷ kBT/bit, depending on specific details, significantly surpassing the ‘fundamental’ bound of approximately kBT/bit, often referred to as the Landauer bound, which represents the minimum energy required to erase a bit of information,” Machta elucidated. “We were curious whether this indicated inefficiency in biology or if there were other expenses to consider. Notably, the Landauer bound doesn’t account for geometry or physical intricacies. Applying it is nuanced, because it only constrains the erasure of information—computation itself can, in principle, be carried out reversibly at no energetic cost—although that’s not our primary focus here.”
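To put these figures in concrete units, the following sketch evaluates the Landauer bound, kBT·ln(2) per erased bit, and the range Laughlin measured. The physiological temperature of 310 K is an illustrative assumption, not a value from the paper.

```python
from math import log

# Landauer bound: erasing one bit dissipates at least k_B * T * ln(2).
# T = 310 K is an assumed, roughly physiological temperature.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 310.0           # temperature, K

landauer_joules = k_B * T * log(2)
print(f"Landauer bound at {T} K: {landauer_joules:.2e} J/bit")

# Laughlin's measured neuronal costs, quoted as 1e4 to 1e7 k_B*T per bit:
for factor in (1e4, 1e7):
    print(f"{factor:.0e} kBT/bit = {factor * k_B * T:.2e} J/bit")
```

Even at the low end, the measured cost sits four orders of magnitude above the Landauer limit, which is the gap the study sets out to explain.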
Another objective of Machta and Bryant’s recent study was to ascertain whether optimizing these energetic costs could offer insights into why molecular systems employ distinct physical mechanisms for communication in different contexts. For example, while neurons predominantly communicate through electrical signals, other cellular entities utilize chemical diffusion for communication.
“We aimed to discern the energy-efficient regime for each mechanism, among others, concerning energy cost per bit,” Machta stated. “In all our calculations, we considered information transmission through a physical channel, from a physical sender of information (such as a ‘sending’ ion channel that opens and closes to transmit a signal) to a receiver (e.g., a voltage detector in the membrane, which could also be an ion channel). The core of our calculation rests on the information rate through a Gaussian channel, albeit with some novel aspects.”
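The Gaussian channel Machta mentions is a standard object in information theory: its information rate follows Shannon's formula, C = B·log₂(1 + S/N). The snippet below is a generic textbook illustration of that formula, not the paper's actual calculation, and the numbers are purely hypothetical.

```python
from math import log2

def gaussian_channel_capacity(bandwidth_hz: float,
                              signal_power: float,
                              noise_power: float) -> float:
    """Shannon capacity of a Gaussian channel in bits per second:
    C = B * log2(1 + S/N)."""
    return bandwidth_hz * log2(1.0 + signal_power / noise_power)

# Illustrative example: a 1 kHz channel whose signal power is 10x
# the thermal noise power can carry roughly 3.5 kbit/s.
rate = gaussian_channel_capacity(1e3, 10.0, 1.0)
print(f"{rate:.0f} bits/s")
```

In the paper's setting, the noise power is not a free parameter but is fixed by the thermal fluctuations of the cellular environment, which is where the fluctuation-dissipation theorem enters.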
Notably, their estimates always took into account a physical channel through which physical particles and electrical charges flowed according to cellular physics. Moreover, they assumed that the channel was affected by thermal noise within the cellular environment.
“We can compute the spectrum of this noise using the ‘fluctuation dissipation theorem,’ which relates the spectrum of thermal fluctuations to the near-equilibrium response functions,” Machta elaborated.
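A classic instance of the fluctuation-dissipation theorem, offered here only as an analogy to the kind of noise spectrum the authors compute, is Johnson–Nyquist noise: the voltage-noise power spectral density of a resistor is S_V = 4·kB·T·R, tying the spectrum of thermal fluctuations directly to a dissipative response coefficient (the resistance). The membrane-scale resistance below is an assumed, illustrative value.

```python
k_B = 1.380649e-23  # Boltzmann constant, J/K

def johnson_noise_psd(resistance_ohm: float, temperature_k: float) -> float:
    """Voltage-noise power spectral density S_V = 4*k_B*T*R, in V^2/Hz
    (classical, low-frequency limit)."""
    return 4.0 * k_B * temperature_k * resistance_ohm

# Assumed gigaohm-scale resistance, typical order for cell membranes,
# at a physiological 310 K:
psd = johnson_noise_psd(1e9, 310.0)
print(f"{psd:.2e} V^2/Hz")
```

The signal the sender injects must be resolvable against exactly this kind of thermally fixed noise floor, which is why the cost per bit picks up geometric factors rather than staying at the bare Landauer value.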
A distinctive feature of their estimates was the use of deliberately simple models, which allowed them to establish conservative lower bounds on the energy necessary to drive a channel and carry physical currents in a biological system.
“Since the signal must overcome thermal noise, we typically find costs with a geometric factor multiplying ‘kBT/bit,'” Machta explained. “This geometric factor can be related to the size of the sender and receiver; a larger sender usually reduces the cost per bit by distributing a dissipative current over a larger area. Similarly, a larger receiver enables more averaging of thermal fluctuations, ensuring that a weaker overall signal can still convey the same information.”
For example, in the context of electrical signaling, they derived a cost per bit scaling as (r²/σIσO) kBT/bit, where r represents the distance between the sender and receiver, and σI and σO denote the sizes of the sender and receiver. Importantly, for ion channels, which are only a few nanometers wide but transmit information across microns, this cost could be orders of magnitude greater than the kBT/bit suggested by simpler or more fundamental arguments as a lower bound.
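A quick order-of-magnitude check of that geometric factor, using assumed sizes (ion channels a couple of nanometers across, signaling over a micron), shows how large the multiplier on kBT/bit can get:

```python
# Illustrative numbers, not taken from the paper:
r = 1e-6          # sender-receiver distance: 1 micron, in meters
sigma_I = 2e-9    # sender (ion channel) size: 2 nm
sigma_O = 2e-9    # receiver size: 2 nm

# Geometric factor from the scaling form (r^2 / (sigma_I * sigma_O)) kBT/bit
geometric_factor = r**2 / (sigma_I * sigma_O)
print(f"cost per bit ~ {geometric_factor:.1e} kBT/bit")
```

With these numbers the factor is of order 10⁵, comfortably within the 10⁴–10⁷ kBT/bit range that Laughlin's measurements reported for neurons.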
In summary, Machta and Bryant’s calculations confirm that transferring information between cells carries a substantial energetic cost. These estimates might serve as a starting point for explaining the high cost of information processing observed in experimental studies.
Machta noted, “Our explanation is less ‘fundamental’ than the Landauer bound, as it relies on the geometry of neurons and ion channels, among other details. However, if biology adheres to these intricacies, it suggests that systems like neurons may be operating efficiently within real information and energy constraints, rather than being inherently inefficient. These calculations are insufficient to declare any particular system efficient, but they do suggest that transmitting information through space can incur substantial energy costs.”
Looking ahead, the recent work by Machta and his colleagues could inform further biological investigations. Their paper introduces a ‘phase diagram’ that delineates scenarios where the selective adoption of specific communication strategies (e.g., electrical signaling, chemical diffusion) proves optimal. This diagram may facilitate a deeper understanding of the design principles underlying diverse cell signaling strategies, shedding light on why neurons employ chemical diffusion at synapses but rely on electrical signals for long-distance information transmission, and why E. coli bacteria employ diffusion to communicate their chemical environment.
Machta concluded, “One avenue we are currently exploring is applying this framework to comprehend the energetics of a concrete signal transduction system. Our recent work focused on the abstract cost of information transmission between two single components, whereas real systems typically involve information processing networks. Applying our bounds necessitates an understanding of information flow within these networks, which presents its own set of technical challenges, especially when applied to specific geometries (such as a ‘spherical’ neuron or an axon resembling a tube, each distinct from the infinite plane we considered).”