Individual humans make decisions all the time. These decisions often involve integrating a variety of contextual cues so that the choice fits the circumstances. The wealth of information required to make decisions is provided by our senses: each perceives a distinct aspect of our environment, such as visual or auditory information, which our brain subsequently integrates into a holistic percept. This is called multisensory – or multimodal – perception.
Single cells are no different than humans in this regard. They constantly make important decisions, such as whether to divide or not. Researchers at the University of Zurich (UZH) therefore extended the concept of contextual, multimodal perception found in humans to individual cells. And surprisingly, they found that single cells make decisions much more autonomously than previously thought. “Adequate decision-making by individual cells uses multimodal perception, allowing cells to integrate outside signals like growth factors with information from inside the cell, such as the number of cellular organelles,” says Lucas Pelkmans, professor at the Department of Molecular Life Sciences at UZH.
In certain situations, such internal cues can overrule outside stimuli. In tumors, for example, the internal state of particular cells can override treatment with anti-proliferative drugs, making those cells treatment-resistant. “Such resistance to drugs is a major problem in the fight against cancer. The solution may come from taking into account the contextual cues that individual cells experience and ultimately altering them,” Pelkmans says.
To test whether cells decide according to contextual, multimodal perception as humans do, the researchers had to concurrently measure the activity of multiple signaling nodes – the cells’ outside sensors – as well as several potential cues from inside the cell, such as the local environment and the number of cellular organelles. All of this had to be analyzed both within single cells and across millions of cells. “To do this, we used ‘4i’, a method developed at UZH, which allows us to simultaneously visualize and quantify up to 80 different proteins and protein modifications in single cells using fluorescence microscopy,” says Bernhard Kramer, first author of the study.
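For readers who work with such data, the sketch below illustrates one way a multiplexed single-cell readout of this kind might be organized for analysis: one row per cell, one column per measured feature. The column names, values, and use of pandas are illustrative assumptions for this sketch, not the study’s actual data format.

```python
# Illustrative only: a hypothetical layout for multiplexed single-cell 4i data.
# Column names and values are invented; the real dataset contains up to
# 80 protein / protein-modification readouts per cell.
import pandas as pd

# One row per cell, one column per measured feature.
cells = pd.DataFrame({
    "cell_id":       [1, 2, 3],
    "p_ERK":         [0.82, 0.31, 0.55],   # activity of a signaling node ("outside sensor")
    "p_AKT":         [0.40, 0.27, 0.61],
    "mito_count":    [310, 190, 260],      # internal cue: mitochondrial abundance
    "local_density": [0.12, 0.48, 0.33],   # contextual cue: crowding in the local environment
})

# Per-feature variability across cells is the starting point of the analysis.
print(cells.drop(columns="cell_id").describe())
```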
The researchers found that the variability in the activities of individual sensors across cells is tightly linked to variation in internal cues. For example, the abundance of mitochondria, the cells’ power stations, fundamentally affects how an external stimulus is perceived by a single cell. Furthermore, each sensor integrates different cues from inside the cell. When the researchers evaluated an important decision of a single cell – namely to proliferate or to stay quiescent upon a growth stimulus – they found that the cell’s choice was mediated by the perception of multiple sensors and was predictably modulated by cues of the cell’s internal state.
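The sketch below is a minimal, hypothetical illustration of this idea: a simple logistic-regression model (scikit-learn, fitted on synthetic data) in which an internal cue such as mitochondrial abundance modulates how a growth stimulus translates into a proliferate-or-quiescent decision. It is not the statistical model used in the study.

```python
# A minimal sketch, assuming a toy classifier stands in for the study's analysis:
# predict a cell's proliferate-vs-quiescent decision from an outside stimulus
# plus an internal-state cue. All data here are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

growth_signal = rng.uniform(0, 1, n)     # outside stimulus sensed by signaling nodes
mito_count = rng.normal(250, 50, n)      # internal cue: mitochondrial abundance

# Toy ground truth: the internal cue shifts how the stimulus maps onto the
# decision, so the same stimulus can yield different outcomes in different cells.
logit = 4 * growth_signal + 0.01 * (mito_count - 250) - 2
proliferate = rng.uniform(0, 1, n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([growth_signal, mito_count])
model = LogisticRegression().fit(X, proliferate)
print("coefficients (stimulus, mitochondria):", model.coef_[0])
```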
“For any specific decision of a cell, all outside signals and internal cues have to be viewed in concert. Single cells are thus able to make adequate context-dependent decisions – and are therefore clearly smarter than previously thought,” says PhD candidate Kramer.
Bernhard A. Kramer, Jacobo S. Del Castillo, Lucas Pelkmans. Multimodal perception links cellular state to decision making in single cells. Science. July 14, 2022. DOI: 10.1126/science.abf4062