Pub Date: 2009-10-01 | Epub Date: 2009-08-06 | DOI: 10.2976/1.3175813
Mikhail Prokopenko, Daniel Polani, Matthew Chadwick
We consider a simple information-theoretic model for evolutionary dynamics approaching the "coding threshold," where the capacity to symbolically represent nucleic acid sequences emerges in response to a change in environmental conditions. We study the conditions under which a coupling between the dynamics of a "proto-cell" and its proto-symbolic representation becomes beneficial in terms of preserving the proto-cell's information in a noisy environment. In particular, we are interested in understanding the behavior at the "error threshold" level, which, in our case, turns out to be a whole "error interval." The useful coupling is accompanied by self-organization of internal processing, i.e., an increase in complexity within the evolving system. We further study whether and how different proto-cells can stigmergically share such information via a joint encoding, even if they have slightly different individual dynamics. Implications for the emergence of the biological genetic code are discussed.
"Stigmergic gene transfer and emergence of universal coding." Hfsp Journal 3(5), 317-27.
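As a rough illustration of the kind of quantity such a model tracks, the sketch below computes the mutual information between a discrete "proto-cell" state and its noisy copy as a per-symbol error rate is varied. It is a generic symmetric-channel toy in Python, not the authors' model; the alphabet size and error values are assumptions.

```python
# Illustrative sketch (not the authors' model): mutual information between a
# discrete "proto-cell" state X and its noisy copy Y under a symmetric noise
# channel with per-symbol error probability eps. Names and channel are assumed.
import numpy as np

def mutual_information(joint):
    """I(X;Y) in bits from a joint probability table p(x, y)."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

def symmetric_channel_joint(n_symbols, eps):
    """Uniform source over n_symbols; each symbol is corrupted with probability
    eps, landing uniformly on one of the other symbols."""
    px = np.full(n_symbols, 1.0 / n_symbols)
    p_y_given_x = np.full((n_symbols, n_symbols), eps / (n_symbols - 1))
    np.fill_diagonal(p_y_given_x, 1.0 - eps)
    return px[:, None] * p_y_given_x

for eps in (0.0, 0.1, 0.3, 0.5, 0.75):
    joint = symmetric_channel_joint(4, eps)
    print(f"eps={eps:.2f}  I(X;Y)={mutual_information(joint):.3f} bits")
```

With four symbols, eps = 0.75 corresponds to complete randomization, so the retained information drops to zero at that end of the range.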
Pub Date: 2009-10-01 | Epub Date: 2009-10-26 | DOI: 10.2976/1.3240502
Joschka Boedecker, Oliver Obst, N Michael Mayer, Minoru Asada
Reservoir computing (RC) is a recent paradigm in the field of recurrent neural networks. Networks in RC have a sparsely and randomly connected fixed hidden layer, and only output connections are trained. RC networks have recently received increased attention as a mathematical model for generic neural microcircuits to investigate and explain computations in neocortical columns. Applied to specific tasks, their fixed random connectivity, however, leads to significant variation in performance. Few problem-specific optimization procedures are known; such procedures would be important for engineering applications, and also for understanding how networks in biology are shaped to be optimally adapted to the requirements of their environment. We study a general network initialization method using permutation matrices and derive a new unsupervised learning rule based on intrinsic plasticity (IP). The IP-based learning uses only local learning, and its aim is to improve network performance in a self-organized way. Using three different benchmarks, we show that networks with permutation matrices for the reservoir connectivity have much more persistent memory than the other methods but are also able to perform highly nonlinear mappings. We also show that IP based on sigmoid transfer functions is limited concerning the output distributions that can be achieved.
"Initialization and self-organized optimization of recurrent neural network connectivity." Hfsp Journal 3(5), 340-9.
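To make the initialization idea concrete, here is a minimal echo state network sketch in Python that compares a permutation-matrix reservoir with a sparse random one on a simple delayed-recall task. It is only a rough illustration of the approach described in the abstract: the reservoir size, spectral radius, input scaling, ridge parameter, and the task itself are assumptions, and the paper's own benchmarks and IP rule are not reproduced here.

```python
# Minimal echo state network sketch contrasting a permutation-matrix reservoir
# with a sparse random one on a delayed-recall task. Sizes, scalings, and the
# ridge parameter are assumptions made for this example.
import numpy as np

rng = np.random.default_rng(0)

def run_reservoir(W, w_in, u):
    """Drive the reservoir with scalar input sequence u and collect states."""
    x = np.zeros(W.shape[0])
    states = np.empty((len(u), W.shape[0]))
    for t, ut in enumerate(u):
        x = np.tanh(W @ x + w_in * ut)
        states[t] = x
    return states

def delayed_recall_error(W, delay, T=2000, washout=200, ridge=1e-6):
    u = rng.uniform(-1, 1, size=T)
    target = np.roll(u, delay)            # recall the input from `delay` steps ago
    X = run_reservoir(W, W_in, u)[washout:]
    y = target[washout:]
    # ridge-regression readout (the only trained part of an ESN)
    w = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)
    return float(np.sqrt(np.mean((X @ w - y) ** 2)))

N = 100
W_in = rng.uniform(-0.1, 0.1, size=N)

# permutation-matrix reservoir, scaled to spectral radius 0.95
P = np.eye(N)[rng.permutation(N)] * 0.95

# sparse random reservoir, rescaled to the same spectral radius
R = rng.normal(size=(N, N)) * (rng.random((N, N)) < 0.1)
R *= 0.95 / max(abs(np.linalg.eigvals(R)))

for delay in (5, 20, 40):
    print(f"delay={delay:2d}  permutation RMSE={delayed_recall_error(P, delay):.3f}"
          f"  random RMSE={delayed_recall_error(R, delay):.3f}")
```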
Pub Date: 2009-10-01 | Epub Date: 2009-10-22 | DOI: 10.2976/1.3218766
Elsebeth Kolmos, Monika Nowak, Maria Werner, Katrin Fischer, Guenter Schwarz, Sarah Mathews, Heiko Schoof, Ferenc Nagy, Janusz M Bujnicki, Seth J Davis
The circadian clock is a timekeeping mechanism that enables anticipation of daily environmental changes. In the plant Arabidopsis thaliana, the circadian system is a multiloop series of interlocked transcription-translation feedbacks. Several genes have been arranged in these oscillation loops, but the position of the core-clock gene ELF4 in this network was previously undetermined. ELF4 lacks sequence similarity to known domains, and functional homologs have not yet been identified. Here we show that ELF4 is functionally conserved within a subclade of related sequences, and forms an alpha-helical homodimer with a likely electrostatic interface that could be structurally modeled. We support this hypothesis by expression analysis of new elf4 hypomorphic alleles. These weak mutants were found to have expression level phenotypes of both morning and evening clock genes, implicating multiple entry points of ELF4 within the multiloop network. This could be mathematically modeled. Furthermore, morning-expression defects were particular to some elf4 alleles, suggesting predominant ELF4 action just preceding dawn. We provide a new hypothesis about ELF4 in the oscillator: it acts as a homodimer to integrate two arms of the circadian clock.
"Integrating ELF4 into the circadian system through combined structural and functional studies." Hfsp Journal 3(5), 350-66.
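The abstract refers to mathematical modeling of interlocked transcription-translation loops. As a generic illustration of that style of model (not the authors' ELF4 network), the sketch below integrates a single-loop Goodwin-type negative feedback oscillator and estimates its period; all parameter values are assumptions chosen so that the toy loop oscillates.

```python
# Generic single-loop transcription-translation negative feedback
# (Goodwin-type) oscillator. A toy illustration of clock-loop ODE models,
# not the ELF4 network; parameters are assumptions.
import numpy as np
from scipy.integrate import solve_ivp

def goodwin(t, y, n=10, k=1.0, d=0.2):
    m, p, r = y                       # mRNA, protein, nuclear repressor
    dm = k / (1.0 + r**n) - d * m     # transcription repressed by r
    dp = k * m - d * p                # translation
    dr = k * p - d * r                # repressor activation / nuclear import
    return [dm, dp, dr]

sol = solve_ivp(goodwin, (0, 400), [0.1, 0.1, 0.1], dense_output=True, max_step=0.5)
t = np.linspace(200, 400, 4000)       # discard the initial transient
m = sol.sol(t)[0]
# crude period estimate from successive maxima of the mRNA trace
peaks = np.where((m[1:-1] > m[:-2]) & (m[1:-1] > m[2:]))[0] + 1
print("approximate period:", round(float(np.diff(t[peaks]).mean()), 1))
```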
Pub Date: 2009-10-01 | Epub Date: 2009-09-08 | DOI: 10.2976/1.3171566
Daniel Polani
In biology, the exception is mostly the rule, and the rule is mostly the exception. However, recent results indicate that known universal concepts in biology such as the genetic code or the utilization of ATP as a source of energy may be complemented by a large class of principles based on Shannon's concept of information. The present position paper discusses various promising pathways toward the formulation of such generic informational principles and their relevance for the realm of biology.
"Information: currency of life?" Hfsp Journal 3(5), 307-16.
Pub Date: 2009-10-01 | Epub Date: 2009-10-07 | DOI: 10.1080/19552068.2009.9635816
Mikhail Prokopenko
Typically, self-organization is defined as the evolution of a system into an organized form in the absence of external pressures. A broad definition of self-organization is given by Haken (2006): “A system is self-organizing if it acquires a spatial, temporal, or functional structure without specific interference from the outside. By ‘specific’ we mean that the structure or functioning is not impressed on the system but that the system is acted upon from the outside in a nonspecific fashion. For instance, the fluid which forms hexagons is heated from below in an entirely uniform fashion and it acquires its specific structure by self-organization.” Another definition is offered by Camazine et al. (2001) in the context of pattern formation in biological systems: “Self-organization is a process in which pattern at the global level of a system emerges solely from numerous interactions among the lower-level components of the system. Moreover, the rules specifying interactions among the system’s components are executed using only local information, without reference to the global pattern.”

These definitions capture three important aspects of self-organization. First, it is assumed that the system has many interacting components and advances from a less organized state to a more organized state dynamically over some time, while exchanging energy, matter, and/or information with the environment. Second, this organization is manifested via global coordination, and the global behavior of the system is a result of the interactions among the agents. In other words, the global pattern is not imposed on the system by an external ordering influence (Bonabeau et al., 1997). Finally, the components, whose properties and behaviors are defined prior to the organization itself, have only local information and do not have knowledge of the global state of the system; therefore, the process of self-organization involves some local information transfer (Polani, 2003; Lizier et al., 2008).

Self-organization may seem to contradict the second law of thermodynamics, which captures the tendency of systems to disorder. The “paradox” was explained in terms of multiple coupled levels of dynamic activity within the Kugler–Turvey model (Kugler and Turvey, 1987): self-organization and loss of entropy occur at the macro-level, while the system dynamics on the micro-level (which serves as an entropy “sink”) generate increasing disorder. Kauffman (2000) suggested that the underlying principle of self-organization is the generation of constraints in the release of energy. According to this view, the constrained release allows such energy to be controlled and channeled to perform some useful work. This work in turn can be used to build better and more efficient constraints for the release of further energy, and so on. Adding and controlling constraints on self-organization opens a way to guide it in a specific way. In general, one may consider different ways to guide the process (dynamic
"Guided self-organization." Hfsp Journal 3(5), 287-9.
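As a toy illustration of global order arising purely from interactions among components, with no external ordering influence, the following Python sketch runs the standard mean-field Kuramoto model and tracks both the order parameter and the Shannon entropy of the phase histogram. The model, parameters, and entropy measure are generic choices, not taken from the editorial.

```python
# Standard mean-field Kuramoto model: oscillators adjust only to the phases of
# the other oscillators; no external field imposes the pattern. We track the
# global order parameter R and the Shannon entropy of the phase histogram,
# which falls as the population synchronizes. Parameters are assumptions.
import numpy as np

rng = np.random.default_rng(1)
N, K, dt, steps = 200, 2.0, 0.05, 2000
theta = rng.uniform(0, 2 * np.pi, N)
omega = rng.normal(0.0, 0.1, N)              # slightly heterogeneous frequencies

def phase_entropy(phases, bins=32):
    hist, _ = np.histogram(phases % (2 * np.pi), bins=bins, range=(0, 2 * np.pi))
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

for step in range(steps + 1):
    z = np.mean(np.exp(1j * theta))          # complex order parameter R e^{i psi}
    if step % 500 == 0:
        print(f"t={step * dt:6.1f}  R={abs(z):.2f}"
              f"  phase entropy={phase_entropy(theta):.2f} bits")
    theta += dt * (omega + K * abs(z) * np.sin(np.angle(z) - theta))
```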
Pub Date: 2009-10-01 | Epub Date: 2009-10-19 | DOI: 10.2976/1.3167215
Oliver Ratmann, Carsten Wiuf, John W Pinney
The evolutionary mechanisms by which protein interaction networks grow and change are beginning to be appreciated as a major factor shaping their present-day structures and properties. Starting with a consideration of the biases and errors inherent in our current views of these networks, we discuss the dangers of constructing evolutionary arguments from naïve analyses of network topology. We argue that progress in understanding the processes of network evolution is only possible when hypotheses are formulated as plausible evolutionary models and compared against the observed data within the framework of probabilistic modeling. The value of such models is expected to be greatly enhanced as they incorporate more of the details of the biophysical properties of interacting proteins, gene phylogeny, and measurement error and as more advanced methodologies emerge for model comparison and the inference of ancestral network states.
"From evidence to inference: probing the evolution of protein interaction networks." Hfsp Journal 3(5), 290-306.
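As an example of the kind of explicit evolutionary model the authors argue should be confronted with data, the sketch below grows a network by a simple duplication-divergence process, one of the commonly studied model classes for protein interaction network evolution. The specific variant and parameter values are illustrative assumptions, and no inference against data is attempted here.

```python
# Minimal duplication-divergence growth model: duplicate a random node and keep
# each inherited edge independently with probability p_keep. An illustrative
# variant with assumed parameters, not a model fitted to any dataset.
import random

def duplication_divergence(n_final, p_keep=0.4, seed=0):
    random.seed(seed)
    adj = {0: {1}, 1: {0}}                    # seed graph: a single edge
    while len(adj) < n_final:
        new = len(adj)
        parent = random.randrange(new)
        inherited = {v for v in adj[parent] if random.random() < p_keep}
        adj[new] = set(inherited)
        for v in inherited:
            adj[v].add(new)
    return adj

net = duplication_divergence(2000)
degrees = sorted((len(nbrs) for nbrs in net.values()), reverse=True)
print("nodes:", len(net), " edges:", sum(degrees) // 2)
print("top-10 degrees:", degrees[:10])        # typically a highly skewed degree sequence
```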
STDP (spike-timing-dependent synaptic plasticity) is thought to be a synaptic learning rule that embeds spike-timing information into a specific pattern of synaptic strengths in neuronal circuits, resulting in a memory. STDP consists of bidirectional long-term changes in synaptic strengths. This process includes long-term potentiation and long-term depression, which are dependent on the timing of presynaptic and postsynaptic spiking. In this review, we focus on computational aspects of signaling mechanisms that induce and maintain STDP as a key step toward the definition of a general synaptic learning rule. In addition, we discuss the temporal and spatial aspects of STDP, and the requirement of a homeostatic mechanism of STDP in vivo.
Hidetoshi Urakubo, Minoru Honda, Keiko Tanaka, Shinya Kuroda, "Experimental and computational aspects of signaling mechanisms of spike-timing-dependent plasticity." Hfsp Journal 3(4), 240-54. Pub Date: 2009-08-01 | DOI: 10.2976/1.3137602
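For reference, the phenomenological pair-based STDP rule that such signaling models aim to explain can be written in a few lines; the sketch below sums exponential potentiation and depression windows over spike pairs. The amplitudes and time constants are assumptions, and this is the textbook rule rather than any specific model from the review.

```python
# Pair-based STDP with exponential windows: post-after-pre pairings potentiate,
# pre-after-post pairings depress. Amplitudes and time constants are assumed.
import numpy as np

def stdp_weight_change(pre_spikes, post_spikes,
                       a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Sum pairwise contributions over all pre/post spike pairs (times in ms)."""
    dw = 0.0
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            dt = t_post - t_pre
            if dt > 0:
                dw += a_plus * np.exp(-dt / tau_plus)      # potentiation
            elif dt < 0:
                dw -= a_minus * np.exp(dt / tau_minus)     # depression
    return dw

pre = [10.0, 50.0, 90.0]
post = [15.0, 45.0, 95.0]            # mixed pre->post and post->pre pairings
print("net weight change:", stdp_weight_change(pre, post))
```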
Pub Date: 2009-08-01 | Epub Date: 2009-05-22 | DOI: 10.2976/1.3132954
Alexandre Specht, Frédéric Bolze, Ziad Omran, Jean-François Nicoud, Maurice Goeldner
Light-responsive biologically active compounds offer the possibility to study the dynamics of biological processes. Phototriggers and photoswitches have been designed, providing the capability to rapidly initiate a wide range of dynamic biological phenomena. We will discuss, in this article, recent developments in the field of light-triggered chemical tools, especially how two-photon excitation, "caged" fluorophores, and the photoregulation of protein activities in combination with time-resolved x-ray techniques should break new ground in the understanding of dynamic biological processes.
"Photochemical tools to study dynamic biological processes." Hfsp Journal 3(4), 255-64.
Pub Date: 2009-08-01 | Epub Date: 2009-08-06 | DOI: 10.2976/1.3175812
G Wayne Brodland, Justina Yang, Jen Sweny
Although previous studies suggested that the interfacial tension γ_cc acting along cell-cell boundaries and the effective viscosity μ of the cell cytoplasm could be measured by compressing a spherical aggregate of cells between parallel plates, the mechanical understanding necessary to extract this information from these tests (tests that have provided the surface tension σ_cm acting along cell-medium interfaces) has been lacking. These tensions can produce net forces at the subcellular level and give rise to cell motions and tissue reorganization, the rates of which are regulated by μ. Here, a three-dimensional (3D) cell-based finite element model provides insight into the mechanics of the compression test, where these same forces are at work, and leads to quantitative relationships from which the effective viscosity μ of the cell cytoplasm, the tension γ_cc that acts along internal cell-cell interfaces, and the surface tension σ_cp along the cell-platen boundaries can be determined from force-time curves and aggregate profiles. Tests on 5-day embryonic chick mesencephalon, neural retina, liver, and heart aggregates show that all of these properties vary significantly with cell type, except γ_cc, which is remarkably constant. These properties are crucial for understanding cell rearrangement and tissue self-organization in contexts that include embryogenesis, cancer metastases, and tissue engineering.
"Cellular interfacial and surface tensions determined from aggregate compression tests using a finite element model." Hfsp Journal 3(4), 273-81.
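The abstract describes determining parameters from force-time curves and aggregate profiles. Purely as an illustration of that fitting workflow (not the paper's finite element relations), the sketch below fits a simple exponential force-relaxation form to synthetic compression data with SciPy; the functional form, units, and parameter names are assumptions.

```python
# Generic illustration of extracting parameters from a force-time curve by
# least squares. The exponential relaxation form and the synthetic data are
# assumptions; the paper's own relations come from its 3D finite element model.
import numpy as np
from scipy.optimize import curve_fit

def force_relaxation(t, f_eq, f_0, tau):
    """Force relaxing from f_0 toward a tension-dominated plateau f_eq."""
    return f_eq + (f_0 - f_eq) * np.exp(-t / tau)

# synthetic "measured" force-time data (arbitrary force units, seconds) with noise
rng = np.random.default_rng(2)
t = np.linspace(0, 300, 120)
f_true = force_relaxation(t, f_eq=0.8, f_0=2.5, tau=60.0)
f_meas = f_true + rng.normal(0, 0.05, t.size)

popt, pcov = curve_fit(force_relaxation, t, f_meas, p0=(1.0, 2.0, 50.0))
perr = np.sqrt(np.diag(pcov))
for name, val, err in zip(("f_eq", "f_0", "tau"), popt, perr):
    print(f"{name} = {val:.2f} +/- {err:.2f}")
```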
Pub Date: 2009-08-01 | Epub Date: 2009-07-24 | DOI: 10.2976/1.3185785
N S Gov
Collective motion of cell cultures is a process of great interest, as it occurs during morphogenesis, wound healing, and tumor metastasis. During these processes cell cultures move due to the traction forces induced by the individual cells on the surrounding matrix. A recent study [Trepat et al. (2009), Nat. Phys. 5, 426-430] measured for the first time the traction forces driving collective cell migration and found that they arise throughout the cell culture. The leading 5-10 rows of cells do, however, play a major role in directing the motion of the rest of the culture by exerting a distinct outward traction. Fluctuations in the traction forces are an order of magnitude larger than the resultant directional traction at the culture edge and, furthermore, have an exponential distribution. Such exponential distributions are observed for the sizes of adhesion domains within cells, the traction forces produced by single cells, and even in nonbiological nonequilibrium systems, such as sheared granular materials. We discuss these observations and their implications for our understanding of cellular flows within a continuous culture.
"Traction forces during collective cell motion." Hfsp Journal 3(4), 223-7.
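To illustrate the kind of distributional check described above, the sketch below draws synthetic traction magnitudes and verifies that they are consistent with an exponential distribution, using the maximum-likelihood scale (the mean) and the slope of the log survival function. The data are synthetic and the analysis choices are assumptions, not the procedure used in the cited study.

```python
# Checking an exponential distribution of traction magnitudes on synthetic data.
# For an exponential distribution the MLE of the scale is the sample mean, and
# log P(T > x) = -x / scale is a straight line in x.
import numpy as np

rng = np.random.default_rng(3)
tractions = rng.exponential(scale=50.0, size=5000)   # synthetic magnitudes (Pa)

scale_hat = tractions.mean()                          # maximum-likelihood scale
x = np.sort(tractions)
log_survival = np.log(1.0 - (np.arange(1, x.size + 1) - 0.5) / x.size)
slope = np.polyfit(x, log_survival, 1)[0]             # should be close to -1/scale

print(f"MLE scale: {scale_hat:.1f} Pa   slope-based scale: {-1.0 / slope:.1f} Pa")
```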