Results 1 - 10 of 200
Modelling and analysis of gene regulatory networks,
- Nat Rev Mol Cell Biol
, 2008
Cited by 118 (2 self)
The genome encodes thousands of genes whose products enable cell survival and numerous cellular functions. The amounts and the temporal pattern in which these products appear in the cell are crucial to the processes of life. Gene regulatory networks govern the levels of these gene products. A gene regulatory network is the collection of molecular species and their interactions, which together control gene-product abundance. Numerous cellular processes are affected by regulatory networks. Innovations in experimental methods have enabled large-scale studies of gene regulatory networks and can reveal the mechanisms that underlie them. Consequently, biologists must come to grips with extremely complex networks and must analyse and integrate great quantities of experimental data. Essential to this challenge are computational tools, which can answer various questions: what is the full range of behaviours that this system exhibits under different conditions? What changes are expected in the dynamics of the system if certain parts stop functioning? How robust is the system under extreme conditions?
Various computational models have been developed for regulatory network analysis. These models can be roughly divided into three classes. The first class, logical models, describes regulatory networks qualitatively. They allow users to obtain a basic understanding of the different functionalities of a given network under different conditions. Their qualitative nature makes them flexible and easy to fit to biological phenomena, although they can only answer qualitative questions. To understand and manipulate behaviours that depend on finer timing and exact molecular concentrations, a second class of models was developed: continuous models. For example, to simulate the effects of dietary restriction on yeast cells under different nutrient concentrations [1], users must resort to the finer resolution of continuous models.
A third class of models was introduced following the observation that the functionality of regulatory networks is often affected by noise. As the majority of these models account for interactions between individual molecules, they are referred to here as single-molecule level models. Single-molecule level models explain the relationship between stochasticity and gene regulation. Predictive computational models of regulatory networks are expected to benefit several fields. In medicine, mechanisms of diseases that are characterized by dysfunction of regulatory processes can be elucidated. Biotechnological projects can benefit from predictive models that will replace some tedious and costly lab experiments. And computational analysis may contribute to basic biological research, for example, by explaining developmental mechanisms or new aspects of the evolutionary process.
Here we review the available methodologies for modelling and analysing regulatory networks. These methodologies have already proved to be a valuable research tool, both for the development of network models and for the analysis of their functionality. We discuss their relative advantages and limitations, and outline some open questions regarding regulatory networks, including how structure, dynamics and functionality relate to each other, how organisms use regulatory networks to adapt to their environments, and the interplay between regulatory networks and other cellular processes, such as metabolism.
Stochasticity: the property of a system whose behaviour depends on probabilities. In a model with stochasticity, a single initial state can evolve into several different trajectories, each with an associated probability.
Guy Karlebach and Ron Shamir. Abstract | Gene regulatory networks have an important role in every process of life, including cell differentiation, metabolism, the cell cycle and signal transduction. By understanding the dynamics of these networks we can shed light on the mechanisms of diseases that occur when these cellular processes are dysregulated. Accurate prediction of the behaviour of regulatory networks will also speed up biotechnological projects, as such predictions are quicker and cheaper than lab experiments. Computational methods, both for supporting the development of network models and for the analysis of their functionality, have already proved to be a valuable research tool.
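The logical-model class described in this abstract can be sketched as a Boolean network: each gene is ON or OFF and is updated from the states of its regulators. The three-gene circuit below is hypothetical, chosen only to illustrate the idea.

```python
# Minimal Boolean (logical) network model: each gene is ON/OFF and is
# updated synchronously from the states of its regulators.
# The three-gene circuit is hypothetical, for illustration only.

def step(state):
    """One synchronous update: A is constitutively expressed, B is
    activated by A, and C is expressed when B is ON and A is OFF."""
    a, b = state["A"], state["B"]
    return {
        "A": True,          # constitutive input
        "B": a,             # activated by A
        "C": b and not a,   # activated by B, repressed by A
    }

def trajectory(state, n_steps):
    """Iterate the update rule and return all visited states."""
    states = [state]
    for _ in range(n_steps):
        state = step(state)
        states.append(state)
    return states

traj = trajectory({"A": False, "B": False, "C": False}, 4)
# After a few steps the circuit settles into a fixed point (steady state).
```

Such a model can answer the qualitative questions the abstract mentions (e.g. which steady states exist, what happens if a gene is knocked out) but not questions about exact concentrations or timing.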
Computation with finite stochastic chemical reaction networks
- Natural Computing
, 2008
Cited by 53 (17 self)
Abstract. A highly desired part of the synthetic biology toolbox is an embedded chemical microcontroller, capable of autonomously following a logic program specified by a set of instructions, and interacting with its cellular environment. Strategies for incorporating logic in aqueous chemistry have focused primarily on implementing components, such as logic gates, that are composed into larger circuits, with each logic gate in the circuit corresponding to one or more molecular species. With this paradigm, designing and producing new molecular species is necessary to perform larger computations. An alternative approach begins by noticing that chemical systems on the small scale are fundamentally discrete and stochastic. In particular, the exact molecular count of each molecular species present is an intrinsically available form of information. This might appear to be a very weak form of information, perhaps quite difficult for computations to utilize. Indeed, it has been shown that error-free Turing universal computation is impossible in this setting. Nevertheless, we show a design of a chemical computer that achieves fast and reliable Turing-universal computation using molecular counts. Our scheme uses only a small number of different molecular species to do computation of arbitrary complexity. The total probability of error of the computation can be made arbitrarily small (but not zero) by adjusting the initial molecular counts of certain species. While physical implementations would be difficult, these results demonstrate that molecular counts can be a useful form of information for small molecular systems such as those operating within cellular environments. Key words: stochastic chemical kinetics; molecular counts; Turing-universal computation; probabilistic computation.
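The discrete, stochastic chemistry this abstract builds on is commonly simulated with Gillespie's stochastic simulation algorithm (SSA). The sketch below simulates a hypothetical birth-death network, not the paper's computation scheme, to show how molecular counts evolve stochastically.

```python
import math
import random

def ssa_birth_death(k_birth, k_death, x0, t_end, seed=0):
    """Gillespie SSA for the network  0 -> X (rate k_birth),
    X -> 0 (rate k_death * x).  State is the molecular count of X."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    while True:
        a1 = k_birth            # propensity of the birth reaction
        a2 = k_death * x        # propensity of the death reaction
        a0 = a1 + a2
        # waiting time to the next reaction is exponential with rate a0
        t += -math.log(1.0 - rng.random()) / a0
        if t > t_end:
            return x
        # choose which reaction fires, proportionally to the propensities
        if rng.random() * a0 < a1:
            x += 1
        else:
            x -= 1

# The stationary mean of this process is k_birth / k_death = 10.
samples = [ssa_birth_death(10.0, 1.0, 0, 50.0, seed=s) for s in range(200)]
mean = sum(samples) / len(samples)
```

Each run yields a different trajectory; the spread of `samples` around the mean is exactly the intrinsic noise that makes molecular counts a nontrivial information carrier.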
Rule-based modeling of biochemical systems with BioNetGen
- Methods in Molecular Biology: Systems Biology
, 2009
Cited by 43 (10 self)
Rule-based modeling involves the representation of molecules as structured objects and molecular interactions as rules for transforming the attributes of these objects. The approach is notable in that it allows one to systematically incorporate site-specific details about protein-protein interactions into a model for the dynamics of a signal-transduction system, but the method has other applications as well, such as following the fates of individual carbon atoms in metabolic reactions. The consequences of protein-protein interactions are difficult to specify and track with a conventional modeling approach because of the large number of protein phosphoforms and protein complexes that these interactions potentially generate. Here, we focus on how a rule-based model is specified in the BioNetGen language (BNGL) and how a model specification is analyzed using the BioNetGen software tool. We also discuss new developments in rule-based modeling that should enable the construction and analyses of comprehensive models for signal transduction pathways and similarly large-scale models for other biochemical systems.
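The central idea, that one rule applies to every molecule whose relevant sites match regardless of its other attributes, can be mimicked in plain Python. This is a toy re-implementation of the concept with hypothetical species and site names, not BNGL syntax.

```python
# Toy rule-based modeling: molecules are structured objects (dicts of
# site states) and a rule transforms any molecule matching a pattern,
# regardless of the state of its other sites.

def make_rule(pattern, update):
    """A rule: applies `update` to every molecule matching `pattern`."""
    def apply(molecules):
        out = []
        for m in molecules:
            if all(m.get(site) == val for site, val in pattern.items()):
                m = {**m, **update}
            out.append(m)
        return out
    return apply

# Hypothetical rule: phosphorylate site Y of any receptor whose ligand
# site is bound, independent of the state of its dimerization site.
phosphorylate = make_rule(
    pattern={"ligand": "bound", "Y": "u"},
    update={"Y": "p"},
)

pool = [
    {"ligand": "bound", "dimer": "yes", "Y": "u"},
    {"ligand": "bound", "dimer": "no", "Y": "u"},
    {"ligand": "free", "dimer": "no", "Y": "u"},
]
pool = phosphorylate(pool)
# Both ligand-bound receptors get phosphorylated; the free one does not.
```

One rule thus covers many phosphoforms at once; enumerating every combination of site states as a separate species, as a conventional reaction-network model would, is what the abstract identifies as intractable.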
Simulating the evolution of soot mixing state with a particle-resolved aerosol model,
- J. Geophys. Res.,
, 2009
Cited by 26 (7 self)
The mixing state of soot particles in the atmosphere is of crucial importance for assessing their climatic impact, since it governs their chemical reactivity, cloud condensation nuclei activity, and radiative properties. To improve the mixing state representation in models, we present a new approach, the stochastic particle-resolved model PartMC-MOSAIC, which explicitly resolves the composition of individual particles in a given population of different types of aerosol particles. This approach tracks the evolution of the mixing state of particles due to emission, dilution, condensation, and coagulation. To make this direct stochastic particle-based method practical, we implemented a new multiscale stochastic coagulation method. With this method we achieved high computational efficiency for situations when the coagulation kernel is highly nonuniform, as is the case for many realistic applications. PartMC-MOSAIC was applied to an idealized urban plume case representative of a large urban area to simulate the evolution of carbonaceous aerosols of different types due to coagulation and condensation. For this urban plume scenario we quantified the individual processes that contributed to the aging of the aerosol distribution, illustrating the capabilities of our modeling approach. The results showed for the first time the multidimensional structure of particle composition, which is usually lost in sectional or modal aerosol models. Citation: Riemer, N., M. West, R. A. Zaveri, and R. C. Easter (2009), Simulating the evolution of soot mixing state with a particle-resolved aerosol model, J. Geophys. Res.
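A direct stochastic, particle-resolved coagulation step can be sketched with rejection sampling over an explicit particle list. The kernel and population below are hypothetical; PartMC-MOSAIC's multiscale method is considerably more sophisticated.

```python
import random

def coagulate(masses, n_events, kernel, k_max, seed=0):
    """Direct stochastic coagulation on an explicit particle list:
    repeatedly pick a random pair and merge it with probability
    kernel(i, j) / k_max (rejection sampling).  Each particle keeps
    its full identity (here just its mass) until it coagulates."""
    rng = random.Random(seed)
    masses = list(masses)
    for _ in range(n_events):
        if len(masses) < 2:
            break
        i, j = rng.sample(range(len(masses)), 2)
        if rng.random() < kernel(masses[i], masses[j]) / k_max:
            merged = masses[i] + masses[j]   # coagulation conserves mass
            masses = [m for k, m in enumerate(masses) if k not in (i, j)]
            masses.append(merged)
    return masses

# Hypothetical sum kernel K(a, b) = a + b, bounded by k_max on this
# population of 100 unit-mass particles.
pop = [1.0] * 100
out = coagulate(pop, 500, kernel=lambda a, b: a + b, k_max=200.0, seed=1)
# Total mass is conserved while the number of particles can only drop.
```

The rejection step is why a highly nonuniform kernel hurts a naive implementation: if `k_max` greatly exceeds typical kernel values, almost every candidate event is rejected, which is the inefficiency the paper's multiscale method addresses.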
Modular Assembly of Cell Systems Biology Models Using P Systems
, 2009
Cited by 15 (10 self)
Noise Analysis in Ligand-Binding Reception for Molecular Communication in Nanonetworks
, 2010
Cited by 15 (3 self)
Abstract—Molecular communication (MC) will enable the exchange of information among nanoscale devices. In this novel bioinspired communication paradigm, molecules are employed to encode, transmit and receive information. In the most general case, these molecules are propagated in the medium by means of free diffusion. An information theoretical analysis of diffusion-based MC is required to better understand the potential of this novel communication mechanism. The study and the modeling of the noise sources is of utmost importance for this analysis. The objective of this paper is to provide a mathematical study of the noise at the reception of the molecular information in a diffusion-based MC system when ligand-binding reception is employed. The reference diffusion-based MC system for this analysis is the physical end-to-end model introduced in a previous work by the same authors, where the reception process is realized through ligand-binding chemical receptors. The reception noise is modeled in this paper by following two different approaches, namely, through the ligand-receptor kinetics and through the stochastic chemical kinetics. The ligand-receptor kinetics makes it possible to simulate the random perturbations in the chemical processes of the reception, while the stochastic chemical kinetics provides the tools to derive a closed-form solution to the modeling of the reception noise. The ligand-receptor kinetics model is expressed through a block scheme, while the stochastic chemical kinetics results in the characterization of the reception noise using stochastic differential equations. Numerical results are provided to demonstrate that the analytical formulation of the reception noise in terms of stochastic chemical kinetics is compliant with the reception noise behavior resulting from the ligand-receptor kinetics simulations. Index Terms—Chemical master equation, diffusion, ligand-receptor kinetics, molecular communication, nanonetworks, nanotechnology.
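The comparison at the heart of this abstract, stochastic simulation of ligand-receptor kinetics against a closed-form result from stochastic chemical kinetics, can be sketched for a pool of independent receptors. All parameters below are hypothetical.

```python
import math
import random

def bound_receptors(n_rec, k_on_c, k_off, t_end, seed=0):
    """Gillespie simulation of N independent ligand-binding receptors:
    unbound -> bound at per-receptor rate k_on_c (ligand concentration
    folded into the rate), bound -> unbound at rate k_off.
    Returns the number of bound receptors at t_end."""
    rng = random.Random(seed)
    t, b = 0.0, 0
    while True:
        a_bind = k_on_c * (n_rec - b)
        a_unbind = k_off * b
        a0 = a_bind + a_unbind
        t += -math.log(1.0 - rng.random()) / a0
        if t > t_end:
            return b
        if rng.random() * a0 < a_bind:
            b += 1
        else:
            b -= 1

# Chemical master equation prediction for this simple system: the
# stationary bound count is Binomial(N, p) with p = k_on_c/(k_on_c+k_off),
# so mean N*p and variance N*p*(1-p).
N = 100  # here p = 0.5: mean 50, variance 25
samples = [bound_receptors(N, 1.0, 1.0, 20.0, seed=s) for s in range(300)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Agreement between the simulated `mean`/`var` and the binomial prediction mirrors, in miniature, the paper's consistency check between simulation and closed-form noise models.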
Computing Reachable States for Nonlinear Biological Models
, 2010
Cited by 14 (5 self)
In this paper we describe reachability computation for continuous and hybrid systems and its potential contribution to the process of building and debugging biological models. We summarize the state-of-the-art for linear systems and then develop a novel algorithm for computing reachable states for nonlinear systems. We report experimental results obtained using a prototype implementation applied to several biological models. We believe these results constitute a promising contribution to the analysis of complex models of biological systems.
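The flavor of reachability computation can be conveyed with a crude interval-propagation sketch (not the paper's algorithm): an over-approximated set of states is pushed forward through the dynamics, here a hypothetical 1-D linear decay.

```python
# Crude interval-based over-approximation of the reachable set of
# dx/dt = f(x) on [0, t_end]: propagate an interval [lo, hi] with Euler
# steps, bounding f over the interval at each step.

def reach_interval(f_bounds, lo, hi, t_end, dt):
    """f_bounds(lo, hi) must return (min, max) of f over [lo, hi]."""
    t = 0.0
    while t < t_end:
        fmin, fmax = f_bounds(lo, hi)
        lo += dt * fmin   # worst-case downward drift
        hi += dt * fmax   # worst-case upward drift
        t += dt
    return lo, hi

# Hypothetical example: dx/dt = -x (stable linear decay).  For an
# interval [lo, hi], f(x) = -x ranges over [-hi, -lo].
def decay_bounds(lo, hi):
    return (-hi, -lo)

lo, hi = reach_interval(decay_bounds, 1.0, 2.0, t_end=1.0, dt=0.001)
# The result still contains the true reachable set e^-1 * [1, 2].
```

Note how conservative this is: the computed interval keeps growing even though the true dynamics contract (the wrapping effect), which is why practical reachability tools use much tighter set representations, especially for the nonlinear systems this paper targets.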
Estimating black carbon aging time-scales with a particle-resolved aerosol model
- Aerosol Science
, 2010
Cited by 13 (2 self)
Understanding the aging process of aerosol particles is important for assessing their chemical reactivity, cloud condensation nuclei activity, radiative properties and health impacts. In this study we investigate the aging of black carbon containing particles in an idealized urban plume using a new approach, the particle-resolved aerosol model PartMC-MOSAIC. We present a method to estimate aging time-scales using an aging criterion based on cloud condensation nuclei activation. The results show a separation into a daytime regime where condensation dominates and a nighttime regime where coagulation dominates. There is also a strong dependence on supersaturation threshold. For the chosen urban plume scenario and supersaturations ranging from 0.1% to 1%, the aging time-scales vary between 11 and 0.068 h during the day, and between 54 and 6.4 h during the night.
Crossing the Mesoscale No-Man’s Land via Parallel Kinetic Monte Carlo
, 2009
Cited by 11 (0 self)
The kinetic Monte Carlo method and its variants are powerful tools for modeling materials at the mesoscale, meaning at length and time scales in between the atomic and continuum. We have completed a 3-year LDRD project with the goal of developing a parallel kinetic Monte Carlo capability and applying it to materials modeling problems of interest to Sandia. In this report we give an overview of the methods and algorithms developed, and describe our new open-source code called SPPARKS, for Stochastic Parallel PARticle Kinetic Simulator. We also highlight the development of several Monte Carlo models in SPPARKS for specific materials modeling applications, including grain growth, bubble formation, diffusion in nanoporous materials, defect formation in erbium hydrides, and surface growth.
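The core loop of rejection-free kinetic Monte Carlo (the n-fold way), sketched generically below rather than as SPPARKS code, selects an event proportionally to its rate and advances time by an exponentially distributed increment. The event catalogue is hypothetical.

```python
import math
import random

def kmc(rates, n_steps, seed=0):
    """Rejection-free kinetic Monte Carlo: at each step pick an event
    with probability proportional to its rate, then advance time by an
    exponential waiting time with the total rate.  `rates` maps event
    labels to rates; they are fixed here, whereas a real lattice code
    would recompute the rates of affected events after each step."""
    rng = random.Random(seed)
    total = sum(rates.values())
    t = 0.0
    counts = {e: 0 for e in rates}
    for _ in range(n_steps):
        # select an event proportionally to its rate
        r = rng.random() * total
        acc = 0.0
        for event, rate in rates.items():
            acc += rate
            if r < acc:
                counts[event] += 1
                break
        # advance physical time by an exponential increment
        t += -math.log(1.0 - rng.random()) / total
    return t, counts

# Hypothetical event catalogue: surface hops 9x more frequent than
# desorption.  Expected elapsed time is n_steps / total_rate = 1000.
t, counts = kmc({"hop": 9.0, "desorb": 1.0}, n_steps=10000, seed=2)
```

Because every step fires an event (no rejections), this loop reaches long physical times efficiently; the parallelization challenge the report addresses is doing this across a spatially decomposed lattice without violating the event ordering.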
Learning networks of stochastic differential equations.
- In Advances in Neural Information Processing Systems (NIPS),
, 2010
Cited by 8 (2 self)
Abstract. We consider linear models for stochastic dynamics. To any such model can be associated a network (namely a directed graph) describing which degrees of freedom interact under the dynamics. We tackle the problem of learning such a network from observation of the system trajectory over a time interval T. We analyze the ℓ1-regularized least squares algorithm and, in the setting in which the underlying network is sparse, we prove performance guarantees that are uniform in the sampling rate as long as this is sufficiently high. This result substantiates the notion of a well-defined 'time complexity' for the network inference problem. Keywords: Gaussian processes, model selection and structure learning, graphical models, sparsity and feature selection.
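The ℓ1-regularized least squares estimator analyzed in this abstract can be sketched on synthetic data: regress one node's response on all candidate regressors and keep the coefficients that survive soft-thresholding as incoming edges. The ISTA solver and 3-node example below are generic illustrations, not the paper's estimator or its guarantees.

```python
import random

def ista_lasso(X, y, lam, n_iter=500, step=None):
    """l1-regularized least squares, min ||Xw - y||^2/(2n) + lam*||w||_1,
    solved by iterative soft-thresholding (ISTA).  Pure Python, dense."""
    n, p = len(X), len(X[0])
    if step is None:
        # crude step size from the largest column mean-square (an
        # assumption that works for well-conditioned designs like this)
        step = 1.0 / max(sum(X[i][j] ** 2 for i in range(n)) / n
                         for j in range(p))
    w = [0.0] * p
    for _ in range(n_iter):
        # gradient of the smooth part: X^T (Xw - y) / n
        resid = [sum(X[i][j] * w[j] for j in range(p)) - y[i]
                 for i in range(n)]
        grad = [sum(X[i][j] * resid[i] for i in range(n)) / n
                for j in range(p)]
        # gradient step followed by soft-thresholding
        for j in range(p):
            z = w[j] - step * grad[j]
            w[j] = max(abs(z) - step * lam, 0.0) * (1 if z > 0 else -1)
    return w

# Synthetic data standing in for discretized SDE increments: node 0 is
# driven only by node 2 (hypothetical 3-node network) plus small noise.
rng = random.Random(0)
X = [[rng.gauss(0, 1) for _ in range(3)] for _ in range(400)]
y = [0.8 * x[2] + rng.gauss(0, 0.05) for x in X]
w = ista_lasso(X, y, lam=0.1)
# The weight on the true parent (node 2) dominates; the others are
# driven to (near) zero by the l1 penalty.
```

Repeating this regression for every node recovers the directed graph; the paper's contribution is showing when this succeeds uniformly in the sampling rate, which the toy example does not attempt to demonstrate.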