PLOS Computational Biology: New Articles

A Peer-Reviewed Open-Access Journal

Updated: 2018-04-24T23:16:32Z


Switchable slow cellular conductances determine robustness and tunability of network states


by Guillaume Drion, Julie Dethier, Alessio Franci, Rodolphe Sepulchre

Neuronal information processing is regulated by fast and localized fluctuations of brain states. Brain states reliably switch between distinct spatiotemporal signatures at a network scale even though they are composed of heterogeneous and variable rhythms at a cellular scale. We investigated the mechanisms of this network control in a conductance-based population model that reliably switches between active and oscillatory mean-fields. Robust control of the mean-field properties relies critically on a switchable negative intrinsic conductance at the cellular level. This conductance endows circuits with a shared cellular positive feedback that can switch population rhythms on and off at a cellular resolution. The switch is largely independent of other intrinsic neuronal properties, network size and synaptic connectivity. It is therefore compatible with the temporal variability and spatial heterogeneity induced by slower regulatory functions such as neuromodulation, synaptic plasticity and homeostasis. Strikingly, the required cellular mechanism is available in all cell types that possess T-type calcium channels but unavailable in computational models that neglect the slow kinetics of their activation.

Identifying robust hysteresis in networks


by Tomáš Gedeon, Bree Cummins, Shaun Harker, Konstantin Mischaikow

We present a new modeling and computational tool that computes rigorous summaries of network dynamics over large sets of parameter values. These summaries, organized in a database, can be searched for observed dynamics, e.g., bistability and hysteresis, to discover parameter regimes over which they are supported. We illustrate our approach on several networks underlying the restriction point of the cell cycle in humans and yeast. We rank networks by how robustly they support hysteresis, which is the observed phenotype. We find that the best 6-node human network and the yeast network share similar topology and robustness of hysteresis, in spite of having no homology between the corresponding nodes of the network. Our approach provides a new tool linking network structure and dynamics.
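The idea of scoring a network by hysteresis can be made concrete on a toy one-gene positive-feedback circuit: sweep a production parameter up and then back down, letting the state relax at each value, and flag parameter values where the two sweeps settle on different stable states. This is only an illustrative pointwise sketch (the paper computes rigorous summaries over whole parameter regions, not point sweeps); the model dx/dt = b + a*x^2/(K^2 + x^2) - x and all constants are invented for the example.

```python
def relax(x, b, a=4.0, K=1.0, dt=0.01, steps=4000):
    # Euler relaxation of dx/dt = b + a*x^2/(K^2 + x^2) - x
    for _ in range(steps):
        x += dt * (b + a * x * x / (K * K + x * x) - x)
    return x

b_grid = [i * 0.01 for i in range(21)]   # basal production b from 0.0 to 0.2
x = 0.0
up = []
for b in b_grid:                          # sweep the parameter upward
    x = relax(x, b)
    up.append(x)
down = []
for b in reversed(b_grid):                # and back down, from the high state
    x = relax(x, b)
    down.append(x)
down.reverse()

# Hysteresis: parameter values where the two sweeps disagree
hysteretic = [b for b, u, d in zip(b_grid, up, down) if abs(u - d) > 1.0]
print(hysteretic)
```

The width of the `hysteretic` interval is one crude robustness score: a network that keeps two distinct stable branches over a wider parameter range supports hysteresis more robustly.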

Black-boxing and cause-effect power


by William Marshall, Larissa Albantakis, Giulio Tononi

Reductionism assumes that causation in the physical world occurs at the micro level, excluding the emergence of macro-level causation. We challenge this reductionist assumption by employing a principled, well-defined measure of intrinsic cause-effect power, integrated information (Φ), and showing that, according to this measure, it is possible for a macro level to “beat” the micro level. Simple systems were evaluated for Φ across different spatial and temporal scales by systematically considering all possible black boxes. These are macro elements that consist of one or more micro elements over one or more micro updates. Cause-effect power was evaluated based on the inputs and outputs of the black boxes, ignoring the internal micro elements that support their input-output function. We show how black-box elements can have more common inputs and outputs than the corresponding micro elements, revealing the emergence of high-order mechanisms and joint constraints that are not apparent at the micro level. As a consequence, a macro, black-box system can have higher Φ than its micro constituents by having more mechanisms (higher composition) that are more interconnected (higher integration). We also show that, for a given micro system, one can identify local maxima of Φ across several spatiotemporal scales. The framework is demonstrated on a simple biological system, the Boolean network model of the fission-yeast cell cycle, for which we identify stable local maxima during the course of its simulated biological function. These local maxima correspond to macro levels of organization at which emergent cause-effect properties of physical systems come into focus, and provide a natural vantage point for scientific inquiries.

Computational mechanisms underlying cortical responses to the affordance properties of visual scenes


by Michael F. Bonner, Russell A. Epstein

Biologically inspired deep convolutional neural networks (CNNs), trained for computer vision tasks, have been found to predict cortical responses with remarkable accuracy. However, the internal operations of these models remain poorly understood, and the factors that account for their success are unknown. Here we develop a set of techniques for using CNNs to gain insights into the computational mechanisms underlying cortical responses. We focused on responses in the occipital place area (OPA), a scene-selective region of dorsal occipitoparietal cortex. In a previous study, we showed that fMRI activation patterns in the OPA contain information about the navigational affordances of scenes; that is, information about where one can and cannot move within the immediate environment. We hypothesized that this affordance information could be extracted using a set of purely feedforward computations. To test this idea, we examined a deep CNN with a feedforward architecture that had been previously trained for scene classification. We found that responses in the CNN to scene images were highly predictive of fMRI responses in the OPA. Moreover, the CNN accounted for the portion of OPA variance relating to the navigational affordances of scenes. The CNN could thus serve as an image-computable candidate model of affordance-related responses in the OPA. We then ran a series of in silico experiments on this model to gain insights into its internal operations. These analyses showed that the computation of affordance-related features relied heavily on visual information at high spatial frequencies and cardinal orientations, both of which have previously been identified as low-level stimulus preferences of scene-selective visual cortex. These computations also exhibited a strong preference for information in the lower visual field, which is consistent with known retinotopic biases in the OPA. Visualizations of feature selectivity within the CNN suggested that affordance-based responses encoded features that define the layout of the spatial environment, such as boundary-defining junctions and large extended surfaces. Together, these results map the sensory functions of the OPA onto a fully quantitative model that provides insights into its visual computations. More broadly, they advance integrative techniques for understanding visual cortex across multiple levels of analysis: from the identification of cortical sensory functions to the modeling of their underlying algorithms.
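One common way such voxelwise predictions are made is regularized (ridge) linear regression from CNN unit activations to measured responses. The synthetic sketch below illustrates that mapping step only; the dimensions, noise level, and ridge penalty are invented and do not reflect the study's actual encoding-model pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: 200 scene images, 50 CNN features, 10 voxels.
n_img, n_feat, n_vox = 200, 50, 10
X = rng.standard_normal((n_img, n_feat))        # CNN unit activations per image
W_true = rng.standard_normal((n_feat, n_vox))   # unknown feature-to-voxel weights
Y = X @ W_true + 0.5 * rng.standard_normal((n_img, n_vox))  # noisy "fMRI" responses

# Ridge regression, closed form: W = (X'X + lam*I)^-1 X'Y
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_feat), X.T @ Y)

# Score each voxel on held-out images by squared correlation
X_test = rng.standard_normal((100, n_feat))
Y_test = X_test @ W_true + 0.5 * rng.standard_normal((100, n_vox))
pred = X_test @ W
r2 = [np.corrcoef(pred[:, v], Y_test[:, v])[0, 1] ** 2 for v in range(n_vox)]
print(min(r2))
```

On real data the held-out r² per voxel, compared across candidate feature spaces, is what supports claims like "the CNN accounted for the affordance-related portion of OPA variance."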

Correcting for batch effects in case-control microbiome studies


by Sean M. Gibbons, Claire Duvallet, Eric J. Alm

High-throughput data generation platforms, such as mass spectrometry, microarrays, and second-generation sequencing, are susceptible to batch effects due to run-to-run variation in reagents, equipment, protocols, or personnel. Currently, batch correction methods are not commonly applied to microbiome sequencing datasets. In this paper, we compare different batch-correction methods applied to microbiome case-control studies. We introduce a model-free normalization procedure where features (i.e., bacterial taxa) in case samples are converted to percentiles of the equivalent features in control samples within a study prior to pooling data across studies. We look at how this percentile-normalization method compares to traditional meta-analysis methods for combining independent p-values and to limma and ComBat, widely used batch-correction models developed for RNA microarray data. Overall, we show that percentile-normalization is a simple, non-parametric approach for correcting batch effects and improving sensitivity in case-control meta-analyses.
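The percentile idea admits a very small sketch: for one feature in one study, replace each case sample's value with its percentile within that study's control samples, so pooled studies share a common, unitless scale. The helper name and toy data below are invented, and this one-feature caricature omits the practical details of real taxon tables.

```python
import random
from bisect import bisect_left

def percentile_normalize(case_values, control_values):
    """Map each case-sample value of one feature (e.g. a bacterial taxon's
    relative abundance) to its percentile within the control distribution
    of the same study."""
    ctrl = sorted(control_values)
    n = len(ctrl)
    # percentile = fraction of control samples strictly below the case value
    return [100.0 * bisect_left(ctrl, v) / n for v in case_values]

# Toy example: controls centered near 0.2, cases shifted upward.
random.seed(1)
controls = [random.gauss(0.2, 0.05) for _ in range(200)]
cases = [random.gauss(0.3, 0.05) for _ in range(200)]
pct = percentile_normalize(cases, controls)
print(sum(p > 50 for p in pct) / len(pct))  # most cases lie above the control median
```

Because percentiles are computed within each study before pooling, study-specific (batch) shifts in the raw feature values cancel out, which is the point of the procedure.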

A machine learning based framework to identify and classify long terminal repeat retrotransposons


by Leander Schietgat, Celine Vens, Ricardo Cerri, Carlos N. Fischer, Eduardo Costa, Jan Ramon, Claudia M. A. Carareto, Hendrik Blockeel

Transposable elements (TEs) are repetitive nucleotide sequences that make up a large portion of eukaryotic genomes. They can move and duplicate within a genome, increasing genome size and contributing to genetic diversity within and across species. Accurate identification and classification of the TEs present in a genome is an important step towards understanding their effects on genes and their role in genome evolution. We introduce TE-Learner, a framework based on machine learning that automatically identifies TEs in a given genome and assigns a classification to them. We present an implementation of our framework for LTR retrotransposons, a particular type of TE characterized by long terminal repeats (LTRs) at their boundaries. We evaluate the predictive performance of our framework on the well-annotated genomes of Drosophila melanogaster and Arabidopsis thaliana, and we compare our results for three LTR retrotransposon superfamilies with those of three widely used methods for TE identification or classification: RepeatMasker, Censor and LtrDigest. In contrast to these methods, TE-Learner is the first to incorporate machine learning techniques; it outperforms them in terms of predictive performance while learning models and making predictions efficiently. Moreover, we show that our method was able to identify TEs that none of the above methods could find, and we investigated TE-Learner's predictions that did not correspond to an official annotation. It turns out that many of these predictions are in fact strongly homologous to a known TE.

DIVERSITY in binding, regulation, and evolution revealed from high-throughput ChIP


by Sneha Mitra, Anushua Biswas, Leelavati Narlikar

Genome-wide in vivo protein-DNA interactions are routinely mapped using high-throughput chromatin immunoprecipitation (ChIP). ChIP-reported regions are typically investigated for enriched sequence-motifs, which are likely to model the DNA-binding specificity of the profiled protein and/or of co-occurring proteins. However, simple enrichment analyses can miss insights into the binding-activity of the protein. Note that ChIP reports regions making direct contact with the protein as well as those binding through intermediaries. For example, consider a ChIP experiment targeting protein X, which binds DNA at its cognate sites, but simultaneously interacts with four other proteins. Each of these proteins also binds to its own specific cognate sites along distant parts of the genome, a scenario consistent with the current view of transcriptional hubs and chromatin loops. Since ChIP will pull down all X-associated regions, the final reported data will be a union of five distinct sets of regions, each containing binding sites of one of the five proteins, respectively. Characterizing all five different motifs and the corresponding sets is important to interpret the ChIP experiment and ultimately, the role of X in regulation. We present DIVERSITY, which attempts exactly this: it partitions the data so that each partition can be characterized with its own de novo motif. DIVERSITY uses a Bayesian approach to identify the optimal number of motifs and the associated partitions, which together explain the entire dataset. This is in contrast to standard motif finders, which report motifs individually enriched in the data, but do not necessarily explain all reported regions. We show that the different motifs and associated regions identified by DIVERSITY give insights into the various complexes that may be forming along the chromatin, something that has so far not been attempted from ChIP data. Webserver at; standalone (Mac OS X/Linux) from

Exploiting glycan topography for computational design of Env glycoprotein antigenicity


by Wen-Han Yu, Peng Zhao, Monia Draghi, Claudia Arevalo, Christina B. Karsten, Todd J. Suscovich, Bronwyn Gunn, Hendrik Streeck, Abraham L. Brass, Michael Tiemeyer, Michael Seaman, John R. Mascola, Lance Wells, Douglas A. Lauffenburger, Galit Alter

Mounting evidence suggests that glycans, rather than merely serving as a “shield”, contribute critically to antigenicity of the HIV envelope (Env) glycoprotein, representing critical antigenic determinants for many broadly neutralizing antibodies (bNAbs). While many studies have focused on defining the role of individual glycans or groups of proximal glycans in bNAb binding, little is known about the effects of changes in the overall glycan landscape in modulating antibody access and Env antigenicity. Here we developed a systems glycobiology approach to reverse engineer the complexity of HIV glycan heterogeneity to guide antigenicity-based de novo glycoprotein design. bNAb binding was assessed against a panel of 94 recombinant gp120 monomers exhibiting defined glycan site occupancies. Using a Bayesian machine learning algorithm, bNAb-specific glycan footprints were identified and used to design antigens that selectively alter bNAb antigenicity as a proof of concept. Our approach provides a new design strategy to predictively modulate antigenicity via the alteration of glycan topography, thereby focusing the humoral immune response on sites of viral vulnerability for HIV.

Divergent genome evolution caused by regional variation in DNA gain and loss between human and mouse


by Reuben M. Buckley, R. Daniel Kortschak, David L. Adelson

The forces driving the accumulation and removal of non-coding DNA, and ultimately the evolution of genome size in complex organisms, are intimately linked to genome structure and organisation. To further understand this connection, we used comparative genomics to identify genome-wide individual DNA gain and loss events in the human and mouse genomes; our analysis provides a novel method for capturing the regional variation of lineage-specific DNA gain and loss events in their respective genomic contexts. Focusing on the distribution of DNA gains and losses, relationships to important structural features and potential impact on biological processes, we found that in autosomes, DNA gains and losses both followed separate lineage-specific accumulation patterns. However, in both species chromosome X was particularly enriched for DNA gain, consistent with its high L1 retrotransposon content required for X inactivation. We found that DNA loss was associated with gene-rich open chromatin regions and DNA gain events with gene-poor closed chromatin regions. Additionally, we found that DNA loss events tended to be smaller than DNA gain events, suggesting that they were able to accumulate in gene-rich open chromatin regions due to their reduced capacity to interrupt gene regulatory architecture. GO term enrichment showed that mouse loss hotspots were strongly enriched for terms related to developmental processes. However, these genes were also located in regions with a high density of conserved elements, suggesting that despite high levels of DNA loss, gene regulatory architecture remained conserved. This is consistent with a model in which DNA gain and loss results in turnover or “churning” in regulatory-element-dense regions of open chromatin, where interruption of regulatory elements is selected against.

EmbryoMiner: A new framework for interactive knowledge discovery in large-scale cell tracking data of developing embryos


by Benjamin Schott, Manuel Traub, Cornelia Schlagenhauf, Masanari Takamiya, Thomas Antritter, Andreas Bartschat, Katharina Löffler, Denis Blessing, Jens C. Otte, Andrei Y. Kobitski, G. Ulrich Nienhaus, Uwe Strähle, Ralf Mikut, Johannes Stegmaier

State-of-the-art light-sheet and confocal microscopes allow recording of entire embryos in 3D and over time (3D+t) for many hours. Fluorescently labeled structures can be segmented and tracked automatically in these terabyte-scale 3D+t images, resulting in thousands of cell migration trajectories that provide detailed insights into large-scale tissue reorganization at the cellular level. Here we present EmbryoMiner, a new interactive open-source framework suitable for in-depth analyses and comparisons of entire embryos, including an extensive set of trajectory features. Starting at the whole-embryo level, the framework can be used to iteratively focus on a region of interest within the embryo, to investigate and test specific trajectory-based hypotheses and to extract quantitative features from the isolated trajectories. Thus, the framework provides a valuable new way to quantitatively compare corresponding anatomical regions in different embryos that were manually selected based on biological prior knowledge. As a proof of concept, we analyzed 3D+t light-sheet microscopy images of zebrafish embryos, showcasing potential user applications that can be performed using the framework.

Compositional clustering in task structure learning


by Nicholas T. Franklin, Michael J. Frank

Humans are remarkably adept at generalizing knowledge between experiences in a way that can be difficult for computers. Often, this entails generalizing constituent pieces of experiences that do not fully overlap with, but nonetheless share useful similarities to, previously acquired knowledge. However, it is often unclear how knowledge gained in one context should generalize to another. Previous computational models and data suggest that rather than learning about each individual context, humans build latent abstract structures and learn to link these structures to arbitrary contexts, facilitating generalization. In these models, task structures that are more popular across contexts are more likely to be revisited in new contexts. However, these models can only re-use policies as a whole and are unable to transfer knowledge about the transition structure of the environment even if only the goal has changed (or vice-versa). This contrasts with ecological settings, where some aspects of task structure, such as the transition function, will be shared between contexts separately from other aspects, such as the reward function. Here, we develop a novel non-parametric Bayesian agent that forms independent latent clusters for transition and reward functions, affording separable transfer of their constituent parts across contexts. We show that the relative performance of this agent compared to an agent that jointly clusters reward and transition functions depends on environmental task statistics: the mutual information between transition and reward functions and the stochasticity of the observations. We formalize our analysis through an information theoretic account of the priors, and propose a meta-learning agent that dynamically arbitrates between strategies across task domains to optimize a statistical tradeoff.
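The environmental statistic that decides which clustering strategy wins can be made concrete: treat each context as carrying a (transition-cluster, reward-cluster) label pair and measure the mutual information between the two labelings. A stdlib-only sketch follows; the label pairs are invented, and the paper's agents infer this structure online rather than reading it off given labels.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Empirical mutual information (in bits) between two discrete labelings,
    given as (transition_cluster, reward_cluster) pairs, one per context."""
    n = len(pairs)
    joint = Counter(pairs)
    t_counts = Counter(t for t, _ in pairs)
    r_counts = Counter(r for _, r in pairs)
    mi = 0.0
    for (t, r), c in joint.items():
        # p(t,r) * log2( p(t,r) / (p(t) p(r)) ), written with counts
        mi += (c / n) * math.log2(c * n / (t_counts[t] * r_counts[r]))
    return mi

# Hypothetical contexts: transitions and rewards perfectly coupled ...
coupled = [(0, 0), (0, 0), (1, 1), (1, 1), (2, 2), (2, 2)]
# ... versus assigned independently of one another
independent = [(0, 0), (0, 1), (1, 0), (1, 1), (2, 0), (2, 1)]
print(mutual_information(coupled), mutual_information(independent))
```

When this mutual information is high, joint clustering wastes nothing; when it is near zero, independent clustering of transitions and rewards allows each part to transfer on its own.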

Decision making improves sperm chemotaxis in the presence of noise


by Justus A. Kromer, Steffen Märcker, Steffen Lange, Christel Baier, Benjamin M. Friedrich

To navigate their surroundings, cells rely on sensory input that is corrupted by noise. In cells performing chemotaxis, such noise arises from the stochastic binding of signalling molecules at low chemoattractant concentrations. We reveal a fundamental relationship between the speed of chemotactic steering and the strength of directional fluctuations that result from the amplification of noise in a chemical input signal. This relation implies a trade-off between steering that is slow and reliable, and steering that is fast but less reliable. We show that dynamic switching between these two modes of steering can substantially increase the probability of finding a target, such as an egg sought by sperm cells. This decision making confers no advantage in the absence of noise, but is beneficial when chemical signals are detectable, yet characterized by low signal-to-noise ratios. The latter applies at intermediate distances from a target, where signalling molecules are diluted, thus defining a ‘noise zone’ that cells have to cross. Our results explain decision making observed in recent experiments on sea urchin sperm chemotaxis. More generally, our theory demonstrates how decision making enables chemotactic agents to cope with high levels of noise in gradient sensing by dynamically adjusting the persistence length of a biased random walk.
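The speed/reliability trade-off can be caricatured with a one-angle steering model: a gradient-climbing torque with gain g also amplifies sensory noise by g, so a high-gain (fast) steerer aligns with the gradient quickly but keeps fluctuating, while a low-gain (slow) steerer aligns later but more tightly. All constants below are invented and this is not the authors' model of sperm chemotaxis, only a sketch of the trade-off they describe.

```python
import math
import random

def steer_to_target(gain, noise, steps=500, dt=0.05, seed=0):
    """Steer a heading toward direction 0 from an initial 90-degree error.
    Returns (time of first alignment within 0.2 rad, residual angular
    spread in the second half of the run)."""
    rng = random.Random(seed)
    theta = math.pi / 2          # heading error relative to the gradient
    aligned_at, tail = None, []
    for i in range(steps):
        # restoring torque toward the gradient, plus noise scaled by the same gain
        theta += dt * (-gain * math.sin(theta)) \
                 + math.sqrt(dt) * gain * noise * rng.gauss(0.0, 1.0)
        if aligned_at is None and abs(theta) < 0.2:
            aligned_at = i * dt
        if i >= steps // 2:
            tail.append(theta)
    spread = (sum(t * t for t in tail) / len(tail)) ** 0.5
    return aligned_at, spread

fast = steer_to_target(gain=4.0, noise=0.3)   # fast but jittery
slow = steer_to_target(gain=0.5, noise=0.3)   # slow but reliable
print(fast, slow)
```

Switching between the two gains depending on signal quality, the decision making studied in the paper, lets an agent cross the noise zone quickly and then hold course reliably.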

Optimal dynamic control approach in a multi-objective therapeutic scenario: Application to drug delivery in the treatment of prostate cancer


by Itziar Irurzun-Arana, Alvaro Janda, Sergio Ardanza-Trevijano, Iñaki F. Trocóniz

Numerous problems encountered in computational biology can be formulated as optimization problems. In this context, optimization of drug release characteristics or dosing schedules for anticancer agents has become a prominent area not only for the development of new drugs, but also for established drugs. However, in complex systems, optimization of drug exposure is not a trivial task and cannot be efficiently addressed through trial-and-error simulation exercises. Finding a solution to such problems is a challenging task that requires more advanced strategies, such as optimal control theory. In this work, we perform an optimal control analysis on a previously developed computational model for the testosterone effects of Triptorelin in prostate cancer patients, with the goal of finding optimal drug-release characteristics. We demonstrate how numerical control optimization of non-linear models can be used to find better therapeutic approaches and improve patient outcomes.

Bayesian reconstruction of transmission within outbreaks using genomic variants


by Nicola De Maio, Colin J. Worby, Daniel J. Wilson, Nicole Stoesser

Pathogen genome sequencing can reveal details of transmission histories and is a powerful tool in the fight against infectious disease. In particular, within-host pathogen genomic variants identified through heterozygous nucleotide base calls are a potential source of information to identify linked cases and infer direction and time of transmission. However, using such data effectively to model disease transmission presents a number of challenges, including differentiating genuine variants from those observed due to sequencing error, as well as the specification of a realistic model for within-host pathogen population dynamics. Here we propose a new Bayesian approach to transmission inference, BadTrIP (BAyesian epiDemiological TRansmission Inference from Polymorphisms), that explicitly models evolution of pathogen populations in an outbreak, transmission (including transmission bottlenecks), and sequencing error. BadTrIP enables the inference of host-to-host transmission from pathogen sequencing data and epidemiological data. By assuming that genomic variants are unlinked, our method does not require the computationally intensive and unreliable reconstruction of individual haplotypes. Using simulations we show that BadTrIP is robust in most scenarios and can accurately infer transmission events by efficiently combining information from genetic and epidemiological sources; thanks to its realistic model of pathogen evolution and the inclusion of epidemiological data, BadTrIP is also more accurate than existing approaches. BadTrIP is distributed as an open-source package for the phylogenetic software BEAST2. We apply our method to reconstruct transmission history at the early stages of the 2014 Ebola outbreak, showcasing the power of within-host genomic variants to reconstruct transmission events.

Biogeography and environmental conditions shape bacteriophage-bacteria networks across the human microbiome


by Geoffrey D. Hannigan, Melissa B. Duhaime, Danai Koutra, Patrick D. Schloss

Viruses and bacteria are critical components of the human microbiome and play important roles in health and disease. Most previous work has relied on studying bacteria and viruses independently, thereby reducing them to two separate communities. Such approaches are unable to capture how these microbial communities interact, such as through processes that maintain community robustness or allow phage-host populations to co-evolve. We implemented a network-based analytical approach to describe phage-bacteria network diversity throughout the human body. We built these community networks using a machine learning algorithm to predict which phages could infect which bacteria in a given microbiome. Our algorithm was applied to paired viral and bacterial metagenomic sequence sets from three previously published human cohorts. We organized the predicted interactions into networks that allowed us to evaluate phage-bacteria connectedness across the human body. We observed evidence that gut and skin network structures were person-specific and not conserved among cohabitating family members. High-fat diets appeared to be associated with less connected networks. Network structure differed between skin sites, with those exposed to the external environment being less connected and likely more susceptible to network degradation by microbial extinction events. This study quantified and contrasted the diversity of virome-microbiome networks across the human body and illustrated how environmental factors may influence phage-bacteria interactive dynamics. This work provides a baseline for future studies to better understand system perturbations, such as disease states, through ecological networks.
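Connectedness of such bipartite phage-bacteria networks can be summarized by simple metrics like connectance: the fraction of possible phage-bacterium pairs that are actually realized as predicted infections. A toy sketch follows; the interaction lists are invented, and the study's networks come from machine-learning-predicted infections over metagenomes plus richer graph statistics than this single number.

```python
def connectance(interactions, phages, bacteria):
    """Fraction of realized phage-bacterium links out of all possible pairs;
    one simple measure of how connected a bipartite network is."""
    possible = len(phages) * len(bacteria)
    return len(set(interactions)) / possible if possible else 0.0

# Hypothetical predicted infections from two body sites
gut = [("p1", "b1"), ("p1", "b2"), ("p2", "b2"), ("p3", "b3")]
skin = [("p1", "b1")]
print(connectance(gut, {"p1", "p2", "p3"}, {"b1", "b2", "b3"}),
      connectance(skin, {"p1", "p2"}, {"b1", "b3"}))
```

A sparsely connected network (low connectance) has fewer redundant phage-host links, which is one way to reason about why exposed skin sites could be more vulnerable to degradation when members go extinct.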

Propagating annotations of molecular networks using in silico fragmentation


by Ricardo R. da Silva, Mingxun Wang, Louis-Félix Nothias, Justin J. J. van der Hooft, Andrés Mauricio Caraballo-Rodríguez, Evan Fox, Marcy J. Balunas, Jonathan L. Klassen, Norberto Peporine Lopes, Pieter C. Dorrestein

The annotation of small molecules is one of the most challenging and important steps in untargeted mass spectrometry analysis, as most of our biological interpretations rely on structural annotations. Molecular networking has emerged as a structured way to organize and mine data from untargeted tandem mass spectrometry (MS/MS) experiments and has been widely applied to propagate annotations. However, propagation is done through manual inspection of MS/MS spectra connected in the spectral networks and is only possible when a reference library spectrum is available. One alternative approach to annotating an unknown fragmentation mass spectrum is the use of in silico predictions. A challenge of in silico annotation is the uncertainty about which structure in the predicted candidate list is correct. Here we show how molecular networking can be used to improve the accuracy of in silico predictions through propagation of structural annotations, even when there is no match to an MS/MS spectrum in spectral libraries. This is accomplished by creating a network consensus of re-ranked structural candidates, using the molecular network topology and structural similarity to improve in silico annotations. The Network Annotation Propagation (NAP) tool is accessible through the GNPS web-platform

A computational model of shared fine-scale structure in the human connectome


by J. Swaroop Guntupalli, Ma Feilong, James V. Haxby

Variation in cortical connectivity profiles is typically modeled as having a coarse spatial scale parcellated into interconnected brain areas. We created a high-dimensional common model of the human connectome to search for fine-scale structure that is shared across brains. Projecting individual connectivity data into this new common model connectome accounts for substantially more variance in the human connectome than do previous models. This newly discovered shared structure is closely related to fine-scale distinctions in representations of information. These results reveal a shared fine-scale structure that is a major component of the human connectome that coexists with coarse-scale, areal structure. This shared fine-scale structure was not captured in previous models and was, therefore, inaccessible to analysis and study.

On the role of extrinsic noise in microRNA-mediated bimodal gene expression


by Marco Del Giudice, Stefano Bo, Silvia Grigolon, Carla Bosia

Several studies have highlighted the relevance of extrinsic noise in shaping cell decision making and differentiation in molecular networks. Bimodal distributions of gene expression levels provide experimental evidence of phenotypic differentiation, where the modes of the distribution often correspond to different physiological states of the system. We theoretically address the presence of bimodal phenotypes in the context of microRNA (miRNA)-mediated regulation. MiRNAs are small noncoding RNA molecules that downregulate the expression of their target mRNAs. The nature of this interaction is titrative and induces a threshold effect: below a given target transcription rate, almost no mRNAs are free and available for translation. We investigate the effect of extrinsic noise on the system by introducing a fluctuating miRNA transcription rate. We find that the presence of extrinsic noise favours bimodal target distributions, which can be observed over a wider range of parameters than in the case with intrinsic noise only, and at lower miRNA-target interaction strengths. Our results suggest that combining threshold-inducing interactions with extrinsic noise provides a simple and robust mechanism for obtaining bimodal populations without requiring fine tuning. Furthermore, we characterise the protein distribution’s dependence on protein half-life.
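The threshold-plus-extrinsic-noise mechanism can be caricatured in a few lines: if free target is near zero whenever miRNA transcription exceeds target transcription and grows roughly linearly above that point, then cell-to-cell fluctuations in the miRNA transcription rate that straddle the threshold split the population into repressed and expressing modes. All rates below are invented and the sketch ignores intrinsic noise entirely.

```python
import random

def free_target(k_target, k_mirna, delta=1.0):
    """Deterministic titration caricature: free target mRNA is ~zero below
    the threshold k_target = k_mirna and grows linearly above it
    (delta is the mRNA degradation rate)."""
    return max(0.0, (k_target - k_mirna) / delta)

# Extrinsic noise: the miRNA transcription rate fluctuates between cells,
# straddling the titration threshold at k_target = 10.
random.seed(2)
k_target = 10.0
levels = [free_target(k_target, random.gauss(10.0, 4.0)) for _ in range(5000)]
off = sum(1 for x in levels if x < 0.5) / len(levels)   # repressed mode
on = sum(1 for x in levels if x > 2.0) / len(levels)    # expressing mode
print(off, on)
```

Both modes end up well populated, so the population-level distribution of free target is bimodal even though each single cell's response is a deterministic threshold function, which is the qualitative point of the paper's analysis.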

A modelling approach for exploring muscle dynamics during cyclic contractions


by Stephanie A. Ross, Nilima Nigam, James M. Wakeling

Hill-type muscle models are widely used within the field of biomechanics to predict and understand muscle behaviour, and are often essential where muscle forces cannot be directly measured. However, these models have limited accuracy, particularly during cyclic contractions at the submaximal levels of activation that typically occur during locomotion. To address this issue, recent studies have incorporated effects into Hill-type models that are oftentimes neglected, such as size-dependent, history-dependent, and activation-dependent effects. However, the contribution of these effects to muscle performance has yet to be evaluated under common contractile conditions that reflect the range of activations, strains, and strain rates that occur in vivo. The purpose of this study was to develop a modelling framework to evaluate modifications to Hill-type muscle models when they contract in cyclic loops that are typical of locomotor muscle function. Here we present a modelling framework composed of a damped harmonic oscillator in series with a Hill-type muscle actuator that consists of a contractile element and a parallel elastic element. The intrinsic force-length and force-velocity properties are described using Bezier curves, and we present a scheme relating physiological parameters to the control points of these curves. The muscle-oscillator system can be geometrically scaled while preserving dynamic and kinematic similarity to investigate muscle size effects while controlling for the dynamics of the harmonic oscillator. The model is driven by time-varying muscle activations that cause the muscle to cyclically contract and drive the dynamics of the harmonic oscillator. Thus, this framework provides a platform to test current and future Hill-type model formulations and explore factors affecting muscle performance in muscles of different sizes under a range of cyclic contractile conditions.
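The core of any Hill-type actuator is multiplicative: force = activation × force-length × force-velocity for the contractile element, plus a parallel elastic contribution. The sketch below uses a Gaussian force-length curve and a simple hyperbolic force-velocity curve as stand-ins for the paper's Bezier-curve parameterization; all constants are invented for illustration.

```python
import math

def hill_force(act, l_norm, v_norm, f_max=1.0, k_pe=4.0, slack=1.0):
    """Instantaneous force of a minimal Hill-type actuator.
    act: activation in [0, 1]; l_norm: length / optimal length;
    v_norm: velocity / max shortening velocity (negative = shortening)."""
    # Force-length: Gaussian with optimum at l_norm = 1
    f_l = math.exp(-((l_norm - 1.0) ** 2) / 0.2)
    # Force-velocity: Hill hyperbola for shortening, gentle rise for lengthening
    if v_norm <= 0.0:
        f_v = (1.0 + v_norm) / (1.0 - 4.0 * v_norm) if v_norm > -1.0 else 0.0
    else:
        f_v = 1.0 + 0.5 * v_norm / (1.0 + v_norm)
    # Parallel elastic element engages beyond slack length
    f_pe = k_pe * max(0.0, l_norm - slack) ** 2
    return f_max * (act * f_l * f_v + f_pe)

# Isometric force at optimal length scales with activation
print(hill_force(1.0, 1.0, 0.0), hill_force(0.3, 1.0, 0.0))
```

In the paper's framework this force would be fed into the equation of motion of the series damped harmonic oscillator, and the curves would be replaced by the Bezier descriptions fitted to physiological parameters.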

(XML) Allostery in the dengue virus NS3 helicase: Insights into the NTPase cycle from molecular simulations


by Russell B. Davidson, Josie Hendrix, Brian J. Geiss, Martin McCullagh

The C-terminal domain of the non-structural protein 3 (NS3) of the Flaviviridae viruses (e.g. HCV, dengue, West Nile, Zika) is a nucleotide triphosphatase (NTPase)-dependent superfamily 2 (SF2) helicase that unwinds double-stranded RNA while translocating along the nucleic polymer. These functions make NS3 an important target for antiviral development, yet the biophysics of this enzyme is poorly understood. Microsecond-long molecular dynamics simulations of the dengue NS3 helicase domain are reported, from which allosteric effects of RNA and NTPase substrates are observed. The presence of a bound single-stranded RNA catalytically enhances the phosphate hydrolysis reaction by affecting the dynamics and positioning of waters within the hydrolysis active site. Coupled with results from the simulations, electronic structure calculations of the reaction are used to quantify this enhancement as approximately 150-fold, in qualitative agreement with the experimentally measured factor of 10–100. Additionally, protein-RNA interactions exhibit NTPase substrate-induced allostery: the presence of a nucleotide (e.g. ATP or ADP) structurally perturbs residues in direct contact with the phosphodiester backbone of the RNA. Residue-residue network analyses highlight pathways of short-ranged interactions that connect the two active sites. These analyses identify motif V as a highly connected region of protein structure through which energy released from either active site is hypothesized to move, thereby inducing the observed allosteric effects. These results lay the foundation for the design of novel allosteric inhibitors of NS3.
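The residue-residue network analyses mentioned above treat the protein as a graph whose nodes are residues and whose edges are short-ranged contacts, then look for connecting pathways between the two active sites. A minimal sketch under assumed data: the toy network below uses hypothetical placeholder node names (not actual NS3 residues) and finds a shortest pathway with breadth-first search.

```python
from collections import deque

def shortest_path(graph, src, dst):
    """Breadth-first search for a shortest path in an unweighted
    residue-residue interaction network (nodes = residues, edges = contacts)."""
    prev = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:                     # reconstruct path by walking predecessors
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nbr in graph.get(node, ()):
            if nbr not in prev:
                prev[nbr] = node
                queue.append(nbr)
    return None  # no connecting pathway

# Hypothetical toy network linking the NTPase site to the RNA-binding site via "motif V"
toy_network = {
    "NTPase_site": ["motifV_res1"],
    "motifV_res1": ["NTPase_site", "motifV_res2"],
    "motifV_res2": ["motifV_res1", "RNA_site"],
    "RNA_site": ["motifV_res2"],
}
```

In a real analysis, a highly connected region such as motif V would appear on many such shortest paths between the two sites.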

(XML) Backbone Brackets and Arginine Tweezers delineate Class I and Class II aminoacyl tRNA synthetases


by Florian Kaiser, Sebastian Bittrich, Sebastian Salentin, Christoph Leberecht, V. Joachim Haupt, Sarah Krautwurst, Michael Schroeder, Dirk Labudde

The origin of the machinery that realizes protein biosynthesis in all organisms is still unclear. One key component of this machinery is the family of aminoacyl tRNA synthetases (aaRS), which ligate tRNAs to amino acids while consuming ATP. Sequence analyses revealed that these enzymes can be divided into two complementary classes. The two classes differ significantly at the sequence and structural level, feature different reaction mechanisms, and occur in diverse oligomerization states. The one unifying aspect of both classes is their function of binding ATP. We identified Backbone Brackets and Arginine Tweezers as the most compact ATP binding motifs characteristic of each class. Geometric analysis shows a structural rearrangement of the Backbone Brackets upon ATP binding, indicating a general mechanism shared by all Class I structures. Regarding the origin of aaRS, the Rodin-Ohno hypothesis states that the peculiar nature of the two aaRS classes is the result of their primordial forms, called Protozymes, being encoded on opposite strands of the same gene. Backbone Brackets and Arginine Tweezers were traced back to the proposed Protozymes and their more efficient successors, the Urzymes. Both structural motifs can be observed as pairs of residues in contemporary structures, and the time of their addition, indicated by their placement in the ancient aaRS, appears to coincide with the evolutionary trace of Proto- and Urzymes.

(XML) Using pseudoalignment and base quality to accurately quantify microbial community composition


by Mark Reppell, John Novembre

Pooled DNA from multiple unknown organisms arises in a variety of contexts, for example in microbial samples from ecological or human health research. Determining the composition of pooled samples can be difficult, especially at the scale of modern sequencing data and reference databases. Here we propose a novel method for taxonomic profiling of pooled DNA that combines the speed and low memory requirements of k-mer-based pseudoalignment with a likelihood framework that uses base quality information to better resolve multiply mapped reads. We apply the method to the problem of classifying 16S rRNA reads against a reference database of known organisms, a common challenge in microbiome research. Using simulations, we show the method is accurate across a variety of read lengths, with reference sequences of different lengths, at different sample depths, and when samples contain reads originating from organisms absent from the reference. We also assess performance on real 16S data, where we reanalyze previous genetic association data and show that our method discovers more quantitative trait associations than other widely used methods. We implement our method in the software Karp, for k-mer-based analysis of read pools, providing a combination of speed and accuracy uniquely suited to enhancing discoveries in microbial studies.
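The base-quality likelihood idea can be illustrated directly: a Phred score Q maps to an error probability e = 10^(-Q/10), and each aligned base contributes (1 - e) to the read's likelihood when it matches the candidate reference and e/3 when it mismatches (three possible wrong bases). This is a generic sketch of quality-aware read likelihoods, not Karp's actual implementation.

```python
def error_prob(q):
    """Phred quality score Q -> probability that the base call is wrong."""
    return 10.0 ** (-q / 10.0)

def read_likelihood(read, quals, ref):
    """Likelihood of a read given one candidate reference sequence:
    matching bases contribute (1 - e), mismatches e/3."""
    lik = 1.0
    for base, q, ref_base in zip(read, quals, ref):
        e = error_prob(q)
        lik *= (1.0 - e) if base == ref_base else e / 3.0
    return lik
```

Comparing these likelihoods across candidate references lets a multiply mapped read be assigned probabilistically, with low-quality mismatches penalized less than high-quality ones.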

(XML) Cell adhesion and fluid flow jointly initiate genotype spatial distribution in biofilms


by Ricardo Martínez-García, Carey D. Nadell, Raimo Hartmann, Knut Drescher, Juan A. Bonachela

Biofilms are microbial collectives that occupy a diverse array of surfaces. It is well known that the function and evolution of biofilms are strongly influenced by the spatial arrangement of different strains and species within them, but how spatiotemporal distributions of different genotypes in biofilm populations originate is still underexplored. Here, we study the origins of biofilm genetic structure by combining model development, numerical simulations, and microfluidic experiments using the human pathogen Vibrio cholerae. Using spatial correlation functions to quantify the differences between emergent cell lineage segregation patterns, we find that strong adhesion often, but not always, maximizes the size of clonal cell clusters on flat surfaces. Counterintuitively, our model predicts that, under some conditions, investing in adhesion can reduce rather than increase clonal group size. Our results emphasize that a complex interaction between fluid flow and cell adhesiveness can underlie emergent patterns of biofilm genetic structure. This structure, in turn, has an outsize influence on how biofilm-dwelling populations function and evolve.
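A one-dimensional sketch of the spatial correlation idea: estimate the probability that two cells a given distance apart carry the same genotype, which is high at short range when lineages are segregated into clonal clusters and low when they are finely mixed. The grids below are hypothetical toy patterns, not data from the paper.

```python
def same_genotype_prob(grid, r):
    """Probability that two cells a horizontal distance r apart share a genotype:
    a 1-D slice of the spatial correlation functions used to compare
    segregation patterns."""
    same = total = 0
    for row in grid:
        for x in range(len(row) - r):
            total += 1
            same += row[x] == row[x + r]
    return same / total

segregated = [[0] * 4 + [1] * 4 for _ in range(4)]   # two clonal blocks per row
mixed      = [[0, 1] * 4 for _ in range(4)]          # finely interleaved genotypes
```

For the segregated pattern the short-range correlation is high (only the block boundary breaks it), while for the interleaved pattern neighbouring cells never match.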

(XML) Multiscale modelization in a small virus: Mechanism of proton channeling and its role in triggering capsid disassembly


by Juan Francisco Viso, Patricia Belelli, Matías Machado, Humberto González, Sergio Pantano, María Julia Amundarain, Fernando Zamarreño, Maria Marta Branda, Diego M. A. Guérin, Marcelo D. Costabel

In this work, we assess a previously advanced hypothesis that predicts the existence of ion channels in the capsid of small, non-enveloped icosahedral viruses. For this purpose we examine Triatoma virus (TrV) as a case study. This virus has a stable capsid under highly acidic conditions but disassembles and releases its genome in alkaline environments. Our calculations range from a subtle sub-atomic proton interchange to the dismantling of a large-scale system representing several million atoms. Our results provide structure-based explanations for the three roles played by the capsid in enabling genome release. First, we observe, for the first time, the formation of a hydrophobic gate in the cavity along the five-fold axis of the wild-type virus capsid, which can be disrupted by an ion located in the pore. Second, the channel enables protons to permeate the capsid through a unidirectional Grotthuss-like mechanism, which is the most likely process through which the capsid senses pH. Finally, assuming that the proton leak promotes a charge imbalance in the interior of the capsid, we model an internal pressure that forces shell cracking using coarse-grained simulations. Although qualitative, this last step could represent the mechanism of capsid opening that allows RNA release. All of our calculations are in agreement with current experimental data obtained using TrV and describe a cascade of events that could explain the destabilization and disassembly of similar icosahedral viruses.

(XML) Whole-body iron transport and metabolism: Mechanistic, multi-scale model to improve treatment of anemia in chronic kidney disease


by Joydeep Sarkar, Alka A. Potdar, Gerald M. Saidel

Iron plays vital roles in the human body, including enzymatic processes, oxygen transport via hemoglobin, and immune response. Iron metabolism is characterized by ~95% recycling and minor replenishment through diet. Anemia of chronic kidney disease (CKD) is characterized by a lack of erythropoietin synthesis, leading to reduced red blood cell (RBC) formation and aberrant iron recycling. Treatment of CKD anemia aims to normalize RBC count and serum hemoglobin. Clinically, the various fluxes of iron transport and accumulation are not measured, so changes during disease (e.g., CKD) and treatment are unknown. Unwanted iron accumulation in patients is known to lead to adverse effects. Current whole-body models lack the mechanistic details of iron transport related to RBC maturation and transferrin (Tf) and transferrin receptor (TfR) dynamics, and assume passive iron efflux from macrophages. Hence, they are not predictive of whole-body iron dynamics and cannot be used to design individualized patient treatment. For prediction, we developed a mechanistic, multi-scale computational model of whole-body iron metabolism incorporating four compartments containing the major pools of iron and the RBC generation process. The model accounts for multiple forms of iron in vivo, the mechanisms involved in iron uptake and release, and their regulation. Furthermore, the model is interfaced with drug pharmacokinetics to allow simulation of treatment dynamics. We calibrated our model with experimental and clinical data from the peer-reviewed literature to reliably simulate CKD anemia and the effects of the current treatment, a combination of epoetin alfa and iron dextran. This in silico whole-body model of iron metabolism predicts that a year of treatment can potentially lead to a 90% downregulation of ferroportin (FPN) levels and a 15-fold increase in iron stores, with only a 20% increase in iron flux from the reticulo-endothelial system (RES). Model simulations quantified unmeasured iron fluxes and previously unknown effects of treatment on FPN levels and iron stores in the RES. This mechanistic whole-body model can be the basis for future studies that incorporate iron metabolism together with related clinical experiments. Such an approach could pave the way for the development of effective personalized treatment of CKD anemia.
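The compartmental bookkeeping underlying such a model can be illustrated with a closed two-pool exchange, e.g. plasma iron versus RES stores. This toy sketch (hypothetical rate constants, explicit Euler integration) only demonstrates the principle the paper's far richer four-compartment model builds on: fluxes move iron between pools while total iron in a closed system is conserved.

```python
def simulate_iron(plasma0, stores0, k_uptake, k_release, dt=0.01, steps=10000):
    """Toy closed two-compartment exchange (plasma <-> RES stores),
    integrated with explicit Euler. Hypothetical first-order rate constants:
    k_uptake moves iron plasma -> stores, k_release the reverse."""
    plasma, stores = plasma0, stores0
    for _ in range(steps):
        flux = k_uptake * plasma - k_release * stores   # net flux plasma -> stores
        plasma -= flux * dt
        stores += flux * dt                             # equal and opposite: mass conserved
    return plasma, stores
```

At steady state the two fluxes balance (k_uptake * plasma = k_release * stores), and the total iron never changes because every efflux from one pool is an influx to the other.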

(XML) A computational study of astrocytic glutamate influence on post-synaptic neuronal excitability


by Bronac Flanagan, Liam McDaid, John Wade, KongFatt Wong-Lin, Jim Harkin

The ability of astrocytes to rapidly clear synaptic glutamate and purposefully release the excitatory transmitter is critical to the functioning of synapses and neuronal circuits. Dysfunction of these homeostatic processes has been implicated in the pathology of brain disorders such as mesial temporal lobe epilepsy. However, the reasons for these dysfunctions are not clear from experimental data. Computational models have been developed to explore the implications of impaired glutamate clearance from the extracellular space as a result of EAAT2 downregulation, although they only partially account for the clearance process. In this work, we develop an explicit model of the astrocytic glutamate transporters, providing a more complete description of the glutamate chemical potential across the astrocytic membrane and its contribution to the transporter driving force, based on thermodynamic principles and experimental data. Analysis of our model demonstrates that increased astrocytic glutamate content due to glutamine synthetase downregulation also results in an increased postsynaptic quantal size due to gliotransmission. Moreover, the proposed model demonstrates that increased astrocytic glutamate could prolong the time course of glutamate in the synaptic cleft and enhance astrocyte-induced slow inward currents, disrupting the clarity of synaptic signalling and causing intervals of higher-frequency postsynaptic firing. Overall, our work underscores the necessity of a low astrocytic glutamate concentration for reliable synaptic transmission of information, and the possible implications of elevated glutamate levels, as in epilepsy.
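The thermodynamic driving-force argument can be made concrete. Assuming the established EAAT stoichiometry (3 Na⁺ and 1 H⁺ co-transported inward and 1 K⁺ counter-transported outward per glutamate, i.e. net +2 charge moving inward), the free energy per transport cycle follows from the ion gradients and membrane potential. The concentrations below are generic textbook values, not the paper's parameters, and the sketch ignores kinetics entirely.

```python
import math

R = 8.314      # gas constant, J/(mol K)
F = 96485.0    # Faraday constant, C/mol
T = 310.0      # body temperature, K

def eaat_delta_g(glu_in, glu_out, na_in=15.0, na_out=145.0,
                 k_in=140.0, k_out=3.0, ph_in=7.2, ph_out=7.4, vm=-0.080):
    """Free energy (J/mol) per glutamate uptake cycle, assuming EAAT
    stoichiometry 1 Glu(-) + 3 Na+ + 1 H+ in, 1 K+ out (net +2 inward).
    Concentrations in mM, vm in volts. Negative = uptake favoured."""
    h_in, h_out = 10.0 ** -ph_in, 10.0 ** -ph_out
    dg_chem = R * T * (math.log(glu_in / glu_out)
                       + 3.0 * math.log(na_in / na_out)
                       + math.log(h_in / h_out)
                       - math.log(k_in / k_out))
    dg_elec = 2.0 * F * vm   # net +2 charge moves down the membrane potential
    return dg_chem + dg_elec
```

Raising the intracellular (astrocytic) glutamate concentration makes the uptake cycle less favourable, which is the thermodynamic core of the paper's argument for why elevated astrocytic glutamate impairs clearance.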

(XML) Origins of scale invariance in vocalization sequences and speech


by Fatemeh Khatami, Markus Wöhr, Heather L. Read, Monty A. Escabí

To communicate effectively, animals need to detect temporal vocalization cues that vary over several orders of magnitude in their amplitude and frequency content. This large range of temporal cues is evident in the power-law, scale-invariant relationship between the power of temporal fluctuations in sounds and the sound modulation frequency (f). Though various forms of scale invariance have been described for natural sounds, the origins and implications of this scale-invariant phenomenon remain unknown. Using animal vocalization sequences, including continuous human speech, and a stochastic model of temporal amplitude fluctuations, we demonstrate that temporal acoustic edges are the primary acoustic cue accounting for the scale-invariant phenomenon. The modulation spectra of the vocalization sequences and of the model both exhibit a dual-regime lowpass structure, with a flat region at low modulation frequencies and a scale-invariant 1/f² trend at high modulation frequencies. Moreover, we find a time-frequency tradeoff between the average vocalization duration of each vocalization sequence and the cutoff frequency beyond which scale-invariant behavior is observed. These results indicate that temporal edges are universal features responsible for scale invariance in vocalized sounds. This is significant because temporal acoustic edges are perceptually salient, and the auditory system could exploit such statistical regularities to minimize redundancies and generate compact neural representations of vocalized sounds.
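A toy calculation shows why sharp temporal edges produce a 1/f² trend: the discrete Fourier spectrum of a single rectangular amplitude pulse (a crude stand-in for one vocalization with sharp onset and offset) falls off as 1/f² above the cutoff set by the pulse duration, so tripling the modulation frequency should reduce the power roughly 9-fold. The pulse below is a hypothetical signal, not one of the paper's recordings.

```python
import cmath
import math

def power_at(signal, k):
    """Power in DFT bin k of a real signal (direct sum; no FFT needed here)."""
    n = len(signal)
    xk = sum(s * cmath.exp(-2j * math.pi * k * t / n) for t, s in enumerate(signal))
    return abs(xk) ** 2

# One vocalization modelled as a rectangular amplitude pulse occupying
# half the analysis window: the sharp edges dominate the high-frequency spectrum.
n = 1024
pulse = [1.0 if t < n // 2 else 0.0 for t in range(n)]
```

For this half-window boxcar the odd-numbered bins follow the 1/f² envelope almost exactly, so the power in bin 3 is about 9 times the power in bin 9.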

(XML) Biobeam—Multiplexed wave-optical simulations of light-sheet microscopy


by Martin Weigert, Kaushikaram Subramanian, Sebastian T. Bundschuh, Eugene W. Myers, Moritz Kreysing

Sample-induced image degradation remains an intricate wave-optical problem in light-sheet microscopy. Here we present biobeam, an open-source software package that enables simulation of operational light-sheet microscopes by combining data from 10⁵–10⁶ multiplexed and GPU-accelerated point-spread-function calculations. The wave-optical nature of these simulations leads to the faithful reproduction of spatially varying aberrations, diffraction artifacts, geometric image distortions, adaptive optics, and emergent wave-optical phenomena, and renders image formation in light-sheet microscopy computationally tractable.

(XML) Principles that govern competition or co-existence in Rho-GTPase driven polarization


by Jian-Geng Chiou, Samuel A. Ramirez, Timothy C. Elston, Thomas P. Witelski, David G. Schaeffer, Daniel J. Lew

Rho-GTPases are master regulators of polarity establishment and cell morphology. Positive feedback enables the concentration of Rho-GTPases into clusters at the cell cortex, from where they regulate the cytoskeleton. Different cell types reproducibly generate either one cluster (e.g. the front of a migrating cell) or several (e.g. the multiple dendrites of a neuron), but the mechanistic basis for unipolar versus multipolar outcomes is unclear. The design principles of Rho-GTPase circuits are captured by two-component reaction-diffusion models based on conserved aspects of Rho-GTPase biochemistry. Some such models display rapid winner-takes-all competition between clusters, yielding a unipolar outcome; others allow prolonged co-existence of clusters. We investigate the behavior of a simple class of models and show that while the timescale of competition varies enormously depending on model parameters, a single factor explains the large majority of this variation: the degree to which the maximal active GTPase concentration in a cluster approaches a “saturation point” determined by model parameters. We suggest that both saturation and its effect on competition reflect fundamental properties of the Rho-GTPase polarity machinery, regardless of the specific feedback mechanism, and predict whether the system will generate unipolar or multipolar outcomes.
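A minimal sketch of a mass-conserved two-component reaction-diffusion model of this general type (illustrative equations and parameters, not those of the paper): slow-diffusing active GTPase u is produced from fast-diffusing inactive GTPase v with an autocatalytic Hill-type positive-feedback term, and interconversion conserves the total GTPase pool, the conserved quantity behind the saturation and competition effects described above.

```python
def step(u, v, dt=0.02, dx=1.0, du=0.01, dv=1.0,
         k0=0.01, gamma=1.0, k=1.0, delta=1.0):
    """One explicit-Euler step of a mass-conserved two-component model on a
    periodic 1-D domain. u = active (membrane-bound, slow-diffusing) GTPase,
    v = inactive (cytosolic, fast-diffusing). All parameters are illustrative."""
    n = len(u)
    nu, nv = u[:], v[:]
    for i in range(n):
        l, r = (i - 1) % n, (i + 1) % n        # periodic neighbours
        # activation with autocatalytic positive feedback, minus inactivation
        f = v[i] * (k0 + gamma * u[i] ** 2 / (k ** 2 + u[i] ** 2)) - delta * u[i]
        lap_u = (u[l] - 2 * u[i] + u[r]) / dx ** 2
        lap_v = (v[l] - 2 * v[i] + v[r]) / dx ** 2
        nu[i] = u[i] + dt * (du * lap_u + f)
        nv[i] = v[i] + dt * (dv * lap_v - f)   # conversion conserves total GTPase
    return nu, nv
```

Because every gain in u is an equal loss in v (and diffusion only redistributes), the total GTPase is conserved exactly, which is what forces clusters to compete for a finite pool.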

(XML) Assessing the durability and efficiency of landscape-based strategies to deploy plant resistance to pathogens


by Loup Rimbaud, Julien Papaïx, Jean-François Rey, Luke G. Barrett, Peter H. Thrall

Genetically controlled plant resistance can reduce the damage caused by pathogens. However, pathogens can evolve and overcome such resistance, often quickly after resistance is deployed, resulting in significant crop losses and a continuing need to develop new resistant cultivars. To tackle this issue, several strategies have been proposed to constrain the evolution of pathogen populations and thus increase the durability of genetic resistance. These strategies mainly rely on varying combinations of resistance sources across time (crop rotations) and space. The spatial scale of deployment can vary, from multiple resistance sources occurring in a single cultivar (pyramiding), to different cultivars within the same field (cultivar mixtures), to different fields (mosaics). However, experimental comparison of the efficiency (i.e. ability to reduce disease impact) and durability (i.e. ability to limit pathogen evolution and delay resistance breakdown) of landscape-scale deployment strategies presents major logistical challenges. We therefore developed a spatially explicit stochastic model able to assess the epidemiological and evolutionary outcomes of the four major deployment options described above, including both qualitative resistance (i.e. major genes) and quantitative resistance traits against several components of pathogen aggressiveness: infection rate, latent period duration, propagule production rate, and infectious period duration. This model, implemented in the R package landsepi, provides a new and useful tool to assess the performance of a wide range of deployment options, and helps investigate the effects of landscape, epidemiological, and evolutionary parameters. This article describes the model and its parameterisation for rust diseases of cereal crops, caused by fungi of the genus Puccinia. To illustrate the model, we use it to assess the epidemiological and evolutionary potential of combining a major gene with different quantitative resistance traits. Comparison of the four major deployment strategies described above will be the objective of future studies.
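The intuition behind pyramiding durability can be reduced to a toy waiting-time calculation, far simpler than the landsepi model: if a pathogen variant overcoming one resistance gene arises and establishes with probability p per season, and the genes deployed in a pyramid must all be overcome in the same variant (treated here, as a simplifying assumption, as independent events), then breakdown is a geometric event with per-season probability p^n.

```python
def expected_breakdown_time(p_per_gene, n_genes):
    """Expected number of seasons until resistance breakdown in a toy model:
    a variant overcoming all n deployed genes arises and establishes with
    independent probability p_per_gene per gene each season, so the waiting
    time is geometric with success probability p_per_gene ** n_genes."""
    p_all = p_per_gene ** n_genes
    return 1.0 / p_all
```

Under these (strong) assumptions, stacking a second major gene squares the per-season breakdown probability, which is why pyramiding can greatly extend durability even though each gene alone would be overcome quickly.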