The Centre for Computational Science


The CCS is concerned with many aspects of theoretical and computational science, from chemistry and physics to materials, life and biomedical sciences, as well as informatics. We explore these domains through high-performance, data-intensive supercomputing and distributed (grid/cloud) computing.

Our computational techniques span time and length scales from the macro- through the meso- to the nanoscale. We are committed to studying new approaches and techniques that bridge these scales.

Latest News

  • We are delighted to announce the publication of a Theme Issue in the journal Philosophical Transactions of the Royal Society A. The issue is titled “Reliability and Reproducibility in Computational Science: Implementing Verification, Validation and Uncertainty Quantification in Silico”, and a significant proportion of the papers it contains stems from research led by UCL’s Centre for Computational Science through VECMA. The theme issue also grew out of an event of the same title hosted by The Alan Turing Institute in London in early 2020, where several of the issue’s contributors met to prepare for its production. The theme issue addresses the question of whether the computational methods and models used today are sufficiently reliable to generate actionable results. The question is analyzed in terms of three notions of reliability: verification (V), that the model correctly implements the intended theory; validation (V), agreement between model and experiment; and uncertainty quantification (UQ), identification of the provenance and magnitude of errors within the model, in other words how accurately the model captures reality. VVUQ lies at the heart of VECMA’s mission statement, “to enable trust in computer simulations as tools in the decision-making process for scientists as well as for policy makers in an era where science is afflicted by the ‘reproducibility crisis’”. Read more about the publication here.
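
Verification in this sense can be demonstrated with a very small, self-contained check. The sketch below is our own illustration, not drawn from the theme issue: it verifies that a forward Euler integrator for dy/dt = -y reproduces the method's expected first-order convergence rate against the exact solution exp(-t). The solver and test problem are illustrative choices only.

    # Illustrative verification check: confirm that forward Euler applied to
    # dy/dt = -y, y(0) = 1 converges to the exact solution exp(-t) at first order.
    import math

    def euler_decay(dt, t_end=1.0):
        """Integrate dy/dt = -y, y(0) = 1, with forward Euler up to t_end."""
        n_steps = round(t_end / dt)
        y = 1.0
        for _ in range(n_steps):
            y += dt * (-y)
        return y

    exact = math.exp(-1.0)
    errors = [abs(euler_decay(dt) - exact) for dt in (0.1, 0.05, 0.025, 0.0125)]

    # For a correctly implemented first-order method, halving dt should halve
    # the error, so the observed order should approach 1.
    for e_coarse, e_fine in zip(errors, errors[1:]):
        print(f"observed order ~ {math.log2(e_coarse / e_fine):.2f}")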

  • CovidSim, the model used to inform the UK Government’s response to the pandemic, has been analyzed by researchers at UCL, Brunel University, the CWI institute in the Netherlands, the Poznan Supercomputing and Networking Center and the University of Amsterdam, and has been found to carry a large degree of uncertainty in its predictions, which led it to seriously underestimate the first wave. The researchers who performed this study, members of the two EU consortia VECMA and CompBioMed, both of which are led by UCL, undertook an extensive parametric sensitivity analysis and uncertainty quantification of the publicly available code. The study concluded that quantifying the parametric input uncertainty is not sufficient, and that the effect of model structure and scenario uncertainty cannot be ignored when validating the model in a probabilistic sense. Motivated by this finding, the scientific teams of the two EU consortia call for a better public understanding of the inherent uncertainty of models predicting COVID-19 mortality rates: such models should be regarded as “probabilistic” rather than being relied upon to produce a particular, specific outcome. They maintain that future forecasts used to inform government policy should present the range of possible outcomes as probabilities, providing a more realistic picture of the pandemic framed in terms of uncertainties. This study has now been published in Nature Computational Science, and the article has been made freely accessible via this link.
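
The parametric part of such an analysis can be pictured with a deliberately simple stand-in model. The sketch below is a toy illustration only, not CovidSim and not the tooling used in the study: uncertain inputs (a reproduction number and an infection fatality ratio, with invented ranges) are sampled and pushed through a minimal SIR model, and the spread of predicted deaths is reported as percentiles rather than a single number.

    # Toy parametric uncertainty propagation through a minimal SIR model.
    # The model, parameters and ranges are invented for illustration only.
    import random
    import statistics

    def sir_deaths(r0, ifr, population=67_000_000, infectious_days=5, days=365):
        """Step a simple SIR model daily and return cumulative predicted deaths."""
        gamma = 1.0 / infectious_days
        beta = r0 * gamma
        s, i, r = population - 100.0, 100.0, 0.0
        for _ in range(days):
            new_infections = beta * s * i / population
            new_recoveries = gamma * i
            s -= new_infections
            i += new_infections - new_recoveries
            r += new_recoveries
        return ifr * r

    random.seed(0)
    samples = sorted(
        sir_deaths(random.uniform(2.0, 3.5),      # uncertain reproduction number
                   random.uniform(0.005, 0.015))  # uncertain infection fatality ratio
        for _ in range(1000)
    )

    quantiles = statistics.quantiles(samples, n=20)  # cut points at 5% steps
    print(f"median predicted deaths: {statistics.median(samples):,.0f}")
    print(f"5th-95th percentile: {quantiles[0]:,.0f} to {quantiles[-1]:,.0f}")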

  • In a recent study conducted at the CCS, we used multiscale modelling to gain insights into the driving forces behind the formation of different graphene morphologies. Using two polymers, polyethylene glycol and polyvinyl alcohol, we conducted ‘bottom-up’ simulations in which we observed how changes in the chemistry at the atomistic scale propagate upwards and ultimately shape the properties of graphene flakes and graphene oxide composites at much larger scales. Read more about the publication here, and watch a video illustrating how simulation was used to observe the self-assembly and dispersion of graphene oxide structures in polymers here.

 

  • Using the world’s most powerful supercomputers to tackle COVID-19: Professor Peter Coveney, who leads the EU H2020 Computational Biomedicine Centre of Excellence, and his colleagues at the UCL Centre for Computational Science are part of a consortium of more than a hundred researchers from across the US and Europe who are using an exceptional array of supercomputers, including the biggest in Europe and the most powerful on the planet, to study several aspects of the virus and disease in detail.

  • Dr Owain Kenway, our former PhD student from 2005 to 2009, is the newly appointed Director of Research Computing at UCL. While at the CCS, Owain studied ways to run molecular dynamics simulations efficiently on a geographically distributed computational grid. In 2009 he completed his PhD thesis, "Molecular dynamics simulations of complex systems including HIV-1 protease", after which he spent a brief period as a group systems administrator writing software for a joint UCL/LSU research project. He then joined Research Computing as a "User and applications support specialist" in 2010 and subsequently assumed the role of Research Computing Analyst. His last appointment before becoming Director was Research Computing Applications and Support Team Leader for Research IT Services at UCL.

 

  • Towards full-scale 3D high-fidelity simulations of the human vasculature: A major effort is now underway to perform the first full-scale 3D high-fidelity simulations of blood flow in the human vasculature. Led by Peter Coveney within his Centre for Computational Science (CCS) at University College London (UCL), this large-scale team endeavour involves colleagues and collaborators from across Europe and the USA. The members of the team include UCL, Leibniz Rechenzentrum (LRZ), Jülich Supercomputing Centre (JSC), the IT'IS Foundation and the University of Tennessee at Chattanooga. LRZ, a core partner in our CoE, is providing the use of its new supercomputer, SuperMUC-NG, through a large-scale Gauss Centre for Supercomputing award for 2019-20 with Prof Dieter Kranzlmüller and Prof Peter Coveney. JSC is involved via our sister CoE, POP, and Dr Brian Wylie at JSC has been working with us to optimise the performance of the HemeLB code. The IT’IS Foundation, based in Zürich, Switzerland, is an associate partner of our CoE and has provided the vascular model data. Dr Jon McCullough is performing the scientific studies within the CCS at UCL, which currently address the coupling of arterial and venous trees.

  • A joint UCL / Janssen team has performed a statistically robust head-to-head comparison between the Centre for Computational Science’s high performance computational method, TIES, and FEP+, the method offered by a commercial provider, Schrödinger Inc., for calculating relative free energies of binding of candidate drugs to target proteins. TIES and FEP+ use different so-called “alchemical” free energy methods. A robust ensemble-based protocol is applied to evaluate the computational results and their associated errors. In the study, TIES produced the most reliable results. More compute-intensive “replica-exchange” methods, hard-wired within FEP+ but available optionally within TIES, manifest systematic underestimations of these free energy differences; in particular, FEP+ predictions degrade substantially as simulation times are extended. You can access the paper here.
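
The ensemble idea at the centre of such a protocol can be sketched in a few lines. The numbers below are synthetic placeholders, not output from TIES or FEP+: independent replica estimates of a relative binding free energy are averaged, and a bootstrap over the replicas provides the associated error bar.

    # Schematic ensemble analysis of a relative binding free energy (ddG):
    # average independent replica estimates and bootstrap a confidence interval.
    # The replica values are synthetic placeholders, not TIES or FEP+ output.
    import random
    import statistics

    replica_ddg = [-1.12, -0.87, -1.05, -0.93, -1.21]  # kcal/mol, one per replica
    mean_ddg = statistics.mean(replica_ddg)

    # Bootstrap: resample the replicas with replacement to estimate the
    # uncertainty of the ensemble mean.
    random.seed(0)
    boot_means = sorted(
        statistics.mean(random.choices(replica_ddg, k=len(replica_ddg)))
        for _ in range(10_000)
    )
    low, high = boot_means[250], boot_means[9750]  # approximate 95% interval

    print(f"ddG = {mean_ddg:.2f} kcal/mol (95% CI {low:.2f} to {high:.2f})")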

 

  • Digital computers use numbers based on flawed representations of real numbers, which may lead to inaccuracies when simulating the motion of molecules, weather systems and fluids. Roger Highfield has posted an article called "The Problem with Digital Computers", based on the paper "A New Pathology in the Simulation of Chaotic Dynamical Systems on Digital Computers" by Bruce Boghosian, Peter Coveney and Hongyan Wang. You can read the article here, and the paper itself here.
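
A flavour of the pathology can be seen in even the simplest chaotic system. The sketch below is our own illustration, not code from the paper: it iterates the Bernoulli map x -> 2x mod 1 in IEEE double precision. Each doubling shifts one bit out of the mantissa, so a starting value of the form k/2^53 (such as those produced by random.random()) collapses to exactly 0.0 within at most 53 iterations, whereas the exact map never reaches zero for almost every real initial condition.

    # Illustration (not code from the paper): the Bernoulli map x -> 2x mod 1
    # iterated in double precision. Each doubling discards one mantissa bit, so
    # a starting value of the form k/2^53 hits exactly 0.0 within at most 53
    # steps, even though the exact dynamics almost never reaches 0.
    import random

    random.seed(42)
    x = random.random()  # an integer multiple of 2**-53 in [0, 1)

    for step in range(1, 101):
        x = (2.0 * x) % 1.0
        if x == 0.0:
            print(f"trajectory collapsed to 0.0 after {step} iterations")
            break
    else:
        print("no collapse within 100 iterations")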

 

  • The Future of Quantum Computing: From Quantum Intelligence to Virtual Humans. Quantum computers represent a new way to process information. Might they be able to crack what are currently thought to be unbreakable codes and answer unanswerable questions? To mark the opening of its new exhibition, Top Secret: From ciphers to cyber security, the Science Museum has joined forces with CompBioMed to assemble a panel of experts to explore the future of this incredible technology. Join us for this CompBioMed event at the Science Museum IMAX theatre, 19.30-20.30, September 25th, 2019.

 

  • The CCS has been granted a new allocation on Summit, the world's most powerful supercomputer. The Oak Ridge Leadership Computing Facility’s (OLCF) Resource Utilization Council has approved the new project called IMPRESS. The allocation is for 25,000 Summit node hours and 2,500 Rhea node hours for 12 months.

 

News Archive

 

 
