BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//E-CAM - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:E-CAM
X-ORIGINAL-URL:https://www.e-cam2020.eu
X-WR-CALDESC:Events for E-CAM
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/Paris
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20160327T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20161030T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20170326T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20171029T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20180325T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20181028T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20190331T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20191027T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20200329T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20201025T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20210328T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20211031T010000
END:STANDARD
END:VTIMEZONE
BEGIN:VTIMEZONE
TZID:UTC
BEGIN:STANDARD
TZOFFSETFROM:+0000
TZOFFSETTO:+0000
TZNAME:UTC
DTSTART:20150101T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;VALUE=DATE:20200217
DTEND;VALUE=DATE:20200229
DTSTAMP:20260430T215138Z
CREATED:20190108T152328Z
LAST-MODIFIED:20191122T160115Z
UID:3602-1581897600-1582934399@www.e-cam2020.eu
SUMMARY:Integration of ESL modules into electronic-structure codes
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\nWorkshop Description\nThe evolutionary pressure on electronic structure software development is greatly increasing\, due to the emergence of new paradigms\, new kinds of users\, new processes\, and new tools. Electronic structure software complexity is consequently also increasing\, requiring a larger effort on code maintenance. Developers of large electronic structure codes are trying to relieve some complexity by transitioning standardized algorithms into separate libraries [BigDFT-PSolver\, ELPA\, ELSI\, LibXC\, LibGridXC\, etc.]. This paradigm shift requires library developers to have a hybrid developer profile where the scientific and computational skill set becomes equally important. These topics have been extensively and publicly discussed among developers of various projects including ABINIT\, ASE\, ATK\, BigDFT\, CASTEP\, FHI-aims\, GPAW\, Octopus\, Quantum Espresso\, SIESTA\, and SPR-KKR. \nHigh-quality standardized libraries are not only a highly challenging effort lying in the hands of the library developers\, they also open possibilities for codes to take advantage of a standard way to access commonly used algorithms. Integration of these libraries\, however\, requires a significant initial effort that is often sacrificed in favour of new developments\, which frequently do not even reach the mainstream branch of the code. Additionally\, there are multiple challenges in adopting new libraries\, which have their roots in a variety of issues: installation\, data structures\, physical units and parallelism – all of which are code-dependent. On the other hand\, adoption of common libraries ensures the immediate propagation of improvements within the respective library’s field of research and ensures codes are up-to-date with much less effort [LibXC]. 
Indeed\, well-established libraries can have a huge impact on multiple scientific communities at once [PETSc]. \nIn the Electronic Structure community\, two issues are emerging. Firstly\, libraries are being developed [esl\, esl-gitlab] but require an ongoing commitment from the community with respect to sharing the maintenance and development effort. Secondly\, existing codes will benefit from libraries by adopting their use. Both issues are mainly governed by the exposure of the libraries and the availability of library core developers\, who are typically researchers pressured by publication deliverables and fund-raising burdens. They are thus not able to commit a large fraction of their time to software development. \nAn effort to allow code developers to make use of\, and develop\, shared components is needed. This requires efficient coordination between various elements: \n– A common and consistent code development infrastructure/education in terms of compilation\, installation\, testing and documentation.\n– How to use and integrate already published libraries into existing projects.\n– Creating long-lasting synergies between developers to reach a “critical mass” of component contributors.\n– Relevant quality metrics (“TRLs” and “SRLs”)\, to provide businesses with useful information.\nThis is what the Electronic Structure Library (ESL) [esl\, esl-gitlab] has been doing since 2014\, with a wiki\, a data-exchange standard\, refactoring code of global interest into integrated modules\, and regularly organizing workshops\, within a wider movement led by the European eXtreme Data and Computing Initiative [exdci]. 
\n  \nReferences\n[BigDFT-PSolver] http://bigdft.org/Wiki/index.php?title=The_Solver_Package\n[ELPA] https://gitlab.mpcdf.mpg.de/elpa/elpa\n[ELSI] http://elsi-interchange.org\n[LibXC] http://www.tddft.org/programs/libxc/\n[LibGridXC] https://launchpad.net/libgridxc\n[PETSc] https://www.mcs.anl.gov/petsc/\n[esl] http://esl.cecam.org/\n[esl-gitlab] http://gitlab.e-cam2020.eu/esl\n[exdci] https://exdci.eu/newsroom/press-releases/exdci-towards-common-hpc-strategy-europe
URL:https://www.e-cam2020.eu/legacy_event/integration-esl-modules-electronic-structure-codes/
LOCATION:CECAM-HQ\, Ecole Polytechnique Fédérale de Lausanne\, Switzerland
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20191209
DTEND;VALUE=DATE:20191213
DTSTAMP:20260430T215138Z
CREATED:20190111T220805Z
LAST-MODIFIED:20190301T112644Z
UID:3669-1575849600-1576195199@www.e-cam2020.eu
SUMMARY:State-of-the art workshop: Challenges in Multiphase Flows
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\nWorkshop Description\nThe general topic of the event is computational methods to study multiphase flows [1\,2]. Such methods are applied in very different disciplines\, such as statistical physics\, materials science\, applied mathematics\, and engineering\, with applications ranging from geophysical to micro scales. Examples include volcano eruptions\, oil recovery\, and the dynamics of droplets on structured surfaces (“lotus effect”). The computational approaches to tackle these problems are as disparate as the phenomena themselves and the corresponding scientific communities\, which rarely communicate with each other. The purpose of this school and workshop is to bring these various practitioners together for a fruitful exchange\, with the aim of improving a methodological toolbox that still faces significant problems. \nFrom the computational point of view\, three major approaches (which shall all be covered) are commonly used: (i) sharp interface methods that keep track of the interface position [3]; (ii) smeared interface methods\, which again may be subdivided into level set approaches [4-6] and methods based upon a Cahn-Hilliard free energy or similar (to be discussed in the next paragraph); and finally (iii) methods which average over several phases being present in one volume element [7-9]. \nConcerning Cahn-Hilliard-based and similar approaches\, a whole plethora of methods has been developed. In metallurgy and other branches of materials science\, phase-field models are fairly popular and have been particularly successful in the prediction of solid structures and their dynamic formation [10-15]. 
For fluid systems\, the usual approach has been standard Computational Fluid Dynamics\, based upon Finite Elements / Finite Differences / Finite Volume discretizations. These have recently been generalized to also include thermal fluctuations [16]\, which are typically needed for modeling phenomena in the soft-matter domain\, i.e. the micro- and nanoscale. Instead of using an Eulerian grid\, an alternative discretization of the Navier-Stokes equations is also possible in terms of Lagrangian particles; this is the so-called Smoothed Particle Hydrodynamics (SPH) method\, which has been used for macroscale multiphase flows for quite a while [17\,18]. An exciting recent development has generalized SPH to also include thermal fluctuations [20\,21]\, which was subsequently combined with the multiphase methodology [22\,23]. \nA substantial body of work is based on the Lattice Boltzmann method [24]. While the original version was for an ideal gas on the macroscale\, it has been generalized to include thermal fluctuations [25] and also multiphase flows\, where typically the Shan-Chen model [26]\, the Swift-Yeomans model [27\,28]\, or variants thereof [29\,30] are being used. Thermal fluctuations have been included as well [31]. Quite successful applications include spinodal decomposition [32]\, Pickering emulsions [33-35]\, and flow of droplets past structured surfaces [36]. The Lattice Boltzmann method is particularly well-suited for modern parallel computer architectures and hence considerations of computational efficiency have played an important role in the literature [37\,38]. \nA problem that has so far not been solved fully satisfactorily is the appearance of so-called “spurious currents” at an interface\, which are a mere discretization artifact. Though also present in standard grid-based CFD calculations [39]\, they seem to have mainly been discussed in the Lattice Boltzmann literature [40-42]. 
An important goal of the event will be to critically discuss such artifacts\, as well as issues of thermodynamic consistency. This will be targeted at (i) avenues toward systematic understanding\, reduction and ultimate elimination of such undesired effects\, but also at (ii) the more pragmatic question of how far these issues matter in practical applications. \n  \nReferences\n[1] Prosperetti\, A. & Tryggvason\, G.\, ed. (2009)\, Computational Methods for Multiphase Flow\, Cambridge University Press\, Cambridge; New York. \n[2] Tryggvason\, G.; Scardovelli\, R. & Zaleski\, S. (2011)\, Direct Numerical Simulations of Gas-Liquid Multiphase Flows\, Cambridge University Press\, Cambridge; New York. \n[3] Tryggvason\, G.; Bunner\, B.; Esmaeeli\, A.; Juric\, D.; Al-Rawahi\, N.; Tauber\, W.; Han\, J.; Nas\, S. & Jan\, Y. J. (2001)\, A Front-Tracking Method for the Computations of Multiphase Flow\, Journal of Computational Physics 169(2)\, 708–759. \n[4] Olsson\, E. & Kreiss\, G. (2005)\, A conservative level set method for two phase flow\, Journal of Computational Physics 210(1)\, 225–246. \n[5] Olsson\, E.; Kreiss\, G. & Zahedi\, S. (2007)\, A conservative level set method for two phase flow II\, Journal of Computational Physics 225(1)\, 785–807. \n[6] Zahedi\, S.; Gustavsson\, K. & Kreiss\, G. (2009)\, A conservative level set method for contact line dynamics\, Journal of Computational Physics 228(17)\, 6361–6375. \n[7] Hassanizadeh\, M. & Gray\, W. G. (1979)\, General conservation equations for multi-phase systems: 1. Averaging procedure\, Advances in Water Resources 2\, 131–144. \n[8] Hassanizadeh\, M. & Gray\, W. G. (1979)\, General conservation equations for multi-phase systems: 2. Mass\, momenta\, energy\, and entropy equations\, Advances in Water Resources 2\, 191–203. \n[9] Hassanizadeh\, M. & Gray\, W. G. (1980)\, General conservation equations for multi-phase systems: 3. Constitutive theory for porous media flow\, Advances in Water Resources 3(1)\, 25–40. 
\n[10] Echebarria\, B.; Folch\, R.; Karma\, A. & Plapp\, M. (2004)\, Quantitative phase-field model of alloy solidification\, Physical Review E 70(6)\, 061604. \n[11] Folch\, R. & Plapp\, M. (2005)\, Quantitative phase-field modeling of two-phase growth\, Physical Review E 72(1)\, 011602. \n[12] Plapp\, M. (2011)\, Unified derivation of phase-field models for alloy solidification from a grand-potential functional\, Physical Review E 84(3)\, 031601. \n[13] Steinbach\, I.; Pezzolla\, F.; Nestler\, B.; Seeßelberg\, M.; Prieler\, R.; Schmitz\, G. J. & Rezende\, J. L. L. (1996)\, A phase field concept for multiphase systems\, Physica D: Nonlinear Phenomena 94(3)\, 135–147. \n[14] Nestler\, B.; Garcke\, H. & Stinner\, B. (2005)\, Multicomponent alloy solidification: Phase-field modeling and simulations\, Physical Review E 71(4)\, 041609. \n[15] Janssens\, K. G. F. (2007)\, Computational Materials Engineering: An Introduction to Microstructure Evolution\, Academic Press\, Amsterdam; Boston. \n[16] Chaudhri\, A.; Bell\, J. B.; Garcia\, A. L. & Donev\, A. (2014)\, Modeling multiphase flow using fluctuating hydrodynamics\, Physical Review E 90(3)\, 033014. \n[17] Monaghan\, J. J. & Kocharyan\, A. (1995)\, SPH simulation of multi-phase flow\, Computer Physics Communications 87(1)\, 225–235. \n[18] Monaghan\, J. J. & Rafiee\, A. (2012)\, A simple SPH algorithm for multi-fluid flow with high density ratios\, International Journal for Numerical Methods in Fluids 71(5)\, 537–561. \n[19] Morris\, J. P. (2000)\, Simulating surface tension with smoothed particle hydrodynamics\, International Journal for Numerical Methods in Fluids 33(3)\, 333–353. \n[20] Espanol\, P. & Revenga\, M. (2003)\, Smoothed dissipative particle dynamics\, Physical Review E 67(2)\, 026705. \n[21] Vazquez-Quesada\, A.; Ellero\, M. & Espanol\, P. (2009)\, Consistent scaling of thermal fluctuations in smoothed dissipative particle dynamics\, The Journal of Chemical Physics 130(3)\, 034901. \n[22] Hu\, X. Y. 
& Adams\, N. A. (2006)\, A multi-phase SPH method for macroscopic and mesoscopic flows\, Journal of Computational Physics 213(2)\, 844–861. \n[23] Hu\, X. Y. & Adams\, N. A. (2007)\, An incompressible multi-phase SPH method\, Journal of Computational Physics 227(1)\, 264–278. \n[24] Krueger\, T.; Kusumaatmaja\, H.; Kuzmin\, A.; Shardt\, O.; Silva\, G. & Viggen\, E. M. (2017)\, The Lattice Boltzmann Method: Principles and Practice\, Springer International Publishing. \n[25] Duenweg\, B. & Ladd\, A. J. C. (2009)\, Lattice Boltzmann Simulations of Soft Matter Systems\, in Advanced Computer Simulation Approaches for Soft Matter Sciences III\, Springer\, Berlin\, Heidelberg\, pp. 89–166. \n[26] Shan\, X. & Chen\, H. (1994)\, Simulation of nonideal gases and liquid-gas phase transitions by the lattice Boltzmann equation\, Physical Review E 49(4)\, 2941–2948. \n[27] Swift\, M. R.; Osborn\, W. R. & Yeomans\, J. M. (1995)\, Lattice Boltzmann Simulation of Nonideal Fluids\, Physical Review Letters 75(5)\, 830–833. \n[28] Swift\, M. R.; Orlandini\, E.; Osborn\, W. R. & Yeomans\, J. M. (1996)\, Lattice Boltzmann simulations of liquid-gas and binary fluid systems\, Physical Review E 54(5)\, 5041–5052. \n[29] Sbragaglia\, M.; Benzi\, R.; Biferale\, L.; Succi\, S.; Sugiyama\, K. & Toschi\, F. (2007)\, Generalized lattice Boltzmann method with multirange pseudopotential\, Physical Review E 75(2)\, 026702. \n[30] Krueger\, T.; Frijters\, S.; Guenther\, F.; Kaoui\, B. & Harting\, J. (2013)\, Numerical simulations of complex fluid-fluid interface dynamics\, The European Physical Journal Special Topics 222(1)\, 177–198. \n[31] Thampi\, S. P.; Pagonabarraga\, I. & Adhikari\, R. (2011)\, Lattice-Boltzmann-Langevin simulations of binary mixtures\, Physical Review E 84(4)\, 046709. \n[32] Kendon\, V. M.; Cates\, M. E.; Pagonabarraga\, I.; Desplat\, J.-C. & Bladon\, P. 
(2001)\, Inertial effects in three-dimensional spinodal decomposition of a symmetric binary fluid mixture: a lattice Boltzmann study\, Journal of Fluid Mechanics 440\, 147–203. \n[33] Stratford\, K.; Adhikari\, R.; Pagonabarraga\, I.; Desplat\, J.-C. & Cates\, M. E. (2005)\, Colloidal Jamming at Interfaces: A Route to Fluid-Bicontinuous Gels\, Science 309(5744)\, 2198–2201. \n[34] Jansen\, F. & Harting\, J. (2011)\, From bijels to Pickering emulsions: A lattice Boltzmann study\, Physical Review E 83(4)\, 046707. \n[35] Michele\, L. D.; Fiocco\, D.; Varrato\, F.; Sastry\, S.; Eiser\, E. & Foffi\, G. (2014)\, Aggregation dynamics\, structure\, and mechanical properties of bigels\, Soft Matter 10(20)\, 3633–3648. \n[36] Asmolov\, E. S.; Schmieschek\, S.; Harting\, J. & Vinogradova\, O. I. (2013)\, Flow past superhydrophobic surfaces with cosine variation in local slip length\, Physical Review E 87(2)\, 023005. \n[37] Cates\, M. E.; Desplat\, J.-C.; Stansell\, P.; Wagner\, A. J.; Stratford\, K.; Adhikari\, R. & Pagonabarraga\, I. (2005)\, Physical and computational scaling issues in lattice Boltzmann simulations of binary fluid mixtures\, Philosophical Transactions of the Royal Society of London A: Mathematical\, Physical and Engineering Sciences 363(1833)\, 1917–1935. \n[38] Schmieschek\, S.; Shamardin\, L.; Frijters\, S.; Krueger\, T.; Schiller\, U. D.; Harting\, J. & Coveney\, P. V. (2017)\, LB3D: A parallel implementation of the Lattice-Boltzmann method for simulation of interacting amphiphilic fluids\, Computer Physics Communications 217\, 149–161. \n[39] Zahedi\, S.; Kronbichler\, M. & Kreiss\, G. (2011)\, Spurious currents in finite element based level set methods for two-phase flow\, International Journal for Numerical Methods in Fluids 69(9)\, 1433–1456. \n[40] Shan\, X. (2006)\, Analysis and reduction of the spurious current in a class of multiphase lattice Boltzmann models\, Physical Review E 73(4)\, 047701. \n[41] Lee\, T. & Fischer\, P. F. 
(2006)\, Eliminating parasitic currents in the lattice Boltzmann equation method for nonideal gases\, Physical Review E 74(4)\, 046709. \n[42] Pooley\, C. M. & Furtado\, K. (2008)\, Eliminating spurious velocities in the free-energy lattice Boltzmann method\, Physical Review E 77(4)\, 046702.
URL:https://www.e-cam2020.eu/legacy_event/state-of-the-art-workshop-challenges-in-multiphase-flows/
LOCATION:Monash University Prato Center\, Prato\, Tuscany\, Italy
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20191125
DTEND;VALUE=DATE:20191130
DTSTAMP:20260430T215138Z
CREATED:20190108T153007Z
LAST-MODIFIED:20190301T112310Z
UID:3606-1574640000-1575071999@www.e-cam2020.eu
SUMMARY:Extended Software Development Workshop: Mesoscopic simulation models and High-Performance Computing
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\nWorkshop Description\nIn Discrete Element Methods the equations of motion of a large number of particles are numerically integrated to obtain the trajectory of each particle [1]. The collective movement of the particles very often provides the system with unpredictable complex dynamics inaccessible via any mean field approach. Such phenomenology is present for instance in seemingly simple systems such as the hopper/silo\, where intermittent flow accompanied with random clogging occurs [2]. With the development of computing power alongside that of the numerical algorithms\, it has become possible to simulate such scenarios involving the trajectories of millions of spherical particles for a limited simulation time. Incorporating more complex particle shapes [3] or the influence of the interstitial medium [4] rapidly decreases the accessible range of the number of particles. \nAnother class of computer simulations enjoying huge popularity among the science and engineering communities is Computational Fluid Dynamics (CFD). A tractable method for performing such simulations is the family of Lattice Boltzmann Methods (LBMs) [5]. There\, instead of directly solving the strongly non-linear Navier-Stokes equations\, the discrete Boltzmann equation is solved to simulate the flow of Newtonian or non-Newtonian fluids with the appropriate collision models [6\,7]. The method closely resembles DEMs\, as it simulates the streaming and collision processes across a limited number of intrinsic particles\, which evince viscous flow applicable across the greater mass. 
\nAs both of the methods have gained popularity in solving engineering problems\, and scientists have become more aware of finite size effects\, the size and time requirements to simulate practically relevant systems using these methods have grown beyond the capabilities of even the most modern CPUs [8\,9]. Massive parallelization is thus becoming a necessity. This is naturally offered by graphics processing units (GPUs)\, making them an attractive alternative for running these simulations\, which consist of a large number of relatively simple mathematical operations readily implemented in a GPU [8\,9]. \n  \nReferences\n[1] P. A. Cundall and O. D. L. Strack\, Geotechnique 29\, 47–65 (1979).\n[2] H. G. Sheldon and D. J. Durian\, Granular Matter 6\, 579-585 (2010).\n[3] A. Khazeni\, Z. Mansourpour\, Powder Tech. 332\, 265-278 (2018).\n[4] J. Koivisto\, M. Korhonen\, M. J. Alava\, C. P. Ortiz\, D. J. Durian\, A. Puisto\, Soft Matter 13\, 7657-7664 (2017).\n[5] S. Succi\, The lattice Boltzmann equation: for fluid dynamics and beyond\, Oxford University Press (2001).\n[6] L. S. Luo\, W. Liao\, X. Chen\, Y. Peng\, W. Zhang\, Phys. Rev. E 83\, 056710 (2011).\n[7] S. Gabbanelli\, G. Drazer\, J. Koplik\, Phys. Rev. E 72\, 046312 (2005).\n[8] N. Govender\, R. K. Rajamani\, S. Kok\, D. N. Wilke\, Minerals Engin. 79\, 152-168 (2015).\n[9] P. R. Rinaldi\, E. A. Dari\, M. J. Vénere\, A. Clausse\, Simulation Modelling Practice and Theory 25\, 163-171 (2012).
URL:https://www.e-cam2020.eu/legacy_event/esdw-mesoscopic-simulation-and-hpc/
LOCATION:CECAM-FI\, Aalto University\, Finland
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20191101
DTEND;VALUE=DATE:20191110
DTSTAMP:20260430T215138Z
CREATED:20190111T221620Z
LAST-MODIFIED:20191008T094252Z
UID:3682-1572566400-1573343999@www.e-cam2020.eu
SUMMARY:Inverse Molecular Design & Inference: building a Molecular Foundry
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\nWorkshop Description\nThe overarching theme of this proposed E-CAM Transverse Extended Software Development Workshop is the design and control of molecular machines including sensors\, enzymes\, therapeutics\, and transporters built as fusion proteins or nanocarrier-protein complexes\, and in particular\, the software development and interfacing that this entails. Several immuno-diagnostic companies and molecular biology experimental groups have expressed a strong interest in the projects at the core of this proposal. The proposed ESDW is transverse as it entails the use of methodologies from two E-CAM Scientific Workpackages: WP1 (Advanced MD/rare-events methods) and WP4 (Mesoscale/Multiscale simulation). \nFusion proteins are sets of two or more protein modules linked together\, where the underlying genetic codes of each module and the fusion protein itself are known or can be easily inferred. The fusion protein typically retains the functions of its components\, and in some cases gains additional functions. They occur in nature\, but can also be made artificially using genetic engineering and biotechnology\, and are used in a wide variety of settings\, ranging from unimolecular FRET sensors\, novel immuno-based cancer drugs and enzymes [1\,2] to energy conversion (for example\, efficient generation of alcohol from cellulose) [3\,4]. Fusion proteins can be expressed using genetic engineering in cell lines\, and purified for in-vitro use using biotechnology. Much of the design work is focused on how different modules are optimally linked or fused together via suitable peptides\, rather than on internal changes of modules. 
Optimizing such designs experimentally can be done through\, for example\, random mutations\, but a more controlled approach based on underlying molecular mechanisms is desirable\, for which a pragmatic multiscale approach is ideally suited\, combining bioinformatics and homology\, coarse-graining\, detailed MD and rare-event based methods\, and machine learning. The figure on the front of this proposal is a representative example of a fusion protein sensor designed to bind to a specific RNA nucleic acid sub-sequence\, which causes an optimized hinge-like protein to close and in the process bring two fluorescence proteins together\, allowing the binding event to be observed optically through FRET microscopy. \nNanocarriers (NC) are promising tools for cancer immunotherapy and other diagnostic and therapeutic applications. NCs can be decorated on their surface with molecules that facilitate target-specific antigen delivery to certain antigen-presenting cell types or tumor cells. However\, the target cell-specific uptake of nano-vaccines is highly dependent on the modifications of the NC itself. One of these is the formation of a protein corona [5] around the NC after in vivo administration. Appropriate targeting of NCs can be affected by unintended interactions of the NC surface with components of blood plasma and/or with cell surface structures that are unrelated to the specific targeting structure. The protein corona around the NC may affect its organ-specific or cell type-specific trafficking as well as endocytosis and/or functional properties of the NC. Most importantly\, the protein corona has been shown to interfere with targeting moieties used to induce receptor-mediated uptake of the NC\, both inhibiting and enhancing internalization by specific cell types [5]. Moreover\, the protein corona is taken up by the target cell\, which may alter its function. 
Therefore\, tailoring the surface properties of the NC to facilitate the adsorption of specific proteins and control the structure of the corona can help to significantly improve their performance. Modification of surface properties\, e.g. via grafting oligomers\, is also known to affect the preferred orientation of adsorbed proteins and\, therefore\, their functionality [6]. The molecular design would include the selection of an appropriate NC coating and the type of antibody to optimize the NC uptake. \nMesoscale simulation is required to understand the thermodynamics and kinetics of protein adsorption on the NCs with engineered surfaces [7] and to achieve the desired structure with preferred adsorption of the selected antigen. However\, the aforementioned issues often require biological and chemical accuracy that typical mesoscale models cannot achieve unless buttressed by accurate simulations at an atomistic/molecular level\, rare-event methods and machine learning. \nA pragmatic approach towards the enhancement of fusion proteins and NCs is as follows. \n(i) Molecular designs are initially developed and optimized as simple CG models\, including the use of information theory and machine learning. \n(ii) The solution of the inverse problem of building the fusion protein or the NC-protein complex to match the design requires a multiscale approach combining mesoscale modeling\, molecular dynamics\, rare-event methods\, machine learning\, homology\, mutation\, and solvent conditions. \n(iii) Iterate steps (i) and (ii) to optimize the design\, and in the process collect data for machine-learning-driven design. 
\n(iv) Final validation using detailed MD\, rare-event methods and HPC. \nOver the course of two 5-day meetings separated by several months\, the ESDW we plan will produce multiple software modules\, including the following.\n(a) C/C++/modern Fortran or Python-based codes to build and optimize simple CG models of fusion proteins or NC-protein complexes using information theory and machine learning. \n(b) Semi-automated pipelines to solve the inverse problem of building the fusion protein or the NC to match the design. This will involve interfacing with MD/mesoscale engines such as LAMMPS\, GROMACS\, OpenMM and ESPResSo\, rare-event based methods such as PLUMED\, and bioinformatics codes such as I-TASSER and INTFOLD. \n(c) Particle insertion/deletion methods for alchemistry – mutation of amino acids\, changes in the solvent and associated changes in free energy properties. \n(d) Codes to add corrections to coarse-grained models (bead models/Martini) using detailed atomistic data (e.g. potential of mean force for key order parameters\, structure factors etc.) or experimental data where available. \nWhile this is an ambitious plan\, it is worth pointing out that a similar integrated approach to protein development was already made by the lab of John Chodera [8]. While it did not include the focus on fusion proteins or NC-protein complexes or systematically incorporate coarse-graining\, it demonstrates both the feasibility of what we propose here and how to achieve practical solutions. Other ideas for a systematic approach to molecular design using MD simulation have also been proposed recently [9\,10]. \n  \nReferences\n[1] H. Yang et al\, The promises and challenges of fusion constructs in protein biochemistry and enzymology\, Appl Microbiol Biotechnol (2016)\n[2] Bochicchio\, Anna et al\, Designing the Sniper: Improving Targeted Human Cytolytic Fusion Proteins for Anti-Cancer Therapy via Molecular Simulation\, Biomedicines\, 5(1)\, 9 (2017)\n[3] Y. 
Fujita et al\, Direct and Efficient Production of Ethanol from Cellulosic Material with a Yeast Strain Displaying Cellulolytic Enzymes\, Appl Environ Microbiol. 68(10): 5136–5141 (2002)\n[4] M. Gunnoo et al\, Nanoscale Engineering of Designer Cellulosomes\, Adv. Mater. 28(27):5619-4 (2016)\n[5] M. Bros et al. The Protein Corona as a Confounding Variable of Nanoparticle-Mediated Targeted Vaccine Delivery\, Front. Immunol. 9\, 1760 (2018).\n[6] I. Lieberwirth et al. The Role of the Protein Corona in the Uptake Process of Nanoparticles\, Proceedings of Microscopy & Microanalysis 24\, Supplement S1 (2018)\n[7] H. Lopez et al. Multiscale Modelling of Bionano Interface\, Adv. Exp. Med. Biol. 947\, 173-206 (2017)\n[8] D. L. Parton et al. Ensembler: Enabling High-Throughput Molecular Simulations at the Superfamily Scale\, PLoS Comput Biol 12(6): e1004728 (2016)\n[9] P. V. Komarov et al. A new concept for molecular engineering of artificial enzymes: a multiscale simulation\, Soft Matter 12\, 689-704 (2016)\n[10] B. A. Thurston et al. Machine learning and molecular design of self-assembling π-conjugated oligopeptides\, Mol. Sim. 44\, 930-945 (2018)\n[11] D. Carroll. Genome Engineering with Targetable Nucleases\, Annu. Rev. Biochem. 83:409–39 (2014)
URL:https://www.e-cam2020.eu/legacy_event/inverse-molecular-design-inference-building-a-molecular-foundry/
LOCATION:CECAM-IRL Node\, Kingston Glebe\, Clifden\, Connemara\, Clifden\, H71WY19\, Ireland
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20190708
DTEND;VALUE=DATE:20190720
DTSTAMP:20260430T215138Z
CREATED:20190227T112113Z
LAST-MODIFIED:20190301T110548Z
UID:3675-1562544000-1563580799@www.e-cam2020.eu
SUMMARY:Extended software development workshop in quantum dynamics
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\nWorkshop Description\nQuantum molecular dynamics simulations are pivotal to understanding and predicting the microscopic details of molecules\, and strongly rely on a combined theoretical and computational effort. When considering molecular systems\, the complexity of the underlying equations is such that approximations have to be devised\, and the resulting theories need to be translated into algorithms and computer programs for numerical simulations. In the last decades\, the joint effort of theoretical physicists and quantum chemists around the challenges of quantum dynamics made it possible to investigate the quantum dynamics of complex molecular systems\, with applications including energy conversion\, energy storage\, organic electronics\, light-emitting devices\, biofluorescent molecules\, and photocatalysis\, to name a few.\nTwo different strategies have been successfully applied to perform quantum molecular dynamics: wavepacket propagation and trajectories. The first family of methods includes all quantum nuclear effects\, but its computational cost hampers the simulation of systems with more than a moderate number (10-12) of degrees of freedom. The method coined multi-configuration time-dependent Hartree (MCTDH) constitutes one of the most successful developments in this field and is often considered a gold standard for quantum dynamics [1]. Other strategies for wavepacket propagation try to identify procedures to optimize the “space” where the wavefunction information is computed\, such that Cartesian grids can be replaced with Smolyak grids [2]. 
The second family of methods introduces the idea of trajectories as a way to approximate the nuclear subsystem\, either classically or semiclassically\, and is exemplified by methods like the trajectory surface hopping and Ehrenfest schemes [3]\, or by the more accurate coupled-trajectory mixed quantum-classical (CT-MQC) [4] and quantum-classical Liouville equation (QCLE) [5] methods.\nFrom a computational perspective\, both families of methods require extensive electronic structure calculations\, as the nuclei move under the effect of the electronic subsystem\, either “statically” occupying its ground state or “dynamically” switching between excited states. Solving the quantum nuclear dynamics equations also becomes in itself very expensive in the case of wavepacket propagation methods. Contrary to other\, more consolidated\, areas of modeling\, quantum dynamics simulations do not benefit from established community packages\, and most of the progress occurs in in-house codes that are difficult to maintain and limited in optimization and portability. One of the core actions of E-CAM has been to seed a change in this situation\, by promoting systematic development of software\, providing a repository to host and share code\, and fostering collaborations on adding functionalities and improving the performance of common software scaffolds for wavepacket (Quantics) and trajectory-based (PaPIM) packages. Collaborations on the development of other codes have also been initiated. This workshop aims at continuing and extending these activities based on input from the community. \n  \nReferences\n[1] H. D. Meyer\, U. Manthe\, L. S. Cederbaum. Chem. Phys. Lett. 165 (1990) 73.\n[2] D. Lauvergnat\, A. Nauts. Spectrochimica Acta Part A 119 (2014) 18.\n[3] J. C. Tully. Faraday Discuss. 110 (1998) 407.\n[4] S. K. Min\, F. Agostini\, I. Tavernelli\, E. K. U. Gross. J. Phys. Chem. Lett. 8 (2017) 3048.\n[5] R. Kapral. Annu. Rev. Phys. Chem. 57 (2006) 129.
URL:https://www.e-cam2020.eu/legacy_event/extended-software-development-workshop-in-quantum-dynamics/
LOCATION:Durham University
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20190617
DTEND;VALUE=DATE:20190622
DTSTAMP:20260430T215138
CREATED:20190110T153512Z
LAST-MODIFIED:20190228T141531Z
UID:3621-1560729600-1561161599@www.e-cam2020.eu
SUMMARY:Recent developments in quantum dynamics\, an E-CAM state-of-the-art workshop
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\nWorkshop Description\nThe proposed workshop will gather a broad community of researchers in the field of quantum dynamics\, who are actively investigating the interplay of electronic and nuclear correlation in problems spanning multiple length and time scales\, and who are seeking to develop and apply state-of-the-art (SOA) methodologies to systems of increasing complexity. \nContinuing in the spirit of the first E-CAM SOA workshop\, held in 2016 in Lausanne\, a broad overview of the field of quantum dynamics will be presented. Current and emergent quantum dynamics methodologies will be critically discussed from their basic assumptions to their most recent extensions\, including their pitfalls and possible improvements\, in the hope that the ideas exchanged will promote exciting new developments. Participants will also be asked to address\, in particular\, aspects related to the software tools that implement the different methods\, evaluating development schemes (community efforts\, in-house coding)\, HPC-readiness (e.g. portability\, scalability\, benchmarking)\, and ease of use. An assessment of the “readiness for experiments and industry” will also be pursued\, identifying new problems of experimental and industrial interest where quantum-dynamical effects are relevant\, presenting success stories\, and – crucially – evaluating critically the gap between available methods and codes and the needs of non-professional users to suggest means to reduce it. \nThe format of the workshop will conform to the Tentative Timetable included in this proposal. This format is based on positive feedback following the CECAM Quantum Dynamics meetings that took place in Paris (2016) and Lausanne (2017). 
Ample time for discussions is set aside\, in agreement with CECAM and E-CAM recommendations. We will organize the topics into five sessions: \nI. Theoretical Foundations of Quantum Dynamics in Molecular and Condensed Phase Systems\nII. Real-time Path Integral and Quantum Master Equation Techniques\nIII. Trajectory-Based Quantum Molecular Dynamics: Methods and Applications\nIV. Nuclear Quantum Effects\, Path Integral Molecular Dynamics\, and Vibrational Spectroscopy\nV. Numerically Exact Methods \nWe will also invite chairpersons who will be asked to actively encourage exchanges and cross-fertilization in the discussion sessions. Speakers and participants will also be asked to highlight formal and algorithmic connections between different methods and to mention or propose sets of benchmarks to assess relative performances. In this SOA workshop\, we have chosen not to allot time for contributed talks. All participants are\, however\, expected to contribute to the discussions and will be given a chance to present their work at the poster session or\, informally\, as has become customary in the CECAM environment\, during the long coffee breaks. \nThe connection to E-CAM will be highlighted through a special discussion session (VI: Software development in Quantum Dynamics) and presentation of the most recent software modules developed during the extended software development workshops\, which run in parallel to this workshop series. Experts from E-CAM and from other systematic software development efforts in the area (e.g. MolSSI\, GPU based codes\, i-PI) will discuss their experience with the goal of sharing good practices\, identifying new synergies\, and providing all participants with an opportunity to learn about and contribute (if interested) to community-based codes or to initiate new coordinated activities in the area. \n  \nReferences\n[1] D. Schapers\, B. Zhao\, U. Manthe\, Chemical Physics 509\, 37-44\, (2018).\n[2] Robert Wodraszka\, Tucker Carrington\, J. Chem. 
Phys. 148\, 044115 (2018).\n[3] D. E. Makarov and N. Makri\, Chem. Phys. Lett. 221\, 482 (1994).\n[4] N. Makri and D. E. Makarov\, J. Chem. Phys. 102\, 4600 (1995).\n[5] L. Muhlbacher and E. Rabani\, Phys. Rev. Lett. 100\, 176403 (2008).\n[6] G. Cohen and E. Rabani\, Phys. Rev. B 84\, 075150 (2011).\n[7] Y. Tanimura and R. Kubo\, J. Phys. Soc. Jpn. 58\, 101-114 (1989); Y. Tanimura\, J. Chem. Phys. 141\, 044114 (2014).\n[8] A. Abedi\, N. T. Maitra\, and E. K. U. Gross\, Phys. Rev. Lett. 105\, 123002 (2010).\n[9] H. D. Meyer\, U. Manthe\, and L. S. Cederbaum\, Chem. Phys. Lett. 165\, 73 (1990); I. Burghardt\, H.-D. Meyer\, and L. S. Cederbaum\, J. Chem. Phys. 111\, 2927 (1999); H. Wang and M. Thoss\, ibid. 119\, 2003 (2003).\n[10] I. Burghardt\, K. Giri\, and G. A. Worth\, J. Chem. Phys. 129\, 174104 (2008).\n[11] G. A. Worth and I. Burghardt\, Chem. Phys. Lett. 368\, 502 (2003).\n[12] G. Albareda\, H. Appel\, I. Franco\, A. Abedi\, and A. Rubio\, Phys. Rev. Lett. 113\, 083003 (2014).\n[13] R. Kapral and G. Ciccotti\, J. Chem. Phys. 110\, 8919 (1999); R. Kapral\, Annu. Rev. Phys. Chem. 57\, 129 (2006).\n[14] S. Bonella and D. F. Coker\, J. Chem. Phys. 122\, 194102 (2005); P. Huo and D. F. Coker\, J. Chem. Phys. 133\, 184108 (2011); P. Huo and D. F. Coker\, ibid. 137\, 22A535 (2012).\n[15] S. K. Min\, F. Agostini\, and E. K. U. Gross\, Phys. Rev. Lett. 115\, 073001 (2015).\n[16] J. Beutier\, D. Borgis\, R. Vuilleumier\, and S. Bonella\, J. Chem. Phys. 141\, 084102 (2014).\n[17] M. Ben-Nun and T. J. Martínez\, J. Chem. Phys. 108\, 7244 (1998); M. Ben-Nun\, J. Quenneville\, and T. J. Martínez\, J. Phys. Chem. A 104\, 5161 (2000).\n[18] J. Tully\, Faraday Discussions 110\, 407-419 (1998).\n[19] T. E. Markland and M. Ceriotti\, Nature Reviews Chemistry\, 2018.\n[20] Dammak\, H.; Chalopin\, Y.; Laroche\, M.; Hayoun\, M.; Greffet\, J.-J. Phys. Rev. Lett. 2009\, 103\, 190601.\n[21] Ceriotti\, M.; Bussi\, G.; Parrinello\, M. Phys. Rev. Lett. 
2009\, 103\, 030603.\n[22] J. Hernández-Rojas\, F. Calvo\, and E. Gonzalez Noya\, J. Chem. Theory Comput. 11 (3)\, 861-870 (2015).\n[23] N. Ananth\, J. Chem. Phys. 139\, 124102 (2013); J. O. Richardson and M. Thoss\, ibid. 139\, 031102 (2013).\n[24] S. Nakajima\, Prog. Theor. Phys. 20\, 948 (1958).\n[25] R. Zwanzig\, J. Chem. Phys. 33\, 1338 (1960).\n[26] Q. Shi and E. Geva\, J. Chem. Phys. 119\, 12063 (2003).\n[27] M.-L. Zhang\, B. J. Ka and E. Geva\, J. Chem. Phys. 125\, 044106 (2006).\n[28] A. Kelly and T. E. Markland\, J. Chem. Phys. 139\, 014104 (2013).\n[29] E. Y. Wilner\, H. Wang\, M. Thoss\, and E. Rabani\, Phys. Rev. B 90\, 115145 (2014).\n[30] G. Albareda\, A. Kelly\, and A. Rubio\, arXiv:1805.11169 (2018).
URL:https://www.e-cam2020.eu/legacy_event/recent-developments-in-quantum-dynamics-an-e-cam-state-of-the-art-workshop/
LOCATION:CECAM-FR-RA\, Centre Blaise Pascal\, Lyon and Grenoble\, France
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20190612
DTEND;VALUE=DATE:20190615
DTSTAMP:20260430T215138
CREATED:20190108T155054Z
LAST-MODIFIED:20190301T104650Z
UID:3613-1560297600-1560556799@www.e-cam2020.eu
SUMMARY:Electrochemical energy storage: Theory meets industry
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\nWorkshop Description\n1/ Introduction and motivation \nHow much energy can a device store? How fast can it be charged? These two questions are at the heart of the research on electrochemical energy storage (EES). Two main families of devices coexist: supercapacitors\, which accumulate charge at the surface of the electrodes through ion adsorption [1\,2]\, and batteries\, in which the storage mechanism is based on redox reactions occurring in the bulk electrodes [3]. Li-ion batteries have a high specific energy\, keeping cellular phones\, laptops and even cars working for several hours. For rapid power delivery and recharging\, i.e. for high specific power applications\, supercapacitors are used instead. \nDue to the recent advances in the field of materials science\, the range of applications of EES devices has increased tremendously over the past two decades. The development of systems with improved performance and lower costs\, as well as their large-scale production\, are now considered vital issues for many countries. This can be seen from the recent creation of networks and institutes that gather academic and industrial partners\, both at the national and European levels [4-6]. \nMost of the recent breakthroughs have\, however\, involved complex materials\, often at the nanoscale. It is thus necessary to control the chemistry at the molecular level in all the active components of the devices\, i.e. the two electrodes and the electrolyte. The various interfaces also have to be characterized and understood\, which implies considering potential-dependent mechanistic approaches. 
Over the years\, atomistic and molecular simulations have therefore appeared as one of the main keys to success in designing tomorrow’s high-energy and high-power EES devices\, complementing in situ and/or in operando spectroscopy techniques [7\,8]. This is now well established in academic laboratories\, which now routinely build consortia combining synthesis\, electrochemical and spectroscopic characterization\, and modeling for developing new materials. However\, this practice does not yet seem to have been adopted by industrial companies in the field. The objective of this workshop is therefore to bring together some of the worldwide experts in the field of EES simulations (in particular the researchers who are developing the corresponding simulation tools) with interested industrial partners. We hope that such a workshop could help bridge the gap between needs and supply\, which would put simulation at the centre of future industrial developments of EES devices. \n2/ State-of-the-art \nThe state-of-the-art can be considered at two levels: 1/ Simulation methods which are routinely used to simulate EES devices. 2/ Initiatives which are currently undertaken to bring simulation tools and/or results within the reach of non-specialist users. \nFrom the methodological point of view\, many different methods are used or developed depending on the nature of the material\, the targeted properties and the necessary time/length scales.\n-The workhorse for studying the redox activity of bulk electrode materials is standard Density Functional Theory (DFT)\, since it is necessary to have access to the electronic structure.\n-For electrolytes\, determining the transport properties involves the use of molecular dynamics. 
Depending on the availability of correct force fields\, classical or DFT-based MD is generally used [7].\n-Further statistics or larger systems are then generally studied using lattice-based methods\, such as kinetic Monte Carlo or Lattice Boltzmann. \nGenerally\, standard DFT or MD packages can be used to study bulk materials. In the case of interfaces\, however\, additional difficulties need to be overcome\, and several groups are developing specific methodologies and/or simulation packages [9-11]. \nDespite the large growth in the simulation communities (especially DFT and MD) over the past decades\, using these tools often requires considerable effort from experimentalists and/or engineers in industry. For this reason\, several groups are currently developing user-friendly interfaces\, either in specific programs or directly accessible from websites [12]. For efficiency reasons\, it is necessary to develop high-throughput frameworks and to link these tools with accurate databases [13\,14]. This implies that a common language is established between the communities of theorists and experimentalists\, in order to build appropriate databases that will be helpful for material designers. \nFinally\, we should mention that several research groups are developing tools that aim to simulate systems at much larger scales [15\,16]. The objective is to provide a direct link with experiments\, by directly computing macroscale properties similar to the ones obtained in electrochemistry experiments. Such multi-scale methods\, most often based on the Butler-Volmer equation\, are typically top-down approaches that aim to account for the material or electrolyte specificity in an effective manner through appropriate parameterizations. Huge efforts are being devoted to the development of bottom-up approaches\, though major issues remain due to transferability between different scales. \n  \nReferences\n[1] Simon\, P. and Gogotsi\, Y. Materials for electrochemical capacitors. 
Nature Mater.\, 7\, 845 (2008).\n[2] Béguin\, F.\, Presser\, V.\, Balducci\, A. and Frackowiak\, E. Carbons and electrolytes for advanced supercapacitors. Adv. Mater.\, 26\, 2219 (2014).\n[3] Armand\, M. and Tarascon\, J.-M. Building better batteries. Nature\, 451\, 652 (2008).\n[4] RS2E\, French network on electrochemical energy storage\, http://www.energie-rs2e.com/fr\n[5] ALISTORE\, European Research Institute\, http://www.alistore.eu/presentation\n[6] The Faraday Institution\, UK’s research institute for electrochemical energy storage\, https://faraday.ac.uk/\n[7] Cheng\, L. et al. Accelerating electrolyte discovery for energy storage with high-throughput screening. J. Phys. Chem. Lett.\, 6\, 283 (2015).\n[8] Salanne\, M. et al. Efficient storage mechanisms for building better supercapacitors. Nature Energy\, 1\, 16070 (2016).\n[9] https://github.com/bjmorgan/lattice_mc\n[10] Dalverny\, A.-L.\, Filhol\, J.-S. and Doublet\, M.-L. Interface electrochemistry in conversion materials for Li-ion batteries\, J. Mater. Chem.\, 21\, 10134 (2011).\n[11] Merlet\, C.\, et al. Simulating supercapacitors: can we model electrodes as constant charge surfaces?\, J. Phys. Chem. Lett.\, 4\, 264 (2013).\n[12] The Materials Project\, https://materialsproject.org/press\n[13] Jain\, A. et al. A high-throughput infrastructure for density functional theory calculations\, Comput. Mater. Sci.\, 50\, 2295 (2011).\n[14] Curtarolo\, S. et al. The high-throughput highway to computational materials design\, Nature Mater.\, 12\, 191 (2013).\n[15] MS-LiberT simulation package\, http://modeling-electrochemistry.com/ms-liber-t/\n[16] Farkondeh\, M.\, Pritzker\, M.\, Fowler\, M. and Delacourt\, C. Mesoscopic modeling of a LiFePO4 electrode: experimental validation under continuous and intermittent operating conditions
URL:https://www.e-cam2020.eu/legacy_event/electrochemical-energy-storage-theory-meets-industry/
LOCATION:CECAM-FR-MOSER\, Maison de la Simulation\, Saclay\, France
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20190403
DTEND;VALUE=DATE:20190413
DTSTAMP:20260430T215138
CREATED:20190111T215832Z
LAST-MODIFIED:20190301T123955Z
UID:3664-1554249600-1555113599@www.e-cam2020.eu
SUMMARY:ESDW: Topics in Classical MD
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\nWorkshop Description\nClassical molecular dynamics (MD) is a broad field\, with many domains of expertise. Those specialist domains include topics like transition path sampling (which harvests many examples of a process in order to study it at a statistical level [1])\, metadynamics (which runs a trajectory with modified dynamics that enhance sampling\, and from which free energy profiles can be constructed [2])\, as well as various topics focused on the underlying dynamics\, either by providing better representations of the interactions between atoms (e.g.\, force fields [3] or neural network potentials [4]) or by changing the way the dynamics are performed (e.g.\, integrators [5]). \nFrequently\, experts in one domain are not experienced with the software of other domains. This workshop aims to combine both depth\, by extending domain-specific software\, and breadth\, by providing participants with an opportunity to learn about software from other domains. As an extended software development workshop (ESDW)\, a key component of the workshop will be the development of modules that extend existing software packages. Ideally\, some modules may connect multiple domain-specific packages. \nTopics at this workshop will include using and extending modern MD software in the domains of: \n* advanced path sampling methods (and the software package OpenPathSampling)\n* metadynamics and the calculation of collective variables (and the software package PLUMED)\n* machine learning for molecular dynamics simulations (including local structure recognition and representation of potential energy surfaces) \nIn addition\, this workshop will feature an emphasis on performance testing and benchmarking software\, with particular focus on high performance computing. 
This subject is relevant to all specialist domains. \nBy combining introductions to software from different specialist fields with an opportunity to extend domain-specific software\, this workshop is intended to provide opportunities for cross-pollination between domains that often develop independently. \nReferences\n[1] Bolhuis\, P.G. and Dellago\, C. Trajectory-Based Rare Event Simulations. Reviews in Computational Chemistry\, 27\, p. 111 (2010).\n[2] A. Laio and F.L. Gervasio. Rep. Prog. Phys. 71\, 126601 (2008).\n[3] J.A. Maier\, C. Martinez\, K. Kasavajhala\, L. Wickstrom\, K.E. Hauser\, and C. Simmerling. J. Chem. Theory Comput. 11\, 3696 (2015).\n[4] T. Morawietz\, A. Singraber\, C. Dellago\, and J. Behler. Proc. Natl. Acad. Sci. USA 113\, 8368 (2016).\n[5] B. Leimkuhler and C. Matthews. Appl. Math. Res. Express 2013\, 34 (2013).
URL:https://www.e-cam2020.eu/legacy_event/esdw-topics-in-classical-md/
LOCATION:CECAM-FR-RA\, Lyon and Grenoble\, France
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20190107
DTEND;VALUE=DATE:20190119
DTSTAMP:20260430T215138
CREATED:20180109T160700Z
LAST-MODIFIED:20190110T153937Z
UID:2301-1546819200-1547855999@www.e-cam2020.eu
SUMMARY:Extended Software Development Workshop: Scaling Electronic Structure Applications
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\nWorkshop Description\nThe evolutionary pressure on electronic structure software development is greatly increasing\, due to the emergence of new paradigms\, new kinds of users\, new processes\, and new tools. The large feature-full codes that were once developed within one field are now undergoing a heavy restructuring to reach much broader communities\, including companies and non-scientific users[1]. More and more use cases and workflows are performed by highly-automated frameworks instead of humans: high-throughput calculations and computational materials design[2]\, large data repositories[3]\, and multiscale/multi-paradigm modeling[4]\, for instance. At the same time\, High-Performance Computing Centers are paving the way to exascale\, with a cascade of effects on how to operate\, from computer architectures[5] to application design[6]. The disruptive paradigm of quantum computing is also putting a big question mark on the relevance of all the ongoing efforts[7]. \nAll these trends are highly challenging for the electronic structure community. Computer architectures have become rapidly moving targets\, forcing a global paradigm shift[8]. As a result\, long-ignored and well-established software good practices that were summarised in the Agile Manifesto[9] nearly 20 years ago are now adopted at an accelerating pace by more and more software projects[10]. With time\, this kind of migration is becoming a question of survival\, the key for a successful transformation being to allow and preserve an enhanced collaboration between the increasing number of disciplines involved. Significant efforts of integration from code developers are also necessary\, since both hardware and software paradigms have to change at once[11]. \nTwo major issues are also coming from the community itself. 
Hybrid developer profiles\, with people fluent in both computational and scientific matters\, are still difficult to find and retain. In the long run\, the numerous ongoing training initiatives will gradually improve the situation\, while in the short run\, the issue is becoming more salient and painful\, because the context evolves faster than ever. Good practices have usually been the first element sacrificed in the “publish or perish” race. New features have usually been bound to the duration of a post-doc contract and left undocumented and poorly tested\, favoring the unsustainable “reinventing the wheel” syndrome. \nAddressing these issues requires coordinated efforts at multiple levels:\n– from a methodological perspective\, mainly through the creation of open standards and the use of co-design\, both for programming and for data[12];\n– regarding documentation\, with a significant leap in content policies\, helped by tools like Doxygen and Sphinx\, as well as publication platforms like ReadTheDocs[13];\n– for testing\, by introducing test-driven development concepts and systematically publishing test suites together with software[14];\n– considering deployment\, by creating synergies with popular software distribution systems[15];\n– socially\, by disseminating the relevant knowledge and training the community\, through the release of demonstrators and giving all stakeholders the opportunity to meet regularly[16]. \nThis is what the Electronic Structure Library (ESL)[17] has been doing since 2014\, with a wiki\, a data-exchange standard\, refactoring code of global interest into integrated modules\, and regularly organising workshops\, within a wider movement led by the European eXtreme Data and Computing Initiative (EXDCI)[18]. \nSince 2014\, the Electronic Structure Library has been steadily growing and developing to cover most fundamental tasks required by electronic structure codes. 
In February 2018 an extended software development workshop will be held at CECAM-HQ with the purpose of building demonstrator codes providing powerful\, non-trivial examples of how the ESL libraries can be used. These demonstrators will also provide a platform to test the performance and usability of the libraries in an environment as close as possible to real-life situations. This marks a milestone and enables the next step in the ESL development: going from a collection of libraries with a clear set of features and stable interfaces to a bundle of highly efficient\, scalable and integrated implementations of those libraries. \nMany libraries developed within the ESL perform low-level tasks or very specific steps of more complex algorithms and are not capable\, by themselves\, of reaching exascale performance. Nevertheless\, if they are to be used as efficient components of exascale codes\, they must provide some level of parallelism and be as efficient as possible in a wide variety of architectures. During this workshop\, we propose to perform advanced performance and scalability profiling of the ESL libraries. With that knowledge in hand\, it will be possible to select and implement the best strategies for parallelizing and optimizing the libraries. Assistance from HPC experts will be essential and is a unique opportunity to foster collaborations with other Centres of Excellence\, like PoP (https://pop-coe.eu/) and MaX (http://www.max-centre.eu/). \nBased on the successful experience of the previous ESL workshops\, we propose to divide the workshop into two parts. The first two days will be dedicated to initial discussions between the participants and other invited stakeholders\, and to presentations on state-of-the-art methodological and software developments\, performance analysis and scalability of applications. The remainder of the workshop will consist of a 12-day coding effort by a smaller team of experienced developers. 
Both the discussion and software development will take advantage of the ESL infrastructure (wiki\, gitlab\, etc) that was set up during the previous ESL workshops. \n[1] See http://www.nanogune.eu/es/projects/spanish-initiative-electronic-simulations-thousands-atoms-codigo-abierto-con-garantia-y and\n[2] See http://pymatgen.org/ and http://www.aiida.net/ for example.\n[3] http://nomad-repository.eu/\n[4] https://abidev2017.abinit.org/images/talks/abidev2017_Ghosez.pdf\n[5] http://www.deep-project.eu/\n[6] https://code.grnet.gr/projects/prace-npt/wiki/StarSs\n[7] https://www.newscientist.com/article/2138373-google-on-track-for-quantum-computer-breakthrough-by-end-of-2017/\n[8] https://arxiv.org/pdf/1405.4464.pdf (sustainable software engineering)\n[9] http://agilemanifesto.org/\n[10] Several long-running projects routinely use modern bug trackers and continuous integration\, e.g.: http://gitlab.abinit.org/\, https://gitlab.com/octopus-code/octopus\, http://qe-forge.org/\, https://launchpad.net/siesta\n[11] Transition of HPC Towards Exascale Computing\, Volume 24 of Advances in Parallel Computing\, E.H. D’Hollander\, IOS Press\, 2013\, ISBN: 9781614993247\n[12] See https://en.wikipedia.org/wiki/Open_standard and https://en.wikipedia.org/wiki/Participatory_design\n[13] See http://www.doxygen.org/\, http://www.sphinx-doc.org/\, and http://readthedocs.org/\n[14] See https://en.wikipedia.org/wiki/Test-driven_development and http://agiledata.org/essays/tdd.html\n[15] See e.g. http://www.etp4hpc.eu/en/esds.html\n[16] See e.g. https://easybuilders.github.io/easybuild/\, https://github.com/LLNL/spack\, https://github.com/snapcore/snapcraft\, and https://www.macports.org/ports.php?by=category&substr=science\n[17] http://esl.cecam.org/\n[18] https://exdci.eu/newsroom/press-releases/exdci-towards-common-hpc-strategy-europe
URL:https://www.e-cam2020.eu/legacy_event/extended-software-development-workshop-scaling-electronic-structure-applications/
LOCATION:CECAM-IRL Node\, Kingston Glebe\, Clifden\, Connemara\, H71WY19\, Ireland
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20181001
DTEND;VALUE=DATE:20181004
DTSTAMP:20260430T215138
CREATED:20180109T131436Z
LAST-MODIFIED:20190110T153956Z
UID:2227-1538352000-1538611199@www.e-cam2020.eu
SUMMARY:State-of-the-Art Workshop: Large scale activated event simulations
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here. \nWorkshop Description\nRunning on powerful computers\, large-scale molecular dynamics (MD) simulations are used routinely to simulate systems of millions of atoms providing crucial insights on the atomistic level of a variety of processes of interest in physics\, materials science\, chemistry and biology. For instance\, MD simulations are extensively used to study the dynamics and interactions of proteins\, understand the properties of solutions or investigate transport in and on solids. From a technological point of view\, molecular dynamics simulations play an important role in many fields such as drug development\, the discovery of new materials\, oil extraction or energy production. Indeed\, enormous amounts of data are produced every day by molecular dynamics simulations running on high performance computers around the world and one of the big challenges related to such simulations is to make sense of the data and obtain mechanistic understanding in terms of low-dimensional models that capture the crucial features of the processes under study. Another central challenge is related to the time scale problem often affecting molecular dynamics simulations. More specifically\, despite the exponential increase in computing power witnessed during the last decades and the development of efficient molecular dynamics algorithms\, many processes are characterized by typical time scales that are still far beyond the reach of current computational capabilities. Addressing such time scale problems and developing scientific software able to overcome them is one of the central goals of Work Package 1 (WP1-Classical Molecular Dynamics) of the E-CAM Project. \nThree fundamental problems are intimately tied to the time scale problem of classical molecular dynamics simulation: \n1) The calculation of the populations of metastable states of an equilibrium system. 
Such populations can be expressed in terms of free energies and hence this problem boils down to the efficient calculation of free energies. \n2) The sampling of transition pathways between long-lived (meta)stable states and the calculation of reaction rate constants. \n3) The extraction of useful mechanistic information from the simulation data and the construction of low-dimensional models that capture the essential features of the process under study. Such models serve as the basis for the definition of reaction coordinates that enable in-depth studies of the process at hand\, e.g. by computing the free energy and kinetics. \nThe central goal of this workshop is to review new algorithmic developments that address the computational challenges mentioned above with a particular emphasis on implications for industrial applications. In particular\, the workshop aims at identifying software modules that should be developed to make efficient and scalable algorithms available to the academic and industrial community. Another goal of the workshop is to identify specific collaboration projects with industrial partners. A dedicated half-day session will be organized specifically for this purpose. To establish the needs of the community and lay out possible directions for development\, we will bring together a diverse group of people including software developers\, users of HPC infrastructure and industrial researchers. \nThe proposed workshop is a follow-up of the first ECAM State-of-the-art Workshop of WP1\, which took place in the summer of 2016 at the Lorentz Center in Leiden\, The Netherlands. At this workshop\, participants reviewed current rare event methods including path sampling\, milestoning\, metadynamics\, Markov state modeling\, diffusion maps\, dimension reduction\, reaction coordinate optimization\, machine learning\, and unsupervised cluster methods\, and explored ways to improve these methods. 
Particular attention was devoted to the integration of popular MD packages such as Gromacs\, NAMD\, Charmm\, Amber\, ACEMD\, MOIL\, LAMMPS with enhanced analysis and advanced sampling tools including Plumed (a package for enhanced sampling and collective variable analysis)\, pyEmma\, and MSMBuilder (packages for Markov state model analysis). \nNotwithstanding the great capabilities of existing methods and software\, several challenges remain and will be discussed at the proposed workshop in Vienna: \n– Extracting order parameters from molecular simulations to construct low-dimensional models. This point is important because there is no straightforward recipe to reduce the dimensions to meaningful variables and progress in this area is urgently needed. \n– Efficient methods for sampling rare pathways. Here the goal is to create the molecular trajectory data using advanced sampling algorithms. \n– Machine learning algorithms. Automatic analysis methods may offer new ways to guide simulations and construct reaction coordinates from molecular trajectories. \n– Better ways to integrate simulations and experiments. It is important to connect the proposed computational methods to experimental probes and integrate experimental information into the analysis of computer simulation data. \nMore specifically\, questions that will be addressed at the proposed workshop include: \n1. How do we obtain the best low-dimensional model for the process of interest? \n2. How can we use machine learning to find collective variables and reaction coordinates? \n3. When can reaction coordinates\, which often constitute the slow variables of a process\, be used to coarse-grain the dynamics? When not? \n4. What if multiple transitions are important? Do we resort to kinetic networks or use multiple reaction coordinates? Should one identify a single (possibly complicated) reaction coordinate\, or try to construct a Markov state model (MSM) using many metastable states? \n5. 
When is it possible to reduce a complex problem to diffusion on a one dimensional free energy landscape\, and when do we need a network Markov model? \n6. How can experiments test reaction coordinate predictions? How do we connect to experiments? \n7. How can extreme-scale computational resources be used efficiently to address these questions? \n8. How can progress in these questions help to address problems of industrial interest?
URL:https://www.e-cam2020.eu/legacy_event/state-of-the-art-workshop-large-scale-activated-event-simulations/
LOCATION:CECAM-AT\, Vienna\, Austria
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20180917
DTEND;VALUE=DATE:20180921
DTSTAMP:20260430T215138
CREATED:20180109T155852Z
LAST-MODIFIED:20190110T154029Z
UID:2291-1537142400-1537487999@www.e-cam2020.eu
SUMMARY:State-of-the-Art Workshop: Improving the accuracy of ab-initio predictions for materials
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\nWorkshop Description\nAb-initio simulation methods are the major tool to perform research in condensed matter physics\, materials science\, quantum and molecular chemistry. They can be classified in terms of their accuracy and efficiency\, but typically more accurate means less efficient and vice-versa. The accuracy depends mainly on how accurately one can solve the electronic problem. The most accurate algorithms are the wave-function based methods\, such as Full CI\, Coupled Cluster (CC)\, and Quantum Monte Carlo (QMC)\, followed by the Density Functional Theory (DFT)-based methods and finally more approximate methods such as Tight-Binding. Another important consideration is how the accuracy of a given method scales with the size of the system under consideration. Among the wave-function based methods\, the accuracy of traditional quantum chemistry methods can be systematically improved\, but their scaling with system size limits their applicability to small molecules. On the other hand\, QMC methods have a much more tractable scaling and\, because the energies are variational upper bounds\, offer a way of systematically improving the accuracy in spite of the “fermion sign problem” and the commonly used fixed-node approximation. Recently there has been much progress in the use of pseudopotentials and the systematic improvement of nodal surfaces using backflow and multiple determinants. [1\, 2\, 3]\nConversely\, DFT-based methods are based on a plethora of different self-consistent mean field approximations\, each one tuned to best represent a class of systems but with limited transferability. Despite progress in developing more general functionals [4\, 5\, 6]\, DFT is missing an “internal” accuracy scale; its accuracy is generally established against more fundamental theories (like CC or QMC) or against experiments. 
DFT methods are very popular because of their favorable scaling with system size\, the same as for QMC\, but with a smaller prefactor.\nIn a number of recent applications [7\, 8] it was found that inclusion of nuclear quantum effects (NQE) worsens considerably the agreement between DFT predictions and experiments. This is ascribed to the inaccuracies of DFT. This illustrates the importance of not using experimental data alone to improve the DFT functional\, but instead also calculations using more fundamental methods. There has been a recent effort to establish the accuracy of DFT approximations by benchmarking with QMC calculations not only for equilibrium geometries but also for thermal configurations. This benchmarking can be customized for the individual molecules at a given temperature\, pressure and geometry [9\, 10\, 11\, 12].\nAnother important aspect concerns finite size effects in modelling extended systems. Although corrections can be developed for homogeneous systems\, for more complex situations with several characteristic length scales one needs to consider system sizes that cannot be tackled by ab-initio methods. In these applications one needs to use an effective interaction energy. A recent development is the use of Machine Learning (ML) techniques to obtain energy functions with ab-initio accuracy [13\, 14\, 15]. Their transferability and accuracy assessment is still unsolved to some extent\, but progress is rapid. A related development is to use ML methods to by-pass the Kohn-Sham paradigm of DFT and directly address the potential-density map [16\, 17\, 18].\nThe following is a list of topics that will be discussed during the meeting:\n• Benchmarking existing DFT functionals with QMC. DFT has the potential to be accurate\, but the main problem with its predictive power is that its accuracy can be system dependent. QMC was instrumental in developing the first exchange-correlation approximations (e.g. 
LDA)\, and we envisage that it can play a substantial role in the discovery and tuning of new functionals. In particular\, the tuning of dispersion interactions appears to be a crucial element still not fully controlled in modern DFT approximations\, while it plays a crucial role in many systems like hydrogen and hydrogen-based materials such as water.\n• ML approaches with QMC accuracy. Machine Learning (ML) has attracted significant interest recently\, mainly because of its potential to study real-life systems\, and also to explore the phase space at a scale that is not available to ab-initio methods. However\, crucial for the ML method is the quality of the training set. It is often possible to train an ML potential on small systems\, where accurate energies and forces can be obtained by quantum chemistry methods. However\, training sets including larger systems are needed. QMC has the potential to provide them\, especially going forward with exascale computing.\n• Opportunities for new exascale applications of QMC to impact simulations of larger systems and longer time scales. QMC is capable of exploiting parallelism very efficiently\, and is probably one of the few methods already capable of running at the exascale level. ML methods on large data sets are also inherently parallel and directly usable on exascale machines.\n• We will address the problem of using and testing force fields derived for small systems on those of a much larger size.\n• We will discuss the use of ML methods to derive new classes of wave functions for QMC calculations of complex systems. \n[1] J. Kolorenc and L. Mitas\, Rep. Prog. Phys. 74\, 1 (2010).\n[2] L. K. Wagner and D. M. Ceperley\, Rep. Prog. Phys. 79\, 094501 (2016).\n[3] M. Taddei\, M. Ruggeri\, S. Moroni\, and M. Holzmann\, Phys. Rev. B 91\, 115106 (2015).\n[4] J. Heyd\, G. Scuseria\, and M. Ernzerhof\, The Journal of Chemical Physics 118\, 8207 (2003).\n[5] K. Lee\, É. Murray\, L. Kong\, B. Lundqvist\, and D. 
Langreth\, Physical Review B 82\, 81101 (2010).\n[6] K. Berland et al.\, Reports on Progress in Physics 78\, 66501 (2015).\n[7] M. A. Morales\, J. McMahon\, C. Pierleoni\, and D. M. Ceperley\, Physical Review Letters 110\, 65702 (2013).\n[8] M. Rossi\, G. P\, and M. Ceriotti\, Physical Review Letters 117\, 115702 (2016).\n[9] R. C. Clay et al.\, Physical Review B 89\, 184106 (2014).\n[10] M. A. Morales et al.\, Journal of Chemical Theory and Computation 10\, 2355 (2014).\n[11] R. C. Clay\, M. Holzmann\, D. M. Ceperley\, and M. A. Morales\, Physical Review B 93\, 035121 (2016).\n[12] M. J. Gillan\, F. Manby\, M. Towler\, and D. Alfè\, The Journal of Chemical Physics 136\, 244105 (2012).\n[13] K. V. J. Jose\, N. Artrith\, and J. Behler\, Journal of Chemical Physics 136\, 194111 (2012).\n[14] J. Behler\, The Journal of Chemical Physics 145\, 170901 (2016).\n[15] V. Botu\, R. Batra\, J. Chapman\, and R. Ramprasad\, The Journal of Physical Chemistry C 121\, 511 (2016).\n[16] J. C. Snyder\, M. Rupp\, K. Hansen\, K.-R. Müller\, and K. Burke\, Physical Review Letters 108\, 253002 (2012).\n[17] L. Li\, T. E. Baker\, S. R. White\, and K. Burke\, Phys. Rev. B 94\, 245129 (2016).\n[18] F. Brockherde et al.\, arXiv:1609.02815v3 (2017).
URL:https://www.e-cam2020.eu/legacy_event/state-of-the-art-workshop-improving-the-accuracy-of-ab-initio-predictions-for-materials/
LOCATION:CECAM-FR-MOSER\, Maison de la Simulation\, Saclay\, France
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20180903
DTEND;VALUE=DATE:20180905
DTSTAMP:20260430T215138
CREATED:20180109T142747Z
LAST-MODIFIED:20190110T154045Z
UID:2251-1535932800-1536105599@www.e-cam2020.eu
SUMMARY:Extended Software Development Workshop: Atomistic\, Meso- and Multiscale Methods on HPC Systems
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here. This is a multi-part event and we indicate the date for the first meeting. Dates of follow-ups are decided during the first event.\n\nWorkshop Description\nE-CAM is an EINFRA project funded by H2020. Its goal is to create\, develop\, and sustain a European infrastructure for computational science\, applied to simulation and modelling of materials and biological processes that are of industrial and societal interest. E-CAM builds upon the considerable European expertise and capability in this area. \nE-CAM is organized around four scientific areas: Molecular dynamics\, electronic structure\, quantum dynamics and meso- and multiscale modelling\, corresponding to work packages WP1-4. E-CAM gathers a number of groups with complementary expertise in the area of meso- and multiscale modeling and also has very well-established contacts with simulation code developers. Among the aims of the involved groups in this area is to produce a software stack by combining software modules\, and to further develop existing simulation codes towards highly scalable applications on high performance computer architectures. It has been identified as a key issue that simulation codes in the field of molecular dynamics\, meso- and multiscale applications should be prepared for the upcoming HPC architectures. Different approaches have been proposed by E-CAM WPs: (i) developing and optimizing highly scalable applications\, running a single application on a large number of cores\, and (ii) developing micro-schedulers for task-farming approaches\, where multiple simulations each run on smaller partitions of a large HPC system and work together on the collection of statistics or the sampling of a parameter space\, for which only loosely coupled simulations would be needed. Both approaches rely on the efficient implementation of simulation codes. 
\nConcerning strategy\, most modern parallelized (classical) particle simulation programs are based on a spatial decomposition method as an underlying parallel algorithm. In this case\, different processors administer different spatial regions of the simulation domain and keep track of those particles that are located in their respective region. Processors exchange information (i) in order to compute interactions between particles located on different processors\, and (ii) to exchange particles that have moved to a region administered by a different processor. This implies that the workload of a given processor is very much determined by its number of particles\, or\, more precisely\, by the number of interactions that are to be evaluated within its spatial region. \nCertain systems of high physical and practical interest (e.g. condensing fluids) dynamically develop into a state where the distribution of particles becomes spatially inhomogeneous. Unless special care is taken\, this results in a substantially inhomogeneous distribution of the processors’ workload. Since the work usually has to be synchronized between the processors\, the runtime is determined by the slowest processor (i.e. the one with the highest workload). In the extreme case\, this means that a large fraction of the processors is idle during these waiting times. This problem becomes particularly severe if one aims at strong scaling\, where the number of processors is increased at constant problem size: Every processor administers smaller and smaller regions and therefore inhomogeneities will become more and more pronounced. This will eventually saturate the scalability of a given problem\, already at a processor count at which communication overhead is still negligible. \nThe solution to this problem is the inclusion of dynamic load balancing techniques. 
These methods redistribute the workload among the processors by lowering the load of the busiest cores and raising the load of the idlest ones. Fortunately\, several successful techniques to put this strategy into practice are already known (see references). Nevertheless\, dynamic load balancing that is both efficient and widely applicable implies highly non-trivial coding work. Therefore it has not yet been implemented in a number of important codes of the E-CAM community\, e.g. DL_Meso\, DL_Poly\, Espresso\, Espresso++\, to name a few. Other codes (e.g. LAMMPS) have implemented somewhat simpler schemes\, which however might turn out to lack sufficient flexibility to accommodate all important cases. Therefore\, the present proposal suggests organizing an Extended Software Development Workshop (ESDW) within E-CAM\, where code developers of CECAM community codes are invited together with E-CAM postdocs to work on the implementation of load balancing strategies. The goal of this activity is to increase the scalability of these applications to a larger number of cores on HPC systems\, for spatially inhomogeneous systems\, and thus to reduce the time-to-solution of the applications. \nThe workshop is intended to make a major community effort in the direction of improving European simulation codes in the field of classical atomistic\, mesoscopic and multiscale simulation. Various load balancing techniques will be presented\, discussed and selectively implemented into codes. Sample implementations of load balancing techniques have been done for the codes IMD and MP2C. These are highly scalable particle codes\, cf. e.g. http://www.fz-juelich.de/ias/jsc/EN/Expertise/High-Q-Club/_node.html. The technical task is to provide a domain decomposition with flexible adjustment of domain borders. The basic load balancing functionality will be implemented and provided by a library\, which will be accessed via interfaces from the codes. 
\nIn order to attract both developers of the codes and postdocs working within E-CAM\, the workshop will be split into 3 parts: \nPart 1: preparation meeting (2 days)\n– various types of load balancing schemes will be presented conceptually and examples of implemented techniques will be shown\n– code developers / owners will present their codes. Functionalities will be presented and parallel implementations discussed in view of technical requirements for the implementation of load balancing techniques\n– an interface definition for exchanging information between a simulation code and a load balancing library will be set up \nPart 2: training and implementation (1 week)\n– to enable E-CAM postdocs to actively participate in the development\, some advanced technical courses on MPI and high-performance C++ will be offered in combination with the PRACE PATC course program at Juelich\n– during and after the courses (planned for 2-3 days)\, participants can start implementing a load balancing scheme into a code\n– for those participants who are already at an expert level in HPC techniques\, it is possible to start immediately with implementing load balancing schemes \nPart 3: implementation and benchmarking (1 week)\n– final implementation work with the goal to have at least one working implementation per code\n– for successful implementations\, benchmarks will be conducted on Juelich supercomputer facilities \nThe second part will also be open to a broader community from E-CAM\, so that the workshop can have an impact on the HPC training of postdocs in E-CAM\, which will strengthen their skills and experience in HPC. \nIt is intended that between the face-to-face parts of the workshop\, postdocs and developers continue the preparation and work on the load balancing schemes\, so that the meetings will be an important step to synchronise\, exchange information and experience\, and improve the current implementations.
URL:https://www.e-cam2020.eu/legacy_event/extended-software-development-workshop-for-atomistic-meso-and-multiscale-methods-on-hpc-systems/
LOCATION:CECAM-DE-JUELICH\, Juelich\, Germany
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20180716
DTEND;VALUE=DATE:20180721
DTSTAMP:20260430T215138
CREATED:20180109T153202Z
LAST-MODIFIED:20190110T154110Z
UID:2275-1531699200-1532131199@www.e-cam2020.eu
SUMMARY:Extended Software Development Workshop: Intelligent high throughput computing for scientific applications
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\nWorkshop Description\nHigh throughput computing (HTC) is a computing paradigm focused on the execution of many loosely coupled tasks. It is a useful and general approach to parallelizing (nearly) embarrassingly parallel problems. Distributed computing middleware\, such as Celery [1] or COMP Superscalar (COMPSs) [2]\, can include tools to facilitate HTC\, although there may be challenges extending such approaches to the exascale. \nAcross scientific fields\, HTC is becoming a necessary approach in order to fully utilize next-generation computer hardware. As an example\, consider molecular dynamics: Excellent work over the years has developed software that can simulate a single trajectory very efficiently using massive parallelization [3]. Unfortunately\, for a fixed number of atoms\, the extent of possible parallelization is limited. However\, many methods\, including semiclassical approaches to quantum dynamics [4\,5] and some approaches to rare events [6\,7]\, require running thousands of independent molecular dynamics trajectories. Intelligent HTC\, which can treat each trajectory as a task and manage data dependencies between tasks\, provides a way to run these simulations on hardware up to the exascale\, thus opening the possibility of studying previously intractable systems. \nIn practice\, many scientific programmers are not aware of the range of middleware to facilitate parallel programming. When HTC-like approaches are implemented as part of a scientific software project\, they are often done manually\, or through custom scripts to manage SSH\, or by running separate jobs and manually collating the results. Using the intelligent high-level approaches enabled by distributed computing middleware will simplify and speed up development. \nFurthermore\, middleware frameworks can meet the needs of many different computing infrastructures. 
For example\, in addition to working within a single job on a cluster\, COMPSs includes support for working through a cluster’s queueing system or working on a distributed grid. Moreover\, architecting a software package such that it can take advantage of one HTC library will make it easy to use other HTC middleware. Having all of these possibilities immediately available will enable developers to quickly create software that can meet the needs of many users. \nThis E-CAM Extended Software Development Workshop (ESDW) will focus on intelligent HTC as a technique that crosses many domains within the molecular simulation community in general and the E-CAM community in particular. Teaching developers how to incorporate middleware for HTC matches E-CAM’s goal of training scientific developers on the use of more sophisticated software development tools and techniques. \nThis E-CAM extended software development workshop (ESDW) will focus on intelligent HTC\, with the primary goals being: \n1. To help scientific developers interface their software with HTC middleware.\n2. To benchmark\, and ideally improve\, the performance of HTC middleware as applications approach extreme scale. \nThis workshop will aim to produce four or more software modules related to intelligent HTC\, and to submit them\, with their documentation\, to the E-CAM software module repository. These will include modules adding HTC support to existing computational chemistry codes\, where the participants will bring the codes they are developing. They may also include modules adding new middleware or adding features to existing middleware that facilitate the use of HTC by the computational chemistry community. This workshop will involve training both in the general topic of designing software to interface with HTC libraries\, and in the details of interfacing with specific middleware packages. \nThe range of use for intelligent HTC in scientific programs is broad. 
For example\, intelligent HTC can be used to select and run many single-point electronic structure calculations in order to develop approximate potential energy surfaces. Even more examples can be found in the wide range of methods that require many trajectories\, where each trajectory can be treated as a task\, such as: \n* rare events methods\, like transition interface sampling\, weighted ensemble\, committor analysis\, and variants of the Bennett-Chandler reactive flux method\n* semiclassical methods\, including the phase integration method and the semiclassical initial value representation\n* adaptive sampling methods for Markov state model generation\n* approaches such as nested sampling\, which use many short trajectories to estimate partition functions \nThe challenge is that most developers of scientific software are not familiar with the way such packages can simplify their development process\, and the packages that exist may not scale to exascale. This workshop will introduce scientific software developers to useful middleware packages\, improve scaling\, and provide an opportunity for scientific developers to add support for HTC to their codes. \nMajor topics that will be covered include: \n* Concepts of HTC; how to structure code for HTC\n* Accessing computational resources to use HTC\n* Interfacing existing C/C++/Fortran code with Python libraries\n* Specifics of interfacing with Celery/COMPSs\n* Challenges in using existing middleware at extreme scale \n[1] Celery: Distributed Task Queue. http://celeryproject.org\, date accessed 14 August 2017. \n[2] R.M. Badia et al. SoftwareX 3-4\, 32 (2015). \n[3] S. Plimpton. J. Comput. Phys. 117\, 1 (1995). \n[4] W.H. Miller. J. Chem. Phys. 105\, 2942 (2001). \n[5] J. Beutier et al. J. Chem. Phys. 141\, 084102 (2014). \n[6] Du et al. J. Chem. Phys. 108\, 334 (1998). \n[7] G.A. Huber and S. Kim. Biophys. J. 70\, 97 (1996).
URL:https://www.e-cam2020.eu/legacy_event/extended-software-development-workshop-intelligent-high-throughput-computing-for-scientific-applications/
LOCATION:CECAM-IT-SIMUL\, Polytechnic University of Turin\, Turin\, Italy
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20180618
DTEND;VALUE=DATE:20180630
DTSTAMP:20260430T215138
CREATED:20180109T155113Z
LAST-MODIFIED:20190110T154126Z
UID:2279-1529280000-1530316799@www.e-cam2020.eu
SUMMARY:Extended Software Development Workshop: Quantum Dynamics
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\nWorkshop Description\nQuantum molecular dynamics simulations describe the behavior of matter at the microscopic scale and require the combined effort of theory and computation to achieve an accurate and detailed understanding of the motion of electrons and nuclei in molecular systems. Theory provides the fundamental laws governing the dynamics of quantum systems\, i.e.\, the time-dependent Schroedinger equation or the Liouville-von Neumann equation\, whereas numerical techniques offer practical ways of solving those equations for applications. For decades now\, theoretical physicists and quantum chemists have been involved in the development of approximations\, algorithms\, and computer software that together have enabled\, for example\, the investigation of photo-activated processes\, like exciton transfer in photovoltaic compounds\, or of nonequilibrium phenomena\, such as the current-driven Joule heating in molecular electronics. The critical challenge ahead is to beat the exponential growth of the numerical cost with the number of degrees of freedom of the studied problem. In this respect\, a synergy between theoreticians and computer scientists is becoming more and more beneficial as high-performance computing (HPC) facilities are nowadays widely accessible\, and will lead to an optimal exploitation of the computational power available and to the study of molecular systems of increasing complexity. \nFrom a theoretical perspective\, the two main classes of approaches to solving the quantum molecular dynamical problem are wavepacket propagation schemes and trajectory-based (or trajectory-driven) methods. The difference between the two categories lies in the way the nuclear degrees of freedom are treated: either fully quantum mechanically or within the (semi)classical approximation. 
In the first case\, basis-function contraction techniques have to be introduced to represent the nuclear wavefunction as soon as the problem exceeds 5 or 6 dimensions. Probably the most successful efforts in this direction have been oriented towards the development of the multi-configuration time-dependent Hartree (MCTDH) method [1]. Other strategies are also continuously proposed\, focusing for instance on the identification of procedures to optimize the “space” where the wavefunction information is computed\, e.g.\, by replacing Cartesian grids with Smolyak grids [2]\, and thus effectively reducing the computational cost of the calculation. In the second case\, the nuclear subsystem is approximated classically\, or semiclassically. Although leading to a loss of some information\, this approximation offers the opportunity to access much larger systems for longer time-scales. Various examples of trajectory-based approaches can be mentioned\, ranging from the simplest\, yet very effective\, trajectory surface hopping and Ehrenfest schemes [3]\, to the more involved but also more accurate coupled-trajectory mixed quantum-classical (CTMQC) [4] and quantum-classical Liouville equation (QCLE) [5]. At the interface between wavepacket and trajectory schemes\, Gaussian-MCTDH [6]\, variational multi-configuration Gaussian (vMCG) [7]\, and multiple spawning [8] exploit the support of trajectories to propagate (Gaussian) wavepackets\, thus recovering some of the information lost with a purely classical treatment. In the case of trajectory-based techniques\, the literature provides a significant number of propositions that aim at recovering some of the quantum-mechanical features of the dynamics via appropriately choosing the initial conditions based on the sampling of a Wigner distribution [9]. \nFrom the computational point of view\, a large part of the calculation effort is spent to evaluate electronic properties. 
In fact\, the nuclei move under the effect of the electronic subsystem\, either “statically” occupying its ground state or “dynamically” switching between excited states. Also\, the nuclear dynamics part of a calculation becomes itself a very costly computational task in the case of wavepacket propagation methods. Therefore\, algorithms for molecular dynamics simulations are not only required to reproduce realistically the behavior of quantum systems in general cases\, but they also have to scale efficiently on parallelized HPC architectures. \nThe extended software development workshop (ESDW) planned for 2018 has three main objectives: (i) build upon the results of ESDW7 of July 2017 to enrich the library of software for trajectory-based propagation schemes; (ii) extend the capabilities of the existing modules by including new functionalities\, thus broadening the class of problems that can be tackled; (iii) construct links among the existing and the new modules to transversally connect methods for quantum molecular dynamics\, types of modules (HPC/Interface/Functionality)\, and E-CAM work-packages (WP2 on electronic structure). \nThe central projects of the proposed ESDW\, which are related to the modules that will be provided for the E-CAM library\, are:\n1. Extension of the ModLib library of model Hamiltonians\, especially including high-dimensional models\, which are used to test and compare existing propagation schemes\, but also to benchmark new methods. The library consists of a set of subroutines that can be included in different codes to generate diabatic/adiabatic potential energy surfaces\, and eventually\, diabatic and nonadiabatic couplings\, necessary for both quantum wavepacket methods and trajectory-based methods.\n2. Use of machine-learning techniques to construct analytical forms of potential energy surfaces based on information collected along on-the-fly calculations. 
The Quantics software [10] provides the platform for performing direct-dynamics propagation employing electronic-structure properties determined at several different levels of theory (HF\, DFT\, or CASSCF\, for example). The sampled nuclear configuration space is employed to build a “library” of potentials that will be used for generating the potential energy surfaces.\n3. Development of an interface for CTMQC. Based on the CTMQC module proposed during the Extended Software Development Workshop (ESDW) 7\, the interface will allow the evolution of the coupled trajectories according to the CTMQC equations based on electronic-structure information calculated from quantum-chemistry packages\, developing a connection between the E-CAM WP2 on electronic structure and WP3 on quantum dynamics. Potentially\, CTMQC can be adapted to the Quantics code\, since the latter has already been interfaced with several electronic-structure packages. Optimal scaling on HPC architectures is fundamental for maximizing efficiency.\n4. Extension of the QCLE module developed during the ESDW7 to high dimensions and general potentials. Two central issues need to be addressed to reach this goal: (i) the use of HPC infrastructures to efficiently parallelize the multi-trajectory implementation\, and (ii) the investigation of the stochastic sampling scheme associated with the electronic part of the time evolution. Progress in these areas will aid greatly in the development of this quantum dynamics simulation tool that could be used by the broader community.\n5. Development of a module to sample initial conditions for trajectory-based procedures. Based on the PaPIM module proposed during the ESDW7\, sampling of initial conditions from a Wigner distribution will be adapted to excited-state problems\, overcoming the usual approximation of a molecule pictured as a set of uncoupled harmonic oscillators. 
Moreover\, adequate sampling of the ground vibrational nuclear wavefunction would enable accurate calculations of photoabsorption cross-sections. This topic connects various modules of the E-CAM WP3\, since it can be employed for CTMQC\, QCLE\, and the surface-hopping functionality (SHZagreb\, developed during ESDW7) of Quantics.\n6. Optimization of some of the modules for HPC facilities\, adopting hybrid OpenMP-MPI parallelization approaches. The main goal here is to exploit different architectures by adapting different kinds of calculations\, e.g.\, classical evolution of trajectories vs. electronic-structure calculations\, to the architecture of the computing nodes. \nThe format and organization described here focus specifically on the production of new modules. Parallel or additional activities\, e.g.\, a transversal workshop on optimization of I/O with electronic-structure codes and possible exploitation of advanced hardware infrastructures (e.g.\, the booster cluster in Juelich)\, will also be considered based on input from the community. \n[1] H. D. Meyer\, U. Manthe\, L. S. Cederbaum. Chem. Phys. Lett. 165 (1990) 73.\n[2] D. Lauvergnat\, A. Nauts. Spectrochimica Acta Part A 119 (2014) 18.\n[3] J. C. Tully. Faraday Discuss. 110 (1998) 407.\n[4] S. K. Min\, F. Agostini\, I. Tavernelli\, E. K. U. Gross. J. Phys. Chem. Lett. 8 (2017) 3048.\n[5] R. Kapral. Annu. Rev. Phys. Chem. 57 (2006) 129.\n[6] G. A. Worth\, I. Burghardt. Chem. Phys. Lett. 368 (2003) 502.\n[7] B. Lasorne\, M. J. Bearpark\, M. A. Robb\, G. A. Worth. Chem. Phys. Lett. 432 (2006) 604.\n[8] M. Ben-Nun\, J. Quenneville\, T. J. Martínez. J. Phys. Chem. A 104 (2000) 5161.\n[9] J. Beutier\, D. Borgis\, R. Vuilleumier\, S. Bonella. J. Chem. Phys. 141 (2014) 084102.\n[10] Quantics. A suite of programs for molecular quantum dynamics. http://stchem.bham.ac.uk/~quantics/doc/\n[11] PaPIM. A code for calculation of equilibrated system properties (observables). 
http://e-cam.readthedocs.io/en/latest/Quantum-Dynamics-Modules/modules/PaPIM/readme.html
URL:https://www.e-cam2020.eu/legacy_event/extended-software-development-workshop-quantum-dynamics/
LOCATION:CECAM-FR-MOSER\, Maison de la Simulation\, Saclay\, France
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20180523
DTEND;VALUE=DATE:20180526
DTSTAMP:20260430T215138
CREATED:20180109T152604Z
LAST-MODIFIED:20180206T125813Z
UID:2262-1527033600-1527292799@www.e-cam2020.eu
SUMMARY:Scoping workshop: Building the bridge between theories and software: SME as a boost for technology transfer in industrial simulative pipelines
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\nWorkshop Description\nIn the computational chemistry/physics realm\, statistical mechanics\, electronic structure\, and multiscale modeling are three of the theoretical tools that enable understanding and modelling of physicochemical processes. Within these frameworks\, many theoretical/computational methods have been reported in the literature over the last decades. Despite their remarkable value in terms of novel ideas and theories\, such approaches are often far from practical applicability within industrial settings [1]. This is mainly because: i) these algorithms are often written in rather inefficient programming languages and are therefore not fully optimized for new-generation hardware architectures; ii) these methods can be very accurate from the physics standpoint while remaining quite far from the industrial need for a suitable tradeoff between speed and accuracy [2]. Paradoxically\, experiments can often be faster (even though more expensive) than computational predictions. Therefore\, companies in different areas are actively seeking more reliable\, yet reasonably fast\, computational methods to reduce the overall costs of industrial R&D pipelines. To mention just a few examples\, this is the case for drug discovery\, where companies are looking for innovative approaches to accurate kinetics and thermodynamics predictions\, and for the materials industry\, where the design of new nanostructures with improved features could greatly benefit from computational simulations. There exists\, however\, a clear and long-standing gap between the theoretical chemistry/physics community and industry\, which is looking for efficient\, user-friendly\, and professional software solutions to be utilized in many different areas. 
Against this scenario\, small/medium enterprises (SMEs) that develop simulation software can play an increasingly key role\, not only in translating the science developed in academia via a proper technology-transfer process\, but also in building a scientific bridge between industry requirements in terms of automation and the new theories and algorithms developed at the academic level\, where a systematic and practical exploitation of the algorithms is frequently overlooked. It is crucial to note that transforming academic algorithms into usable software is not only a matter of software engineering: it often also means reconsidering the original theories and formalisms\, since a new algorithm that works quickly and accurately on a system of a few hundred atoms will not necessarily be appropriate for more complex systems with huge numbers of degrees of freedom. In this context\, software-development SMEs with a clear mission towards top-level science suitable for industrial settings may represent the missing link in the pipeline from theory to software. \nIn the present E-CAM workshop we will discuss and dissect some key issues related to the aspects reported above. First\, we will try to answer the question: what is the most appropriate driver of innovation\, top-level academic science or the industrial need to accelerate R&D towards novel and cheaper products? Traditionally\, technology has been conceived as a corollary of scientific research. However\, there is compelling evidence that for several mid-term projects an industry-requirements-driven approach is not only feasible but may be best suited. A closely related topic is how to match and synchronize curiosity-driven research with industrial needs\, and how to manage the resulting\, possibly mixed academic/industrial\, intellectual property. 
Can this ‘engineering’ or ‘polytechnic’ approach to science/technology transfer be the way to boost the European fabric of technological SMEs? Furthermore\, considering the scale at which economic phenomena occur today\, the workshop will discuss whether European SMEs should federate or merge and join efforts to reach critical mass and create a significant reference point in the simulation domain worldwide (the American way?). Interestingly\, universities and research centers throughout Europe could be the seeds\, the starting glue\, for this aggregation process. To reach such goals\, the key is coordination among the various CoEs in Europe towards a common scientific and technological strategy for the creation of professional software for industry and for commercial exploitation.\nIn this E-CAM workshop\, we aim to create a forum where top-level E-CAM scientists with expertise in statistical mechanics\, multiscale modeling\, and electronic structure will discuss with representatives of the pharmaceutical and materials industries\, with the final objective of identifying the major gaps that still hamper a systematic exploitation of accurate computer simulations in industrial R&D. The presence of SMEs will also be crucial to understand whether potential gaps can be filled by small high-tech companies\, with the aim of defining a clear workflow and building a bridge between new theories and professional software solutions. As anticipated\, special attention will be given to the role of SMEs devoted to simulation-software development. These may be key bridges between academic developments and the creation of tools and interfaces easily transferable to industrial partners. SMEs are\, however\, players with specific needs in the domain of intellectual property\, developing a viable business model\, and positioning themselves between academic research and industry. 
These aspects\, and their possible relationship with the CoEs\, will be addressed in the workshop. \n[1] Kuhn et al.\, “A Real-World Perspective on Molecular Design”\, J. Med. Chem.\, 2016\, 59 (9)\, pp 4087–4102\n[2] Yibing Shan\, Eric T. Kim\, Michael P. Eastwood\, Ron O. Dror\, Markus A. Seeliger\, and David E. Shaw\, “How Does a Drug Molecule Find Its Target Binding Site?”\, J. Am. Chem. Soc.\, 2011\, 133 (24)\, pp 9181–9183
URL:https://www.e-cam2020.eu/legacy_event/scoping-workshop-building-the-bridge-between-theories-and-software-sme-as-a-boost-for-technology-transfer-in-industrial-simulative-pipelines/
LOCATION:Istituto Italiano di Tecnologia\, Genova\, Italy
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20180514
DTEND;VALUE=DATE:20180516
DTSTAMP:20260430T215138
CREATED:20180116T151857Z
LAST-MODIFIED:20180207T170131Z
UID:2351-1526256000-1526428799@www.e-cam2020.eu
SUMMARY:Scoping workshop: Solubility prediction
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\n\nWorkshop Description\n\nE-CAM is an H2020 project that aims to create\, develop\, and sustain a European infrastructure for computational science applied to simulation and modelling of materials and of biological processes of industrial and societal interest. Building on the already significant network of 15 CECAM centres across Europe and the PRACE initiative\, E-CAM creates a distributed centre for simulation and modelling across the electronic\, molecular\, and continuum length scales. The centre builds on the considerable European expertise and capability in this area of significant industrial and scientific relevance. The objective is to make a very strong impact on the European economy by developing a key industrial capability in the rapidly growing area of technological innovation through computer modelling.\n\nThe ambitious goals of E-CAM will be achieved through complementary instruments: the development\, testing\, maintenance\, and dissemination of software targeted at end-user needs; the provision of an environment for the long-term optimisation and maintenance of academic codes; and support to ensure that\, in future\, these codes are properly exploited by industry.\n\nE-CAM will provide two scoping workshops per year. These will ensure a strong connection with our industrial partners. One of the two workshops will be broad in scope\, allowing industrial partners from different sectors to interact and to discuss new pilot projects across all four scientific work packages of E-CAM. 
The second workshop will be deep\, concentrating on one or two scientific areas of particular interest to a number of our partners.\n\nAt the Mainz scoping workshop in September 2016\, industrial partners expressed a strong interest in the problem of solubility prediction\, which will be the subject of this scoping workshop.\n\n\nIt has been reported that over 75% of drug-development candidates have low solubility based on the Biopharmaceutics Classification System (BCS). An increasing trend towards low solubility is a major issue for drug development\, as the formulation of low-solubility compounds can be problematic. Despite tremendous efforts\, a definitive\, accurate\, and comprehensive approach to predicting solubility has proven elusive. Consequently\, there have been a number of attempts to probe changes in solubility as a function of structural changes in specific classes of molecules\, as well as systematic approaches looking at matched molecular pairs to determine improved solubility as a function of inferred crystal-packing disruption. The focus of this workshop will be on the tools that allow an unprecedented deconstruction of the relative importance of molecular solvation and crystal packing for solubility. Recent work includes a systematic experimental approach to examine key thermodynamic functions\, such as sublimation and hydration properties\, as a function of structural modifications\, and a comprehensive computational approach to lattice-energy estimation from molecular descriptors. A recent review has analysed simple predictive methods for the estimation of aqueous solubility and the specific use of chemical informatics and theory to predict the solubility of drug-like molecules [1]. A recent paper highlights the potential of these approaches and the attempts to build scientific bridges across the two communities. The paper [2] uses co-crystals to optimise the dissolution rate of a psychotropic drug with known dissolution challenges. 
\nSolubility calculations have generally been carried out by two different approaches [3]: \n(1) the thermodynamic approach\, which seeks the concentration at which the chemical potential of the electrolyte in solution equals that of the pure solid; (2) a direct coexistence approach\, in which the solution is equilibrated with a solid configuration (typically either a slab or a selected crystal environment) and the electrolyte concentration in the solution phase\, sufficiently far from the crystal surface\, is taken to be the solubility [4]. \nThe algorithms for the calculation of solubility will be examined in detail at the workshop. Essentially\, the chemical potential of a salt in the solid phase is given by the Gibbs free energy per molecule\, which in turn is related to the Helmholtz free energy of the solid\, estimated using the Einstein model\, and to the molar volume of the solid at a fixed pressure\, which can be determined by performing constant-NpT simulations of the solid at room temperature. The chemical potential of the solution can be calculated from the derivative of the Gibbs free energy of the solution with respect to the number of molecules. The Gibbs free energy can be estimated using a coupling-parameter method combined with a technique such as MBAR or WHAM [5]. The derivative is calculated numerically by performing a number of simulations at different solute concentrations. The solubility limit is obtained when the chemical potentials of the solution and of the solid are equal. \nIn the complementary area of structure-activity relationships [6]\, we will discuss an automatic model-generation process for building QSAR models using Gaussian processes\, a powerful machine-learning modeling method. 
We will examine the stages of the process that ensure models are built and validated within a rigorous framework: descriptor calculation; splitting data into training\, validation\, and test sets; descriptor filtering; application of modeling techniques; and selection of the best model. We will explore the effectiveness of the automatic model-generation process for two types of data sets commonly encountered in building ADME QSAR models: a small set of in vivo data and a large set of physico-chemical data. \nReferences\n[1] D. Elder and R. Holm\, Int. J. Pharm.\, 453\, 3-11 (2013).\n[2] D. Elder\, R. Holm\, R. de Diego\, H. Lopez\, Int. J. Pharm.\, 453\, 88-100 (2013).\n[3] I. Nezbeda\, F. Moucka and W. R. Smith\, Mol. Phys.\, 1665-1690 (2016).\n[4] J. L. Aragones\, E. Sanz\, and C. Vega\, J. Chem. Phys.\, 136\, 244508 (2012).\n[5] R. Gozalbes\, A. Pineda-Lucena\, Bioorg. Med. Chem.\, 18\, 7078–7084 (2010).\n[6] Shaoxin Feng and Tonglei Li\, J. Chem. Theory Comput.\, 2\, 149-156 (2006).
URL:https://www.e-cam2020.eu/legacy_event/scoping-workshop-solubility-prediction/
LOCATION:CECAM-FR-RA\, Lyon and Grenoble\, France
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20180424
DTEND;VALUE=DATE:20180427
DTSTAMP:20260430T215138
CREATED:20180109T150123Z
LAST-MODIFIED:20180206T125701Z
UID:2254-1524528000-1524787199@www.e-cam2020.eu
SUMMARY:Scoping workshop: Dissipative particle dynamics - Where do we stand on predictive application?
DESCRIPTION:If you are interested in attending this event\, please visit the CECAM website here.\nWorkshop Description\nDissipative particle dynamics (DPD) has seen widespread uptake since its inception as a relatively simple and inexpensive coarse-grained modeling tool ideally suited to the study of soft condensed matter systems. DPD is perhaps unusual in that its development has been driven as much by the needs of industry as by academic research. We anticipate significant industrial participation\, and therefore propose to allocate ample space in the program to industrially relevant use cases (note that two of the organizers are from industry). Despite the scientific advances and the early industrial applications\, there remain several open questions\, both in the foundations of the method and in advanced applications (some of which are listed below)\, that prevent the method from being used in a predictive fashion in an industrial setting. \nWe propose to bring together the leaders in the field to ask: where can DPD offer predictive insight currently\, and what is required to improve the method and its application to enable better predictive capability in the future? We aim to share insights\, identify approaches to solve key challenges\, and hone the ongoing active research programme. A key driver of the workshop is also to maintain a close community in this field across academia and industry\, necessary to move the field forwards. Note that the agenda of this workshop has been drafted to be in line with the E-CAM scoping workshop activities. \nWe aim to spend some time on the final day\, in the style of an E-CAM workshop\, discussing the software landscape that supports the DPD community\, and will touch upon where extreme-scale computing can contribute. 
This is to ensure that the world-leading researchers in this field are backed up by high-quality software that is fit for purpose\, and to begin to bring the scientific leaders together with the leading software developers. Note that two members of the organizing committee are directly involved with the E-CAM project\, respectively as Supervisor and Member of the Executive Board. \nThis proposal follows on from an earlier workshop held in 2014\, “Dissipative particle dynamics: foundations to applications”. That workshop brought the community together for the first time since 2008 to identify and discuss the challenges in the field. Topics such as “Is a consensus emerging about how to parameterize the method?” and using DPD to couple between atomistic and continuum length scales were discussed with great interest. The community identified a number of key areas for future development\, with specific emphasis on the fact that DPD should move from a descriptive to a predictive method over the next few years; hence the focus of the current proposal. By 2018\, four years will have passed since the previous workshop\, and in this time there have been a number of exciting developments in the parameterization of the DPD model and in the sophistication of the applications tackled with the method. We propose that now is a good time for the community to come together\, supported by a CECAM workshop\, to ask the question: is DPD becoming a predictive modeling and simulation tool for academic and industrial application? 
\nChallenges \nSome key barriers exist to applying DPD as a predictive model: \n● Do robust parameterization methods exist that enable predictive simulations?\n● Can such coarse-grained potentials be extended to different families of compounds\, or are they molecule/system-dependent?\n● Is the application of electrostatics in DPD solved or not?\n● How do we treat solvents of different nature?\n● Do many-body methods play an important role in predictive applications?\n● What is the real computational gain in DPD? Time and length scales?\n● Many industrial applications of DPD involve interactions with surfaces; can DPD provide realistic representations of these?\n● Does the software exist to support predictive simulations?\n● Do we have the analytics to extract appropriate data from simulations\, e.g.\, viscosity?\nOutputs \nWe would like to ensure the workshop is more than just a collection of talks. To this end\, we will identify two or three key challenges and construct a roadmap for approaching them. This roadmap would form the basis for further meetings to discuss progress on those key challenges\, e.g.\, a UK-based meeting hosted by CCP5 in 2019 with both academic and industrial participation. \nWithout prejudging the outcome\, we expect that one of the challenges will be consensus on parameterization methodology. A second key challenge could focus on a particular application area\, for example surfactant phase science\, rheology\, interfacial structures\, etc.
URL:https://www.e-cam2020.eu/legacy_event/scoping-workshop-dissipative-particle-dynamics-where-do-we-stand-on-predictive-application/
LOCATION:CECAM-UK-HARTREE\, Daresbury\, United Kingdom
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20170918
DTEND;VALUE=DATE:20170930
DTSTAMP:20260430T215138
CREATED:20161217T151156Z
LAST-MODIFIED:20190110T154203Z
UID:595-1505692800-1506729599@www.e-cam2020.eu
SUMMARY:Extended Software Development Workshop: Meso and multiscale modeling
DESCRIPTION:If you are interested in attending this workshop\, please visit the CECAM website below.
URL:https://www.e-cam2020.eu/legacy_event/extended-software-development-workshop-meso-and-multiscale-modeling-2/
LOCATION:CECAM-DE-MMS Node\, Germany
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20170918
DTEND;VALUE=DATE:20170921
DTSTAMP:20260430T215138
CREATED:20161215T144658Z
LAST-MODIFIED:20190110T154222Z
UID:579-1505692800-1505951999@www.e-cam2020.eu
SUMMARY:Scoping Workshop: From the Atom to the Molecule
DESCRIPTION:If you are interested in attending this workshop\, please visit the CECAM website below.
URL:https://www.e-cam2020.eu/legacy_event/scoping-workshop-from-the-atom-to-the-molecule/
LOCATION:CECAM-UK-JCMAXWELL Node\, United Kingdom
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20170814
DTEND;VALUE=DATE:20170826
DTSTAMP:20260430T215138
CREATED:20161215T142851Z
LAST-MODIFIED:20190110T154238Z
UID:573-1502668800-1503705599@www.e-cam2020.eu
SUMMARY:Extended Software Development Workshop: Classical Molecular Dynamics
DESCRIPTION:If you are interested in attending this workshop\, please visit the CECAM website below.
URL:https://www.e-cam2020.eu/legacy_event/extended-software-development-workshop-classical-molecular-dynamics/
LOCATION:Lorentz Centre\, Leiden\, Netherlands
CATEGORIES:E-CAM event
ATTACH;FMTTYPE=image/jpeg:https://www.e-cam2020.eu/wp-content/uploads/2016/12/IMG_1535-e1502886791331.jpg
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20170717
DTEND;VALUE=DATE:20170729
DTSTAMP:20260430T215138
CREATED:20161217T150456Z
LAST-MODIFIED:20190110T154252Z
UID:589-1500249600-1501286399@www.e-cam2020.eu
SUMMARY:Extended Software Development Workshop: Quantum MD
DESCRIPTION:If you are interested in attending this workshop\, please visit the CECAM website below.
URL:https://www.e-cam2020.eu/legacy_event/extended-software-development-workshop-quantum-md/
LOCATION:CECAM-IRL Node\, Kingston Glebe\, Clifden\, Connemara\, H71WY19\, Ireland
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20170706
DTEND;VALUE=DATE:20170708
DTSTAMP:20260430T215138
CREATED:20170327T132617Z
LAST-MODIFIED:20190110T154311Z
UID:871-1499299200-1499471999@www.e-cam2020.eu
SUMMARY:Extreme-Scale State-of-the-Art Workshop
DESCRIPTION:Goals of the Workshop: \nThe central goal of the 1st E-CAM Extreme-Scale State-of-the-Art Workshop is to provide a forum for fellow E-CAM application end users and developers to: \n\nIdentify emerging extreme-scale computing requirements across the centre\, including from both academia and industry partners\nIncrease the centre’s awareness of current and emerging HPC hardware and software technologies on the road to exascale computing\nIncrease the centre’s awareness of PRACE services (Advanced Training\, software enablement\, and industry interactions)\nInterface with other members of the European HPC community\nIdentify themes of future interest for the centre on the road to exascale computing\n\nIf you wish to apply for this workshop\, please do so through the CECAM website here.
URL:https://www.e-cam2020.eu/legacy_event/e-cam-extreme-scale-state-of-the-art-workshop/
LOCATION:Faculty of Physics\, University of Barcelona\, Carrer Marti i Franques 1\, Barcelona\, 08028 \, Spain
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20170703
DTEND;VALUE=DATE:20170715
DTSTAMP:20260430T215138
CREATED:20161217T150911Z
LAST-MODIFIED:20190110T154335Z
UID:592-1499040000-1500076799@www.e-cam2020.eu
SUMMARY:Extended Software Development Workshop: Meso and multiscale modeling
DESCRIPTION:If you are interested in attending this workshop\, please visit the CECAM website below.
URL:https://www.e-cam2020.eu/legacy_event/extended-software-development-workshop-meso-and-multiscale-modeling/
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20170529
DTEND;VALUE=DATE:20170601
DTSTAMP:20260430T215138
CREATED:20161215T144255Z
LAST-MODIFIED:20190110T154352Z
UID:575-1496016000-1496275199@www.e-cam2020.eu
SUMMARY:State of the Art Workshop: Meso and Multiscale Modelling
DESCRIPTION:If you are interested in attending this workshop\, please visit the CECAM website below.
URL:https://www.e-cam2020.eu/legacy_event/state-of-the-art-workshop-meso-and-multiscale-modelling/
LOCATION:O’Brien Centre for Science\, University College Dublin\, Dublin\, Ireland
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20161114
DTEND;VALUE=DATE:20161126
DTSTAMP:20260430T215138
CREATED:20160515T220507Z
LAST-MODIFIED:20190110T154408Z
UID:277-1479081600-1480118399@www.e-cam2020.eu
SUMMARY:Extended Software Development Workshop: Trajectory Sampling
DESCRIPTION:This is the third of E-CAM’s extended software development workshops\, on the theme of trajectory sampling.
URL:https://www.e-cam2020.eu/legacy_event/extended-software-development-workshop-trajectory-sampling/
LOCATION:Traunkirchen\, Austria
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20160912
DTEND;VALUE=DATE:20160917
DTSTAMP:20260430T215138
CREATED:20160718T134955Z
LAST-MODIFIED:20190110T154422Z
UID:359-1473638400-1474070399@www.e-cam2020.eu
SUMMARY:Extended Software Development Workshop: Wannier90
DESCRIPTION:The aim of the workshop is to share recent developments related to the generation and use of maximally-localised Wannier functions and to either implement these developments in\, or interface them to\, the Wannier90 code. It will also be an opportunity to improve and update existing interfaces to other codes and to write new ones. The format will be deliberately open\, with the majority of the time allocated to coding and discussion.
URL:https://www.e-cam2020.eu/legacy_event/extended-software-development-workshop-wannier90/
LOCATION:San Sebastian\, Spain
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20160912
DTEND;VALUE=DATE:20160915
DTSTAMP:20260430T215138
CREATED:20160515T220258Z
LAST-MODIFIED:20190110T154438Z
UID:274-1473638400-1473897599@www.e-cam2020.eu
SUMMARY:State of the art workshop: Electronic Structure
DESCRIPTION:This is the third state-of-the-art workshop for 2016. It is organised by the CECAM-UK-HARTREE node and will focus on electronic structure. State-of-the-art workshops provide a forum to survey new methods and developments in simulation. These workshops inform the software that will be developed for the E-CAM library.
URL:https://www.e-cam2020.eu/legacy_event/state-of-the-art-workshop-electronic-structure/
LOCATION:Daresbury Laboratory\, Daresbury\, United Kingdom
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20160907
DTEND;VALUE=DATE:20160910
DTSTAMP:20260430T215138
CREATED:20160719T071652Z
LAST-MODIFIED:20190110T154457Z
UID:364-1473206400-1473465599@www.e-cam2020.eu
SUMMARY:E-CAM Scoping Workshop: Simulation and Modelling in Industry
DESCRIPTION:The meeting has been especially designed for industry and will focus on three main areas\, providing: \n\na) A detailed account of the state of the art in our key areas of simulation and of data-driven modelling\nb) A discussion of the direction of the E-CAM project in years 2 and 3\, to ensure that the project is addressing the needs of our industrial partners\nc) A forum for sharing best practices in simulation and modelling in industrial environments\, including considerations of hardware and robust software.
URL:https://www.e-cam2020.eu/legacy_event/e-cam-scoping-workshop-simulation-and-modelling-in-industry/
LOCATION:Max Planck Institute for Polymer Research\, Mainz\, Germany
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20160829
DTEND;VALUE=DATE:20160903
DTSTAMP:20260430T215138
CREATED:20160515T215946Z
LAST-MODIFIED:20190110T153604Z
UID:268-1472428800-1472860799@www.e-cam2020.eu
SUMMARY:Reaction Coordinates from Molecular Trajectories
DESCRIPTION:This is the second of E-CAM’s state-of-the-art workshops\, providing a discussion of developments in the field and assessing the impact of the developed software on the academic community.
URL:https://www.e-cam2020.eu/legacy_event/reaction-coordinates-from-molecular-trajectories/
LOCATION:Lorentz Centre\, Leiden\, Netherlands
CATEGORIES:E-CAM event
END:VEVENT
BEGIN:VEVENT
DTSTART;VALUE=DATE:20160627
DTEND;VALUE=DATE:20160709
DTSTAMP:20260430T215138
CREATED:20160515T215722Z
LAST-MODIFIED:20190110T154522Z
UID:265-1466985600-1468022399@www.e-cam2020.eu
SUMMARY:Quantum Mechanics and Electronic Structure
DESCRIPTION:This is the second of E-CAM’s Extended Software Development Workshops (ESDW)\, which will take place in Paris. ESDWs are training events that include coding sessions and training lectures on computer hardware\, advances in new architectures\, parallel programming techniques\, and more.
URL:https://www.e-cam2020.eu/legacy_event/quantum-mechanics-and-electronic-structure/
LOCATION:Maison de la Simulation\, Paris\, France
CATEGORIES:E-CAM event
ATTACH;FMTTYPE=image/jpeg:https://www.e-cam2020.eu/wp-content/uploads/2016/04/graphic-ecam-wide.jpg
END:VEVENT
END:VCALENDAR