NEW METHODS AND ALGORITHMS FOR THE "FOURTH PARADIGM OF SCIENCE" IN THE ERA OF DELUGE OF DATA AND COMPUTING POWER
Luisa Carracciuolo
2020
Abstract
The abundance of data available, as never before in history, in many scientific contexts is driving what many now call the "fourth paradigm of science". At the same time, projects that aim to create "computing ecosystems" capable of delivering computational power on the order of 10^18 operations per second (e.g., the USA Exascale Computing Project and the European Technology Platform for HPC) are beginning to bear fruit. For scientific research to take concrete advantage of all this "abundance", the tools that have so far supported scientific computing must be rethought, for example by: 1) (re)designing algorithms which, in light of new supercomputing infrastructures characterized by a very large number of components and great heterogeneity, must offer high levels of granularity and locality (e.g., communication-avoiding algorithms for the solution of linear systems); 2) implementing methods that exploit a full decomposition of the models' space-time domain (e.g., "parallel in time" algorithms for the solution of the differential equations describing evolutionary phenomena; a sketch of this idea follows the abstract); 3) moving beyond the classic "data assimilation" techniques for the correction/definition of models, toward new strategies for integrating models and data (e.g., the use of "machine learning" techniques for the data-driven discovery of partial differential equations).
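The abstract only names the three directions above; as a purely illustrative aid, the sketch below shows one of them, a minimal "parallel in time" iteration in the spirit of the Parareal method, using a cheap forward-Euler coarse propagator and a finer forward-Euler propagator over each time slice. All function names, step counts, and the test problem are assumptions made for this example, not material from the source; in a real implementation the fine solves within each iteration are independent and would be distributed across processors.

```python
# Minimal Parareal-style sketch for dy/dt = f(t, y) on [t0, T] (illustrative only).
import numpy as np

def parareal(f, y0, t0, T, n_slices=10, n_fine=100, n_iter=5):
    ts = np.linspace(t0, T, n_slices + 1)

    def euler(y, ta, tb, steps):
        """Forward-Euler propagation from ta to tb with the given step count."""
        h = (tb - ta) / steps
        t = ta
        for _ in range(steps):
            y = y + h * f(t, y)
            t += h
        return y

    coarse = lambda y, ta, tb: euler(y, ta, tb, 1)       # cheap, serial G
    fine = lambda y, ta, tb: euler(y, ta, tb, n_fine)    # expensive, parallelizable F

    # Initial guess: propagate serially with the coarse solver.
    U = [y0]
    for n in range(n_slices):
        U.append(coarse(U[n], ts[n], ts[n + 1]))

    for _ in range(n_iter):
        # Fine solves: independent across slices (parallel in a real code).
        F_old = [fine(U[n], ts[n], ts[n + 1]) for n in range(n_slices)]
        G_old = [coarse(U[n], ts[n], ts[n + 1]) for n in range(n_slices)]
        # Sequential correction sweep: U_{n+1} = G_new + F_old - G_old.
        U_new = [y0]
        for n in range(n_slices):
            G_new = coarse(U_new[n], ts[n], ts[n + 1])
            U_new.append(G_new + F_old[n] - G_old[n])
        U = U_new
    return ts, np.array(U)

if __name__ == "__main__":
    # Scalar test problem y' = -y, y(0) = 1, with exact solution exp(-t).
    ts, U = parareal(lambda t, y: -y, np.array([1.0]), 0.0, 5.0)
    print("max error at slice boundaries:", np.max(np.abs(U[:, 0] - np.exp(-ts))))
```

The design choice illustrated here is the one the abstract alludes to: the expensive fine propagations are decoupled across the time slices, so the time direction itself becomes a source of concurrency, while only the cheap coarse sweep remains sequential.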