Committees & GDR Partners (2012 - 2015)

You can also see the committees and partners of the GDR from 2008 to 2011.

Administrative affiliation

The GDR MASCOT-NUM is a CNRS research entity (France), attached to the INSMI institute (Mathematical Sciences and their Interactions).

GDR Coordinators

Communication committee

Scientific committee


Organizing committee

Working group coordinators

The extensive use of numerical simulations opens the way for new types of experimentation and designs of experiments: more intensive explorations become possible, more hazardous configurations can be tested and, hopefully, a better understanding and optimized responses can be achieved, together with more accurate statements about risks and failures. At the same time, complex simulations require long computations, which limits what can be learnt in a reasonable time.
The domain of design and analysis of computer experiments aims at defining what the inputs of a numerical model should be in order to achieve a prescribed objective. In particular, one may want to:
(i) predict the behavior of a numerical model from the results of a small number of runs;
(ii) optimize the response of a numerical model, that is, determine the values of the inputs corresponding, for example, to the highest performance or the smallest cost;
(iii) estimate the variability of a response as a function of that of the inputs (also known as sensitivity analysis);
(iv) estimate a probability of failure in the presence of uncertainties, when some inputs are randomized with a given probability measure.
Whereas space-filling designs are commonly used for the first objective, different types of designs may be more relevant in other situations. Sequential strategies (or active learning), which construct a model of the numerical simulator step by step, are especially attractive. The topics considered in this Working Group cover the definition of design criteria related to a given objective, the construction of efficient algorithms for the determination of optimal experiments, the investigation of asymptotic properties of designs, and the construction of designs for simulators with several levels of predictive accuracy. Experiments on real physical systems, where purely random errors generally corrupt the observations, are also considered.
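As an illustration of a space-filling design, here is a minimal sketch (in Python with NumPy; the function names and the random-restart strategy are illustrative choices, not part of the GDR's material) of a maximin Latin hypercube: among several random Latin hypercubes, it keeps the one whose closest pair of points is farthest apart.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """n points in [0, 1]^d, with exactly one point per axis-aligned stratum."""
    samples = np.empty((n, d))
    for j in range(d):
        # shuffle the n strata of dimension j, then jitter inside each stratum
        samples[:, j] = (rng.permutation(n) + rng.random(n)) / n
    return samples

def min_pairwise_distance(x):
    """Smallest Euclidean distance between any two design points."""
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    np.fill_diagonal(dist, np.inf)
    return dist.min()

def maximin_lhs(n, d, n_restarts=200, seed=0):
    """Keep the random Latin hypercube with the largest minimum distance."""
    rng = np.random.default_rng(seed)
    best, best_score = None, -np.inf
    for _ in range(n_restarts):
        x = latin_hypercube(n, d, rng)
        score = min_pairwise_distance(x)
        if score > best_score:
            best, best_score = x, score
    return best

design = maximin_lhs(n=20, d=2)
print(design.shape, min_pairwise_distance(design))
```

In a sequential (active learning) strategy, such a design would typically serve as the initial batch of runs, with later points chosen one at a time according to a model-based criterion.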
Uncertainty analysis aims to understand the impact of the input variables, or of noise on these input variables, on the code output variables.
Usually, deterministic tools of differential calculus are used.
For example, a basic index is obtained via the derivative of one output with respect to one input.
Such an approach remains local, because derivatives are generally computed at specific points.
In this deterministic approach, we are interested in automatic differentiation tools, which aim at efficiently computing the derivatives of complex codes.
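To make the idea concrete, the following is a minimal sketch of forward-mode automatic differentiation using dual numbers (a pure-Python illustration; the `Dual` class and the toy `code` function are hypothetical stand-ins, not tools used by the GDR): each arithmetic operation propagates an exact derivative alongside the value, yielding a local sensitivity index at a chosen point.

```python
class Dual:
    """Dual number a + b*eps with eps**2 == 0: carrying b through
    arithmetic propagates exact derivatives (forward-mode AD)."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)

    __rmul__ = __mul__

def code(x1, x2):
    # stand-in for a complex simulator: y = x1*x2 + 3*x1
    return x1 * x2 + 3 * x1

# local sensitivity of y to x1 at the point (2, 5):
# seed x1 with derivative 1, keep x2 constant
y = code(Dual(2.0, 1.0), Dual(5.0, 0.0))
print(y.value, y.deriv)   # 16.0, and dy/dx1 = x2 + 3 = 8.0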
The stochastic approach to uncertainty analysis aims at studying global criteria based on the modeling of the joint probability density function of the problem variables.
The resulting sensitivity indices describe the global variability of the phenomena.
For example, the first-order Sobol sensitivity index of an input is given by the ratio between the variance of the conditional expectation of the output given that input and the total output variance.
The computation of such quantities leads to very interesting statistical problems that we propose to study.
For example, the efficient estimation of sensitivity indices from a few runs relates to semi-parametric or non-parametric estimation techniques.
The stochastic modeling of the input/output relationship is another solution.
We can look for models with specific properties (parsimonious representation using ad hoc response surfaces, remarkable algebraic properties such as orthogonality, etc.).
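As a sketch of such an estimation problem, the following estimates first-order Sobol indices S_i = Var(E[Y|X_i]) / Var(Y) with the classical pick-freeze Monte Carlo scheme (Python/NumPy; the `toy_model`, sample size and uniform input distribution are illustrative assumptions):

```python
import numpy as np

def first_order_sobol(model, d, n=100_000, seed=0):
    """Pick-freeze Monte Carlo estimate of the first-order indices
    S_i = Var(E[Y | X_i]) / Var(Y), inputs i.i.d. uniform on [0, 1]."""
    rng = np.random.default_rng(seed)
    a = rng.random((n, d))          # first independent sample
    b = rng.random((n, d))          # second independent sample
    ya = model(a)
    var_y = ya.var()
    indices = np.empty(d)
    for i in range(d):
        c = b.copy()
        c[:, i] = a[:, i]           # "freeze" column i, resample the rest
        yc = model(c)
        # covariance of runs sharing only X_i estimates Var(E[Y|X_i])
        indices[i] = np.mean(ya * yc) - ya.mean() * yc.mean()
    return indices / var_y

def toy_model(x):
    # additive toy simulator: only X_0 and X_1 matter
    return 4.0 * x[:, 0] + 2.0 * x[:, 1] + 0.0 * x[:, 2]

print(first_order_sobol(toy_model, d=3))  # roughly [0.8, 0.2, 0.0]
```

The d + 1 model evaluations per Monte Carlo sample make clear why estimating such indices from only a few runs is a genuine statistical challenge.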
Statistical techniques or stochastic numerical methods can be used to build approximations of expensive computer codes. These statistical approximations, also called metamodels or response surfaces, are computationally cheap and can then replace the actual computer codes to tackle problems such as uncertainty analysis, sensitivity analysis, probabilistic inverse problems or multiobjective optimization. A wide variety of response surfaces exist: linear, polynomial, spline-based, fitted with neural networks or with kernel functions lying in Reproducing Kernel Hilbert Spaces (RKHS) such as SVM, kriging functions or Gaussian processes (Bayesian framework). Each model is more or less well suited to the different practical problems. However, applications to real and/or industrial problems raise several questions:
• What can we do when the number of input variables is very large?
• How can we deal simultaneously with several output variables?
• How can we deal with multi-scale problems?
• How can anisotropy be taken into account?
• What can we do when the code outputs are curves (functional data)?
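To fix ideas, here is a minimal sketch of one such metamodel, a Gaussian-process (simple kriging) predictor with a squared-exponential kernel (Python/NumPy; the kernel choice, its fixed hyperparameters and the 1-d toy data are illustrative assumptions, not a recommended setup):

```python
import numpy as np

def rbf_kernel(xa, xb, lengthscale=0.2, variance=1.0):
    """Squared-exponential covariance between two 1-d point sets."""
    sq = (xa[:, None] - xb[None, :]) ** 2
    return variance * np.exp(-0.5 * sq / lengthscale ** 2)

def gp_predict(x_train, y_train, x_new, noise=1e-8):
    """Zero-mean Gaussian-process posterior mean and pointwise variance."""
    k_tt = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    k_nt = rbf_kernel(x_new, x_train)
    k_nn = rbf_kernel(x_new, x_new)
    alpha = np.linalg.solve(k_tt, y_train)
    mean = k_nt @ alpha
    cov = k_nn - k_nt @ np.linalg.solve(k_tt, k_nt.T)
    return mean, np.diag(cov)

# a few expensive runs of a 1-d "simulator" ...
x_train = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y_train = np.sin(2 * np.pi * x_train)

# ... replaced by a cheap predictor with a pointwise uncertainty estimate
x_new = np.linspace(0, 1, 11)
mean, var = gp_predict(x_train, y_train, x_new)
print(np.round(mean, 2))
```

The posterior variance returned alongside the mean is what makes this family of metamodels attractive for the sequential strategies mentioned above.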
Uncertainty spans a wide spectrum of science and practical knowledge: economics, physics, decision theory and risk assessment are the traditional sources, more recently joined by modelling, numerical analysis, advanced statistics and computer science; it even reaches epistemology, management science, psychology, public debate and democracy. The subject has a strategic interest in modelling, driven either by regulatory demands (e.g., safety, security or environmental control in certification or licensing processes) or by markets (e.g., industrial process optimization or business development), whereby a better understanding of the margins of uncertainty, and of possible actions for their reduction, is increasingly investigated for the associated risks and opportunities. As modelling and simulation become more and more common in industrial practice, many challenges arise when practitioners want to carry out uncertainty and sensitivity analysis.
Within this group, the first goal is to guarantee the applicability and the transfer of the above scientific techniques to the difficulties currently encountered by industrial practitioners. In return, the industrial partners will propose current challenges to the scientific community, concerning for example high dimensionality, computational times, coupling between different codes, scarcity of observational data, or the particular repartition of the factors in the input space.
Workshops and dedicated sessions will be organised to facilitate a fruitful dialogue between the different communities.
The scientific backdrop of this working group is the design, modelling and analysis of complex models and/or forecasting systems for environmental applications: climate change, regional forecasting systems for the ocean and the atmosphere, evolution of air and water quality, agri-environment interactions, etc. The overall applicative aim is to contribute to a better understanding, forecasting and, possibly, control of such systems.
A number of specific features may arise in such applications:
(i) A fundamental aspect of many geophysical phenomena is the strong interaction between spatial and temporal scales, and the associated cascade of energy, which of course complicates their modelling.
(ii) An additional source of complexity may arise when the stochastic nature of some parts of the systems cannot be neglected, in particular when rare and extreme events must be detected and allowed for.
(iii) Most of the time, the model is discretized over a huge grid (sometimes with millions of points). Efficient model reduction methods are therefore needed, in particular to make sensitivity analysis on these models tractable (a sketch of one such method is given below).
(iv) Moreover, forecasting systems often combine different sources of information (numerical model, direct observations, statistics, images...) with the help of data assimilation techniques. Such techniques improve the forecasting skill of the system, but make it more complex to analyze.
(v) Building and studying models that take into account the spatio-temporal dynamics of the phenomena is of great importance. Such models often involve functional inputs and/or outputs, and various methods should be developed in that direction.
The development of efficient methods to study environmental problems requires taking these features into account, a goal which calls for multidisciplinarity. A key aspect of this working group is precisely to gather specialists from different domains, coming either from the applied-mathematics world (numerical tools, the stochastic point of view, database management) or from the many fields of expertise associated with environmental applications (in particular geophysicists, biophysicists, hydrologists, etc.).
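As announced in point (iii) above, here is a minimal sketch of one classical model reduction method, proper orthogonal decomposition via the SVD of a snapshot matrix (Python/NumPy; the synthetic field, grid size and 99% energy threshold are illustrative assumptions):

```python
import numpy as np

def pod_basis(snapshots, energy=0.99):
    """Proper orthogonal decomposition of a snapshot matrix
    (rows: grid points, columns: simulation runs / time steps).
    Returns the mean field and the smallest basis capturing `energy`."""
    mean = snapshots.mean(axis=1, keepdims=True)
    u, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    cum = np.cumsum(s ** 2) / np.sum(s ** 2)
    r = int(np.searchsorted(cum, energy)) + 1
    return mean, u[:, :r]

def project(field, mean, basis):
    """Compress a full field into a few modal coefficients."""
    return basis.T @ (field - mean.ravel())

def reconstruct(coeffs, mean, basis):
    return mean.ravel() + basis @ coeffs

# synthetic example: a 10_000-point grid, 50 snapshots of a smooth field
rng = np.random.default_rng(0)
grid = np.linspace(0, 1, 10_000)
snaps = np.stack([np.sin(2 * np.pi * (grid - t)) for t in rng.random(50)],
                 axis=1)

mean, basis = pod_basis(snaps)
coeffs = project(snaps[:, 0], mean, basis)
err = np.abs(reconstruct(coeffs, mean, basis) - snaps[:, 0]).max()
print(basis.shape[1], err)   # a handful of modes, tiny reconstruction error
```

Once the huge grid is replaced by a few modal coefficients, the sensitivity analysis and metamodelling techniques of the other working groups become applicable to the reduced model.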

GDR Partners