Workshop "Statistical methods for safety and decommissioning"
- When & where
- Presentation of the workshop
- Confirmed speakers
- Call for posters
When & where
The workshop will take place on November 21-22, 2022 at Campus Hannah Arendt, Université d'Avignon. The campus is located in the centre of Avignon, about 1 km from the railway station.
Presentation of the workshop
This conference aims to bring together researchers and engineers from the academic and industrial sectors to address the use of statistical methods for industrial safety and decommissioning challenges. Four topics will be covered: spatial statistics for pollution characterization, measurements and uncertainty, metamodelling techniques and risk analysis techniques.
A poster and drinks session will be held at the end of the first day to allow participants to present their work.
The languages will be French and English.
Organizers: Marielle Crozet (CEA Marcoule, CETAMA), Céline Helbert (Ecole Centrale de Lyon, Institut Camille Jordan), Bertrand Iooss (EDF R&D), Céline Lacaux (Université d'Avignon, Lab. de Mathématiques d'Avignon).
Sponsored by: CEA, CETAMA, LMA, Avignon Université, GDR MASCOT-NUM
Confirmed speakers
- Sophie Ancelet (IRSN Fontenay-aux-Roses): Hierarchical modeling and Bayesian statistics for a better consideration of uncertainties when estimating radiation-related risks
- François Bachoc (Institut de Mathématiques de Toulouse): Introduction to Gaussian processes with inequality constraints - Application to coast flooding risk
- Emanuele Borgonovo (Bocconi University): Reliability importance via optimal transport
- Nicolas Bousquet (EDF R&D): Risk, uncertainty, and robust decision-making: an attempted introduction
- Aloïs Clément (CEA Valduc): Bayesian Approach for Multigamma Radionuclide Quantification Applied on Weakly Attenuating Nuclear Waste Drums
- Jean-Philippe Dancausse, Magali Saluden et Catherine Eysseric (CEA Marcoule, DES/DDSD): Expected contributions of statistical methods to nuclear decommissioning of CEA facilities
- Michèle Désenfant (LNE): A measurement process is not a deterministic algorithm!
- Yvon Desnoyers (Geovariances): Smart use of the variogram to explore spatial data, to break down variance contributions and to model radiological contaminations
- Mélanie Ducoffe (Airbus Group): Verification of overestimation and partial monotonicity for neural network-based surrogate for aircraft braking distance estimation
- Mitra Fouladirad (Ecole Centrale de Marseille): Wind speed modelling with stochastic processes for wind turbine production study
- Amandine Marrel (CEA Cadarache, DES/DER): Statistical approaches in nuclear safety problems: recent advances in sensitivity analysis and metamodeling
- Claude Norman and Sarah Michalak (IAEA): Reconciling bottom-up measurement uncertainty estimation based on the GUM (Guide to the Expression of Uncertainty in Measurement) with top-down estimation based on the IAEA statistical model
- Thomas Romary (Mines ParisTech): Scenario reduction for uncertainty quantification in Uranium in situ recovery
Call for posters
A poster session is organised on November 21. To participate, please send an email to Céline Lacaux with the poster title, the authors, and a few words about the subject.
Registration is free but mandatory. The registration deadline is November 1, 2022.
Abstracts
- François Bachoc: Introduction to Gaussian processes with inequality constraints, application to coast flooding risk - In Gaussian process modeling, inequality constraints make it possible to take expert knowledge into account and thus to improve prediction and uncertainty quantification. Typical examples are when a black-box function is bounded or monotonic with respect to some of its input variables. We will show how inequality constraints impact the Gaussian process model, the computation of its posterior distribution and the estimation of its covariance parameters. An example will be presented, where a numerical flooding model is monotonic with respect to two input variables called tide and surge.
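The monotonicity constraints mentioned in this abstract can be illustrated in a much simplified form. The sketch below does not constrain a Gaussian process itself; it merely projects a prediction sequence onto non-decreasing sequences with the pool-adjacent-violators algorithm, as a stand-in for the constrained-GP machinery of the talk. All names and numbers are illustrative.

```python
def pava(y):
    """Least-squares projection of a sequence onto non-decreasing sequences.

    Pool-adjacent-violators: whenever two neighbouring blocks have
    out-of-order means, merge them and replace both by the pooled mean.
    """
    blocks = []  # each block is [mean, count]
    for v in y:
        blocks.append([float(v), 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, n2 = blocks.pop()
            m1, n1 = blocks.pop()
            n = n1 + n2
            blocks.append([(m1 * n1 + m2 * n2) / n, n])
    out = []
    for m, n in blocks:
        out.extend([m] * n)
    return out

# a "prediction" that violates monotonicity at one point
pred = [0.1, 0.5, 0.4, 0.9]
mono = pava(pred)  # the two out-of-order values are pooled to their mean
```

Constrained-GP approaches go much further (the constraint shapes the whole posterior, not just a point prediction), but the projection conveys the basic idea of enforcing expert knowledge on the output.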
- Nicolas Bousquet: Risk, uncertainty, and robust decision-making: an attempted introduction - This introductory presentation will attempt to provide an overarching framework for many of the contributions to follow. Reminders on the formalization of risk and decision, evidence of the crucial influence of uncertainties, and the need to study robustness will be illustrated by case studies from different engineering domains. Approaches using penalized models and non-asymptotic statistical tools will also be presented to partially solve these problems. I will conclude with some proposed challenges for improving the treatment of uncertainties and decision-making, under which many of the contributions seem a priori to fall.
- Emanuele Borgonovo: Reliability importance via optimal transport - In this presentation, we will review the notion of reliability importance. We will discuss both local and global approaches. We will then focus on the application of the theory of optimal transport for the construction of reliability importance measures, with applications in safety and risk assessment.
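A minimal sketch of a transport-based importance measure, in the spirit of (but much simpler than) the measures discussed in this talk: the 1D Wasserstein-1 distance between the nominal output distribution and the output distribution with one input frozen at a single value. The real measures average over the conditioning distribution; the model and all names here are invented for illustration.

```python
import random

def wasserstein1(xs, ys):
    """W1 distance between two equal-size empirical 1D samples:
    mean absolute difference of the sorted samples (optimal 1D coupling)."""
    xs, ys = sorted(xs), sorted(ys)
    return sum(abs(a - b) for a, b in zip(xs, ys)) / len(xs)

def model(x1, x2):
    # toy system response; x1 dominates by construction
    return 3.0 * x1 + x2

random.seed(1)
n = 5000
nominal = [model(random.random(), random.random()) for _ in range(n)]
frozen_x1 = [model(0.5, random.random()) for _ in range(n)]
frozen_x2 = [model(random.random(), 0.5) for _ in range(n)]

imp_x1 = wasserstein1(nominal, frozen_x1)
imp_x2 = wasserstein1(nominal, frozen_x2)
# freezing the dominant input moves the output distribution further
```

The design choice that makes this cheap is that in 1D the optimal transport plan simply matches sorted samples, so no optimization solver is needed.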
- Aloïs Clément: Bayesian Approach for Multigamma Radionuclide Quantification Applied on Weakly Attenuating Nuclear Waste Drums - See https://ieeexplore.ieee.org/document/9500209
- Michèle Désenfant: A measurement process is not a deterministic algorithm! - We pay attention to temporal and spatial variation and to variation across models, but let us not forget the uncertainty associated with a measurement result.
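The point can be made concrete with a small Monte Carlo propagation of measurement uncertainty in the spirit of GUM Supplement 1: repeating the same measurement model with perturbed inputs yields a distribution of results, not a single number. The measurand (a dissipated power) and all numerical values below are invented for illustration.

```python
import math
import random
import statistics

random.seed(2)

def measurand(resistance, current):
    # illustrative measurement model: dissipated power P = R * I^2
    return resistance * current ** 2

# input estimates with standard uncertainties (invented values)
R0, u_R = 100.0, 0.5   # ohms
I0, u_I = 2.0, 0.01    # amperes

# Monte Carlo propagation: perturb inputs, observe the spread of results
draws = [measurand(random.gauss(R0, u_R), random.gauss(I0, u_I))
         for _ in range(100_000)]
mean_P = statistics.fmean(draws)
u_P = statistics.stdev(draws)

# first-order GUM combined uncertainty, for comparison:
# u_c^2 = (dP/dR * u_R)^2 + (dP/dI * u_I)^2
u_gum = math.sqrt((I0 ** 2 * u_R) ** 2 + (2.0 * R0 * I0 * u_I) ** 2)
```

For this nearly linear model the Monte Carlo spread and the first-order GUM formula agree closely; the Monte Carlo route remains valid when the model is strongly non-linear.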
- Yvon Desnoyers: Smart use of the variogram to explore spatial data, to break down variance contributions and to model radiological contaminations - Before performing any modelling and estimation calculation, the first and main part of any geostatistical study is to intensively work on the dataset, to explore and validate it. In addition to classical statistics tools such as basemap, histogram and scatter plot, the variogram strengthens this analysis by the identification of spatial inconsistencies, by the decomposition of the different variability contributions (between sample duplicates, measurement replicates and spatial variability) and consequently by the interpretation and modelling of the spatial structure. Illustrated on several radiological contamination cases, this presentation will describe and detail the advanced and smart use of the variogram.
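The central tool of this abstract, the empirical variogram, can be sketched in a few lines: for each distance class, it averages half the squared differences between pairs of measurements that far apart. The synthetic "contamination" field below is invented for illustration; for a spatially structured field the curve rises with distance.

```python
import math
import random

def semivariogram(coords, values, lag_width, n_lags):
    """Empirical semivariogram: for each distance class k,
    gamma_k = 0.5 * mean((z_i - z_j)^2) over pairs whose distance
    falls in [k*lag_width, (k+1)*lag_width)."""
    sums = [0.0] * n_lags
    counts = [0] * n_lags
    n = len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(coords[i], coords[j])
            k = int(d / lag_width)
            if k < n_lags:
                sums[k] += 0.5 * (values[i] - values[j]) ** 2
                counts[k] += 1
    return [s / c if c else float("nan") for s, c in zip(sums, counts)]

# synthetic field with a spatial trend: nearby points have similar values
random.seed(3)
pts = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(300)]
vals = [x + y + random.gauss(0.0, 0.3) for x, y in pts]
gamma = semivariogram(pts, vals, lag_width=1.0, n_lags=8)
# gamma increases with distance; its value near the origin (the "nugget")
# reflects measurement noise and duplicate variability
```

This is only the exploratory first step described in the abstract; decomposing variance contributions and modelling the spatial structure build on this same curve.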
- Mélanie Ducoffe: Verification of overestimation and partial monotonicity for neural network-based surrogate for aircraft braking distance estimation - In recent years, we have seen the emergence of safety-related properties for regression tasks in many industries. For example, numerical models have been developed to approximate the physical phenomena inherent in their systems. Since these models are based on physical equations whose relevance is affirmed by scientific experts, their qualification is carried out without difficulty. However, as their computational costs and execution times prevent us from embedding them, the use of these numerical models in the aeronautical domain remains mainly limited to the development and design phase of the aircraft. Thanks to the current success of deep neural networks, previous works have already studied neural network-based surrogates for the approximation of numerical models. Nevertheless, these surrogates have additional safety properties that need to be demonstrated to certification authorities. In this talk, we will examine two specifications that arise for a neural network used for aircraft braking distance estimation: over-prediction of the simulation model and partial monotonicity. We will explore two ongoing research directions to address them: probabilistic evaluation using Bernstein-type deviation inequalities and formal verification of neural networks using mixed integer programming and linear-relaxation-based perturbation analysis.
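The probabilistic side of such a verification can be sketched with a one-sided concentration bound. The toy below uses Hoeffding's inequality (a simpler cousin of the Bernstein-type inequalities named in the abstract) to bound the probability that a surrogate under-predicts a reference model; the models and numbers are invented and this is not the method of the talk.

```python
import math
import random

def violation_upper_bound(surrogate, reference, sampler, n, delta):
    """Estimate P(surrogate(X) < reference(X)) from n i.i.d. samples and
    return a one-sided Hoeffding upper bound holding with prob. >= 1 - delta."""
    violations = 0
    for _ in range(n):
        x = sampler()
        if surrogate(x) < reference(x):
            violations += 1
    p_hat = violations / n
    eps = math.sqrt(math.log(1.0 / delta) / (2.0 * n))
    return p_hat + eps

# toy "braking distance" reference and a surrogate with a safety margin
reference = lambda x: 100.0 + 50.0 * x
surrogate = lambda x: reference(x) + 5.0  # over-predicts everywhere here

random.seed(4)
bound = violation_upper_bound(surrogate, reference,
                              sampler=random.random, n=10_000, delta=0.01)
# small bound: with 99% confidence the surrogate rarely under-predicts
```

Formal verification (the second direction in the abstract) would instead prove the property for all inputs, not just with high probability over a sampling distribution.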
- Mitra Fouladirad: Wind speed modelling with stochastic processes for wind turbine production study - The main issue in wind turbine production, reliability and degradation analysis is short- and long-term wind speed forecasting for different geographical sites. Indeed, depending on their positions in a wind farm, turbines are not subject to the same wind speed, and a production or reliability analysis based on an average wind speed value is not very meaningful. Although geographical wind data are available, for an efficient production or reliability analysis it is essential to model the wind speed and to be able to generate data faster than CFD methods. In this study, based on prior information and clustering methods, wind data are analysed and an Ornstein-Uhlenbeck process with covariates is proposed for wind modelling. The validation is assessed through depth functions and bootstrap methods.
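An Ornstein-Uhlenbeck process can be simulated exactly on a time grid, which is part of what makes it attractive as a fast generator compared with CFD. The sketch below simulates a plain OU process (without the covariates of the talk); the "wind speed" parameters are illustrative only.

```python
import math
import random
import statistics

def simulate_ou(x0, mu, theta, sigma, dt, n_steps, rng):
    """Exact discretization of the Ornstein-Uhlenbeck SDE
    dX = theta * (mu - X) dt + sigma dW over a grid of step dt."""
    a = math.exp(-theta * dt)
    s = sigma * math.sqrt((1.0 - a * a) / (2.0 * theta))
    x = x0
    path = [x]
    for _ in range(n_steps):
        x = mu + (x - mu) * a + s * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

# toy mean-reverting "wind speed" (m/s); parameters are invented
rng = random.Random(5)
path = simulate_ou(x0=12.0, mu=8.0, theta=0.5, sigma=1.5,
                   dt=0.1, n_steps=50_000, rng=rng)
long_run_mean = statistics.fmean(path[1000:])  # burn-in discarded
# stationary distribution: mean mu, std dev sigma / sqrt(2 * theta)
```

The exact one-step transition (rather than an Euler scheme) means the simulation is unbiased for any step size, so long synthetic wind records are cheap to generate.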
- Amandine Marrel (CEA Cadarache, DES/IRESNE/DER): Statistical approaches in nuclear safety problems: recent advances in sensitivity analysis and metamodeling - In safety or design studies on current or future nuclear reactors, we make extensive use of simulation to model, understand and predict the phenomena involved. However, these simulators can take a large number of uncertain input parameters, characterizing the studied phenomenon or related to its physical and numerical modelling. Statistical methods based on Monte Carlo approaches thus provide a rigorous framework to deal with these uncertainties. However, several technical challenges have to be addressed, such as the large dimension of the problem (e.g., several tens of inputs), the complex relationship between inputs and output(s), and the high CPU cost of the simulators considered. Sensitivity analysis and metamodeling techniques play a key role in overcoming these limitations. The presentation will focus on recent and advanced developments on these topics, illustrated by their application to nuclear use cases.
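A standard building block of such sensitivity analyses is the first-order Sobol index, which can be estimated by plain Monte Carlo with the classic pick-freeze trick. The toy additive model below has known analytic indices (0.2 and 0.8), so the estimate can be checked; it is a generic textbook sketch, not the methods of the talk.

```python
import random
import statistics

def first_order_sobol(model, dim, i, n, rng):
    """Pick-freeze Monte Carlo estimate of the first-order Sobol index
    of input i: S_i = Cov(Y, Y_i') / Var(Y), where Y_i' shares only
    input i with Y and resamples all the other inputs."""
    y, y_pf = [], []
    for _ in range(n):
        x = [rng.random() for _ in range(dim)]
        xp = [rng.random() for _ in range(dim)]
        xp[i] = x[i]            # "freeze" input i, "pick" fresh others
        y.append(model(x))
        y_pf.append(model(xp))
    my, mp = statistics.fmean(y), statistics.fmean(y_pf)
    cov = statistics.fmean((a - my) * (b - mp) for a, b in zip(y, y_pf))
    return cov / statistics.variance(y)

# additive toy model with uniform inputs: analytic S_0 = 0.2, S_1 = 0.8
model = lambda x: x[0] + 2.0 * x[1]
rng = random.Random(6)
s0 = first_order_sobol(model, dim=2, i=0, n=50_000, rng=rng)
s1 = first_order_sobol(model, dim=2, i=1, n=50_000, rng=rng)
```

With an expensive simulator one would run this same estimator on a cheap metamodel fitted to a few simulator runs, which is exactly the combination the abstract highlights.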
- Claude Norman and Sarah Michalak (IAEA): Reconciling bottom-up measurement uncertainty estimation based on the GUM (Guide to the Expression of Uncertainty in Measurement) with top-down estimation based on the IAEA statistical model - The International Target Values (ITV) are the measurement precision values that should be achievable under normal industrial conditions for different measurement methods and different types of nuclear material. They are regularly reviewed and published by the Department of Safeguards of the International Atomic Energy Agency (IAEA) to serve as a reference for operators of nuclear fuel cycle facilities. These target values are also used as a reference by IAEA safeguards inspectors, who apply destructive and non-destructive measurement methods to verify State declarations. At the meeting of the international experts in charge of preparing the 2010 edition of the International Target Values (ITV-2010), a lively discussion took place between the laboratory representatives and the IAEA evaluators. While the former use bottom-up methods and the terminology of the Guide to the Expression of Uncertainty in Measurement (GUM), the IAEA produces the ITVs on the basis of a top-down statistical model that expresses the variances of random and systematic errors separately. To reconcile these two approaches, a substantial communication effort was undertaken between the communities involved (laboratories, IAEA inspectors and evaluators, Euratom and ABACC, State representatives for safeguards…).
This resulted in a collaborative paper entitled "Statistical error model-based and GUM based analysis of measurement uncertainties in nuclear safeguards – a reconciliation", usually referred to as the "Reconciliation Paper". Its main conclusions are, first, that the GUM and the IAEA statistical model are not contradictory but complementary and mathematically compatible, and second, that their synergy can be used to feed the models used by each of the two approaches.
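The top-down view can be illustrated with the simplest random-effects error model: each measurement carries a systematic error shared within a group (e.g. a calibration period) plus an independent random error, and a one-way ANOVA decomposition recovers the two variance components from the data alone. The numbers below are an invented toy budget, not the IAEA model itself.

```python
import random
import statistics

random.seed(7)
# "bottom-up" error budget (invented): systematic and random std devs
sigma_sys, sigma_rnd = 0.02, 0.05
true_value = 10.0

# simulate g groups of r replicate measurements; the bias is shared
# within each group, the random error is fresh for every measurement
g, r = 200, 30
data = []
for _ in range(g):
    bias = random.gauss(0.0, sigma_sys)
    data.append([true_value + bias + random.gauss(0.0, sigma_rnd)
                 for _ in range(r)])

# "top-down" one-way ANOVA decomposition of the same data:
# within-group variance estimates sigma_rnd**2; the variance of the
# group means, corrected for within-group noise, estimates sigma_sys**2
within = statistics.fmean(statistics.variance(grp) for grp in data)
group_means = [statistics.fmean(grp) for grp in data]
between = statistics.variance(group_means) - within / r
```

Here the top-down estimates recover the bottom-up budget they were simulated from, which is the complementarity (not contradiction) that the Reconciliation Paper argues for.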
- Thomas Romary: Scenario reduction for uncertainty quantification in Uranium in situ recovery - Uranium In Situ Recovery (ISR) is based on the direct leaching of the uranium ore in the deposit by a mining solution. Fluid flow and geochemical reactions in the reservoir are difficult to predict due to geological, petrophysical and geochemical uncertainties. The reactive transport code used to simulate ISR is very sensitive to the spatial distribution of physical and chemical properties of the deposit. Stochastic geostatistical models are used to represent the uncertainty on the spatial distribution of geological properties. The direct propagation of geological uncertainties by multiple ISR mining simulations is intractable in an industrial context, the CPU time needed to perform one ISR numerical simulation being too high. We will present a way to propagate geological uncertainties into uranium production uncertainties at a reduced computational cost, thanks to a scenario reduction method.
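The basic shape of a scenario reduction step can be sketched with a clustering pass: group many equally likely scenarios, keep one representative per cluster, and weight it by the cluster's share of the probability mass, so the expensive simulator only runs on the representatives. This is a generic k-means toy on invented low-dimensional "scenarios", not the method of the talk.

```python
import random

def reduce_scenarios(scenarios, k, n_iter, rng):
    """Toy scenario reduction: k-means over equally likely scenarios,
    returning one representative per cluster with its probability weight."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))

    centers = rng.sample(scenarios, k)
    for _ in range(n_iter):
        clusters = [[] for _ in range(k)]
        for s in scenarios:
            j = min(range(k), key=lambda c: dist2(s, centers[c]))
            clusters[j].append(s)
        for j, cl in enumerate(clusters):
            if cl:
                centers[j] = [sum(col) / len(cl) for col in zip(*cl)]
    reduced = []
    for j, cl in enumerate(clusters):
        if not cl:
            continue
        rep = min(cl, key=lambda s: dist2(s, centers[j]))  # medoid-like pick
        reduced.append((rep, len(cl) / len(scenarios)))
    return reduced

# 200 synthetic 3-dimensional "geological scenarios" near two modes
rng = random.Random(8)
modes = [(0.0, 0.0, 0.0), (5.0, 5.0, 5.0)]
scen = [tuple(m + rng.gauss(0.0, 0.5) for m in rng.choice(modes))
        for _ in range(200)]
reduced = reduce_scenarios(scen, k=2, n_iter=10, rng=rng)
# each representative is an actual scenario, so it can be fed to the
# reactive transport code; its weight enters the production statistics
```

Real scenario reduction methods control the approximation of the full distribution more carefully (e.g. via transport distances), but the compress-then-simulate structure is the same.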