Wave modeling missing the peaks
Cavaleri L
2009
Abstract
The paper analyses the capability of present wave models to properly reproduce the conditions during, and at the peak of, severe and extreme storms. After providing evidence that this is often not the case, the reasons for it are explored. First, the physics of waves as considered in wave models is analysed. Although much improved with respect to the past, wind accuracy is still a relevant factor at the peak of the storms. Other factors, such as wind variability and air density, are considered. The classical theory of wave generation by Miles' mechanism, with its subsequent modifications, is deemed not sufficiently representative of extreme conditions. The formulations presently used for the nonlinear energy transfer are found to lead to distributions that are too wide in frequency and direction, hence reducing the input by the wind. Notwithstanding some recent improvements, the white-capping formulation still depends on parameters fitted to the bulk of the data; hence it is not obvious how it will perform in extreme conditions, where the physics is likely to be different. Albeit to different degrees in different models, advection still implies a spreading of energy, hence a spatial smoothing of the peaks. The lack of proper knowledge of the ocean currents is found to substantially affect the estimate of how much energy can, in some cases, be concentrated at a given time and location. The implementation of the available theories and know-how in the present wave models is often found to be inconsistent from model to model. It follows that it is not possible to exchange corresponding pieces of software between two models without substantially affecting the quality of the results. After analysing the various aspects of a wave model, the paper makes some general considerations. As wave growth is the difference between processes, input and output, each involving large amounts of energy, it is very sensitive to small modifications of one or more of these processes. Together with the strong, but effective, tuning present in a wave model, this makes the introduction of new physics increasingly complicated. It is suggested that, for long-term improvements, operational and experimental applications need to proceed along parallel routes, the latter looking more to the physics, without the need for an immediately improved overall performance. In view of the forthcoming increase in computer power, a sensitivity study is suggested to identify the most critical areas of a wave model in which to invest for further improvements. The limits of the spectral approach in describing the physics of the processes, particularly in extreme conditions, are considered. For further insight, and as a way to validate the present theories in these conditions, the use of numerical experiments simulating in great detail the physical interaction between the lower atmosphere and the individual waves is suggested.
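For reference, a minimal sketch of the framework the abstract refers to: third-generation spectral wave models of the kind discussed here evolve the wave energy spectrum through an energy balance equation whose source terms correspond to the processes listed above. The notation below follows common convention and is not taken from the paper itself ($F$ the spectral energy density as a function of frequency and direction, $\mathbf{c}_g$ the group velocity, $S_{\mathrm{in}}$ the wind input, $S_{\mathrm{nl}}$ the nonlinear energy transfer, $S_{\mathrm{ds}}$ the white-capping dissipation):

\[
\frac{\partial F(f,\theta;\mathbf{x},t)}{\partial t} + \nabla_{\mathbf{x}}\cdot\left(\mathbf{c}_g\,F\right) = S_{\mathrm{in}} + S_{\mathrm{nl}} + S_{\mathrm{ds}} .
\]

In this form the net growth of the spectrum is the residual of the individually large terms on the right-hand side, which is why, as the abstract argues, a small modification of any one source term can noticeably change the modelled growth at the peak of a storm.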