Internal wave

Internal waves are gravity waves that oscillate within a fluid medium, rather than on its surface. To exist, they require a stratified fluid: the density must change (continuously or discontinuously) with depth or height, owing, for example, to changes in temperature and/or salinity. If the density changes over a small vertical distance (as at the thermocline in lakes and oceans, or at an atmospheric inversion), the waves propagate horizontally like surface waves, but at slower speeds determined by the density difference between the fluid below and above the interface.
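For concreteness, a standard two-layer textbook result (an illustration added here, not a statement from the text above): for long waves on a sharp interface between an upper layer of thickness h_1 and density ρ_1 and a lower layer of thickness h_2 and density ρ_2 > ρ_1, the phase speed is set by the reduced gravity g':

```latex
g' = g\,\frac{\rho_2 - \rho_1}{\rho_2},
\qquad
c = \sqrt{g'\,\frac{h_1 h_2}{h_1 + h_2}}
```

Because ρ_2 − ρ_1 is typically small, g' ≪ g, so interfacial waves travel far more slowly than surface waves of comparable wavelength.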
Boussinesq approximation (water waves)

In fluid dynamics, the Boussinesq approximation for water waves is an approximation valid for weakly non-linear and fairly long waves. The approximation is named after Joseph Boussinesq, who first derived the governing equations in response to John Scott Russell's observation of the wave of translation (also known as a solitary wave or soliton). Boussinesq's 1872 paper introduced the equations now known as the Boussinesq equations. The Boussinesq approximation for water waves takes into account the vertical structure of the horizontal and vertical flow velocity.
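To make the "fairly long waves" restriction concrete, here is a minimal sketch (an illustration constructed here, not taken from Boussinesq's paper): it compares the exact linear (Airy) phase speed with the weakly dispersive shallow-water expansion tanh(kh)/(kh) ≈ 1 − (kh)²/3, the kind of leading-order dispersive correction that Boussinesq-type equations retain.

```python
import numpy as np

g = 9.81   # gravitational acceleration [m/s^2]
h = 10.0   # still-water depth [m]

kh = np.array([0.1, 0.3, 0.5, 1.0])   # dimensionless wavenumber; theory targets kh << 1
k = kh / h

# Exact linear (Airy) dispersion: c^2 = (g/k) tanh(k h)
c_exact = np.sqrt(g * np.tanh(k * h) / k)

# Weakly dispersive approximation: c^2 ~ g h (1 - (kh)^2 / 3)
c_approx = np.sqrt(g * h * (1.0 - kh**2 / 3.0))

for x, ce, ca in zip(kh, c_exact, c_approx):
    print(f"kh = {x:3.1f}: exact {ce:6.3f} m/s, approx {ca:6.3f} m/s")
```

The two speeds agree closely for small kh and drift apart as the waves get shorter, which is the sense in which the approximation holds only for "fairly long" waves.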
Automated machine learning

Automated machine learning (AutoML) is the process of automating the tasks of applying machine learning to real-world problems. AutoML can potentially cover every stage, from a raw dataset to a machine learning model ready for deployment. It was proposed as an artificial-intelligence-based solution to the growing challenge of applying machine learning. The high degree of automation in AutoML aims to let non-experts make use of machine learning models and techniques without first having to become experts in the field.
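As a toy illustration of the idea (a sketch using scikit-learn's GridSearchCV, standing in for a full AutoML system, which would also automate preprocessing, feature engineering, and model selection):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Automated hyperparameter search over a small, hand-picked space;
# a real AutoML system would construct and search this space itself.
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [2, 4, None]},
    cv=5,
)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("test accuracy:", search.score(X_test, y_test))
```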
Transformer (machine learning model)

A transformer is a deep learning architecture that relies on the parallel multi-head attention mechanism. The modern transformer was proposed in the 2017 paper 'Attention Is All You Need' by Ashish Vaswani et al. of the Google Brain team. It is notable for requiring less training time than earlier recurrent neural architectures such as long short-term memory (LSTM), and its later variants have been widely adopted for training large language models on large (language) datasets such as the Wikipedia corpus and Common Crawl, by virtue of their parallelized processing of the input sequence.
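The core operation of the architecture, the scaled dot-product attention from the Vaswani et al. paper, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V, can be sketched in a few lines of NumPy (a minimal single-head version; the full model adds learned projections and stacks many heads):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # similarity of every query to every key
    scores -= scores.max(axis=-1, keepdims=True)    # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted sum of the values

rng = np.random.default_rng(0)
seq_len, d_k = 4, 8
Q, K, V = (rng.normal(size=(seq_len, d_k)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```

Because every query attends to every key in a single matrix product, all sequence positions are processed in parallel, which is the source of the training-time advantage over recurrent architectures noted above.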
Spontaneous symmetry breaking

Spontaneous symmetry breaking is a spontaneous process of symmetry breaking, by which a physical system in a symmetric state spontaneously ends up in an asymmetric state. In particular, it can describe systems where the equations of motion or the Lagrangian obey symmetries, but the lowest-energy vacuum solutions do not exhibit that same symmetry. When the system goes to one of those vacuum solutions, the symmetry is broken for perturbations around that vacuum even though the entire Lagrangian retains that symmetry.
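A standard textbook illustration (added here for concreteness): a real scalar field φ with the "double-well" potential

```latex
V(\phi) = \lambda \left(\phi^2 - v^2\right)^2 , \qquad \lambda > 0
```

has a Lagrangian invariant under φ → −φ, yet its lowest-energy states sit at φ = ±v rather than at the symmetric point φ = 0. Settling into either minimum breaks the symmetry for perturbations around that vacuum, exactly in the sense described above.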