Ask any question about EPFL courses, lectures, exercises, research, news, etc., or try the example questions below.
DISCLAIMER: The Graph chatbot is not programmed to provide explicit or categorical answers to your questions. Instead, it turns your questions into API requests that are distributed across the various IT services officially administered by EPFL. Its sole purpose is to collect and recommend relevant references to content that you can explore to help answer your questions.
Stochastic gradient descent (SGD) and randomized coordinate descent (RCD) are two of the workhorses for training modern automated decision systems. Intriguingly, convergence properties of these methods are not well-established as we move away from the spec ...
Semi-discrete optimal transport problems, which evaluate the Wasserstein distance between a discrete and a generic (possibly non-discrete) probability measure, are believed to be computationally hard. Even though such problems are ubiquitous in statistics, ...
Many important problems in contemporary machine learning involve solving highly non-convex problems in sampling, optimization, or games. The absence of convexity poses significant challenges to convergence analysis of most training algorithms, and in some ...
The numerical solution of the stepped pressure equilibrium (Hudson et al 2012 Phys. Plasmas 19 112502) requires a fast and robust solver to obtain the Beltrami field in three-dimensional geometry such as stellarators. The spectral method implemented in the ...
We study asymmetric zero-range processes on Z with nearest-neighbour jumps and site disorder. The jump rate of particles is an arbitrary but bounded nondecreasing function of the number of particles. We prove quenched strong local equilibrium at subcritica ...
General principles of quantum field theory imply that there exists an operator product expansion (OPE) for Wightman functions in Minkowski momentum space that converges for arbitrary kinematics. This convergence is guaranteed to hold in the sense of a dist ...
We examine the almost-sure asymptotics of the solution to the stochastic heat equation driven by a Lévy space-time white noise. When a spatial point is fixed and time tends to infinity, we show that the solution develops unusually high peaks over short tim ...
We present a strikingly simple proof that two rules are sufficient to automate gradient descent: 1) don’t increase the stepsize too fast and 2) don’t overstep the local curvature. No need for functional values, no line search, no information about the func ...
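The two rules above can be sketched in a few lines of code. The following is a minimal illustration under assumptions of ours, not the paper's reference implementation: the growth factor and the curvature ratio below are one plausible instantiation of "don't increase the stepsize too fast" and "don't overstep the local curvature", and the name `adaptive_gd` is hypothetical.

```python
import numpy as np

def adaptive_gd(grad, x0, steps=300, lam0=1e-6):
    """Gradient descent whose step size obeys the two rules above."""
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad(x_prev)
    lam_prev = lam = lam0
    x = x_prev - lam * g_prev
    for _ in range(steps):
        g = grad(x)
        # Rule 1: let the step grow, but only slowly (here by sqrt(1 + ratio)).
        growth = lam * np.sqrt(1.0 + lam / lam_prev)
        # Rule 2: bound the step by a local inverse-curvature estimate built
        # from successive iterates and gradients (no function values needed).
        dg = np.linalg.norm(g - g_prev)
        curvature = np.linalg.norm(x - x_prev) / (2.0 * dg) if dg > 0 else np.inf
        lam_prev, lam = lam, min(growth, curvature)
        x_prev, g_prev = x, g
        x = x - lam * g
    return x

# Usage: minimize an ill-conditioned quadratic f(x) = 0.5 * x^T A x.
A = np.diag([1.0, 10.0])
x_star = adaptive_gd(lambda x: A @ x, np.array([5.0, -3.0]))
```

Note that the loop never evaluates f itself and never performs a line search; both step-size candidates are computed from quantities the iteration already has.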
This paper analyzes the trajectories of stochastic gradient descent (SGD) to help understand the algorithm’s convergence properties in non-convex problems. We first show that the sequence of iterates generated by SGD remains bounded and converges with prob ...
It is well known and readily seen that the maximum of n independent random variables uniformly distributed on [0, 1], suitably standardised, converges in total variation distance, as n increases, to the standard negative exponential distribution. We ex ...
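A quick Monte Carlo sketch (an illustration of ours, not part of the abstract) of the limiting claim, assuming the standardisation n(1 - M_n), which converges to a standard exponential law:

```python
import numpy as np

# M_n is the maximum of n i.i.d. Uniform(0, 1) variables; the rescaled
# quantity n * (1 - M_n) is approximately Exp(1) for large n.
rng = np.random.default_rng(0)
n, trials = 500, 5_000
u = rng.random((trials, n))
rescaled = n * (1.0 - u.max(axis=1))

mean_hat = rescaled.mean()            # Exp(1) has mean 1
cdf_at_1 = (rescaled <= 1.0).mean()   # Exp(1) CDF at 1 is 1 - e^(-1), about 0.632
```

Here the exact mean of n(1 - M_n) is n/(n + 1), so the empirical mean should sit very close to 1 for n = 500.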