Reducing Noise in GAN Training with Variance Reduced Extragradient
Related publications (37)
Within the context of contemporary machine learning problems, the efficiency of the optimization process depends on the properties of the model and the nature of the available data, which poses a significant problem as the complexity of either increases ad infinit ...
This paper introduces Wireless IoT-based Noise Cancellation (WINC), which defines a framework for leveraging a wireless network of IoT microphones to enhance active noise cancellation in noise-canceling headphones. The IoT microphones forward ambient noise ...
Stochastic gradient descent (SGD) and randomized coordinate descent (RCD) are two of the workhorses for training modern automated decision systems. Intriguingly, convergence properties of these methods are not well-established as we move away from the spec ...
EPFL, 2021
In the field of choice modeling, the availability of ever-larger datasets has the potential to significantly expand our understanding of human behavior, but this prospect is limited by the poor scalability of discrete choice models (DCMs): as sample sizes ...
2023
The automatic design of well-performing robotic controllers is still an unsolved problem due to the inherently large parameter space and noisy, often hard-to-define performance metrics, especially when sequential tasks need to be accomplished. Distal contr ...
We study the performance of Stochastic Cubic Regularized Newton (SCRN) on a class of functions satisfying the gradient dominance property with 1 ≤ α ≤ 2, which holds in a wide range of applications in machine learning and signal processing. This conditio ...
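For context, the gradient dominance condition of order α referenced in the snippet above is commonly stated as follows; the constant τ and the exact normalization are assumptions here, since the snippet is truncated:

% Gradient dominance of order \alpha: for all x, with f^* = \min_x f(x)
% (the constant \tau is an assumption, not taken from the snippet),
\[
  f(x) - f^{*} \le \tau \, \lVert \nabla f(x) \rVert^{\alpha}, \qquad 1 \le \alpha \le 2 .
\]
% \alpha = 2 recovers the Polyak-Lojasiewicz (PL) inequality up to the
% choice of constant; \alpha = 1 is a sharpness-type condition.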
Understanding the implicit bias of training algorithms is of crucial importance in order to explain the success of overparametrised neural networks. In this paper, we study the dynamics of stochastic gradient descent over diagonal linear networks through i ...
In this paper we tackle the challenge of making the stochastic coordinate descent algorithm differentially private. Compared to the classical gradient descent algorithm, where updates operate on a single model vector and controlled noise addition to this ve ...
AIAA, 2021
Deep learning networks are typically trained by Stochastic Gradient Descent (SGD) methods that iteratively improve the model parameters by estimating a gradient on a very small fraction of the training data. A major roadblock faced when increasing the batc ...
2020
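To make the minibatch gradient estimate described in the snippet above concrete, here is a minimal sketch in Python; the least-squares loss, dataset, and all hyperparameters are assumptions for illustration, not details taken from the paper:

import numpy as np

# Minimal minibatch-SGD sketch. The linear least-squares loss and every
# hyperparameter below are illustrative assumptions only.
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 20))                            # features
y = X @ rng.normal(size=20) + 0.1 * rng.normal(size=10_000)  # targets

w = np.zeros(20)             # model parameters
lr, batch_size = 0.01, 32    # step size and minibatch size

for step in range(1_000):
    # Estimate the gradient on a very small fraction of the training data.
    idx = rng.choice(len(X), size=batch_size, replace=False)
    residual = X[idx] @ w - y[idx]
    grad = X[idx].T @ residual / batch_size   # minibatch gradient of 0.5*MSE
    w -= lr * grad                            # SGD update

print("final training MSE:", np.mean((X @ w - y) ** 2))

Each update touches only batch_size of the 10,000 examples, which is the "very small fraction of the training data" the snippet refers to; increasing batch_size reduces gradient noise at the cost of more computation per step.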
Environmental noise, mostly related to human activities, has an immense impact on public health. The development of noise reduction technologies is paramount in addressing this problem. For practical and economic reasons, a compact, broadband, light ...