A t-distribution based operator for enhancing out of distribution robustness of neural network classifiers
The maximal achievable advantage of a (computationally unbounded) distinguisher to determine whether a source Z is distributed according to distribution P0 or P1, when given access to one sample of Z, is characterized by the statistical distance ...
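As a concrete illustration of the claim in this abstract, the maximal one-sample distinguishing advantage equals the statistical (total variation) distance, SD(P0, P1) = (1/2) Σ_x |P0(x) − P1(x)|. The following is a minimal sketch for finite distributions, not code from the cited work:

```python
def statistical_distance(p0, p1):
    """Total variation distance between two discrete distributions,
    given as dicts mapping outcome -> probability."""
    support = set(p0) | set(p1)
    return 0.5 * sum(abs(p0.get(x, 0.0) - p1.get(x, 0.0)) for x in support)

# Example: distinguishing a fair coin from a biased one with a single flip.
fair = {"heads": 0.5, "tails": 0.5}
biased = {"heads": 0.8, "tails": 0.2}
adv = statistical_distance(fair, biased)  # ~0.3: the best distinguisher's advantage
```

Here the best single-sample distinguisher simply guesses "biased" on heads, and its advantage over random guessing is exactly this distance.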
Human vision has evolved to make sense of a world in which elements almost never appear in isolation. Surprisingly, the recognition of an element in a visual scene is strongly limited by the presence of other nearby elements, a phenomenon known as visual c ...
This work proposes a new way of combining independently trained classifiers over space and time. Combination over space means that the outputs of spatially distributed classifiers are aggregated. Combination over time means that the classifiers respond to ...
The relationship between simulated ion cyclotron emission (ICE) signals s and the corresponding 1D velocity distribution function f(v⊥) of the fast ions triggering the ICE is modeled using a two-layer deep neural network. The network ...
We describe a current propagation type return stroke model which is consistent with the estimated distribution of the charge on the leader channel. The model takes into account the dispersion of the return stroke current along the return stroke channel. Th ...
We derive generalization and excess risk bounds for neural networks using a family of complexity measures based on a multilevel relative entropy. The bounds are obtained by introducing the notion of generated hierarchical coverings of neural networks and b ...
Training models that perform well under distribution shifts is a central challenge in machine learning. In this paper, we introduce a modeling framework where, in addition to training data, we have partial structural knowledge of the shifted test distribut ...
The early time high altitude electromagnetic pulse conducted environment calculation is revisited by using the transmission line (TL) theory and including high-frequency corrections that were not present in the earlier studies. Waveform salient parameters, ...
We study the dynamics of optimization and the generalization properties of one-hidden-layer neural networks with quadratic activation function in the overparametrized regime, where the layer width m is larger than the input dimension d. We conside ...
Deep neural networks have been empirically successful in a variety of tasks, however their theoretical understanding is still poor. In particular, modern deep neural networks have many more parameters than training data. Thus, in principle they should over ...