Publication

Input-dependent Regularization of Conditional Density Models

Matthias Seeger
2000
Report or working paper
Abstract

We emphasize the need for input-dependent regularization in the context of conditional density models (also: discriminative models) like Gaussian process predictors. This can be achieved by a simple modification of the standard Bayesian data generation model underlying these techniques. Specifically, we allow the latent target function to be a priori dependent on the distribution of the input points. While the standard generation model results in robust predictors, data with missing labels is ignored, which can be wasteful if relevant prior knowledge is available. We show that discriminative models like Fisher kernel discriminants and Co-Training classifiers can be regarded as (approximate) Bayesian inference techniques under the modified generation model, and that the template Co-Training algorithm is related to a variant of the well-known Expectation-Maximization (EM) technique. We propose a template EM algorithm for the modified generation model which can be regarded as a generalization of Co-Training.
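The abstract compresses the key modeling idea; the following is a minimal formal sketch under assumptions consistent with the abstract's description (the symbols theta for the latent function's parameters and mu for the input distribution, and the factorizations shown, are illustrative notation, not taken from the paper itself). In the standard generation model for conditional density estimation, inputs and the latent function are a priori independent:

$$ p(x, y \mid \theta, \mu) = p(x \mid \mu)\, p(y \mid x, \theta), \qquad p(\theta, \mu) = p(\theta)\, p(\mu). $$

Unlabeled inputs then contribute only the factor $p(x \mid \mu)$, which is constant in $\theta$, so Bayesian inference over the predictor ignores them. The modification the abstract describes couples the prior on the latent function to the input distribution,

$$ p(\theta, \mu) = p(\mu)\, p(\theta \mid \mu), $$

so that observing unlabeled points tightens the posterior over $\mu$ and, through $p(\theta \mid \mu)$, acts as input-dependent regularization of $\theta$.

For the EM connection, the sketch below shows a generic semi-supervised EM loop in which the E-step fills in missing labels with posterior class probabilities and the M-step refits the model on soft labels. This is a hypothetical illustration of the general technique the abstract refers to, not the paper's template algorithm; the model interface (fit on a soft-label matrix, predict_proba) is assumed for the sketch.

    import numpy as np

    def semi_supervised_em(model, X_lab, y_lab, X_unl, n_iter=20):
        # Illustrative semi-supervised EM loop (not Seeger's template
        # algorithm). `model` is assumed to expose fit(X, Y_soft), taking
        # a soft-label matrix, and predict_proba(X) returning posteriors.
        n_classes = int(y_lab.max()) + 1
        Y_lab = np.eye(n_classes)[y_lab]                 # one-hot labels
        Y_unl = np.full((len(X_unl), n_classes), 1.0 / n_classes)
        for _ in range(n_iter):
            # M-step: refit on labeled plus soft-labeled unlabeled data
            model.fit(np.vstack([X_lab, X_unl]), np.vstack([Y_lab, Y_unl]))
            # E-step: replace missing labels by posterior class probabilities
            Y_unl = model.predict_proba(X_unl)
        return model

In this frame, standard Co-Training can be read as an E-step restricted to confident predictions exchanged between two feature views, which is one way to understand the abstract's claim that a template EM algorithm generalizes Co-Training.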
