Publication

Bounds on the Degree of High Order Binary Perceptrons

1996
Conference paper
Abstract

High order perceptrons are often used to reduce the size of neural networks. The complexity of the architecture of a usual multilayer network is thereby shifted into the complexity of the functions computed by each high order unit, and in particular into the degree of their polynomials. The main result of this paper provides a bound on the degree of the polynomial of a high order perceptron when the binary training data result from the encoding of an arrangement of hyperplanes in Euclidean space. Such a situation occurs naturally in a feedforward network with a single hidden layer of first order perceptrons and an output layer of high order perceptrons. In this case, the result states that the degree of the high order perceptrons can be bounded by the minimum of the number of inputs and the number of hidden units.
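A hedged formal restatement of the final claim may help: the symbols below (n for the number of inputs, h for the number of hidden first order units, p for the polynomial of a high order output unit, z_i for the binary hidden-layer activations) are notation assumed here for illustration and are not taken from the paper itself.

```latex
% Sketch of the bound described in the abstract, under the assumed notation:
% a high order output perceptron thresholds a polynomial p of the binary
% hidden-layer encodings z_1, ..., z_h.
\[
  y \;=\; \operatorname{sign}\bigl(p(z_1,\dots,z_h)\bigr),
  \qquad z_i \in \{0,1\},
\]
% The stated bound: the degree of p need not exceed the smaller of the
% number of inputs n and the number of hidden units h.
\[
  \deg(p) \;\le\; \min(n,\, h).
\]
```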
