
# Affine Models

Abstract

Affine term structure models have gained significant attention in the finance literature, mainly due to their analytical tractability and statistical flexibility. The aim of this article is to present both the theoretical foundations and the empirical aspects of the affine model class. Starting from the original one-factor short-rate models of Vasicek and Cox et al., we provide an overview of the properties of regular affine processes and explain their relationship to affine term structure models. Methods for securities pricing and for parameter estimation are also discussed, demonstrating how the analytical tractability of affine models can be exploited for practical purposes.
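As an illustration of the one-factor short-rate models the abstract mentions (not code from the article itself), the Vasicek dynamics dr_t = κ(θ − r_t) dt + σ dW_t can be simulated with a simple Euler–Maruyama scheme; all parameter values below are hypothetical:

```python
import numpy as np

def simulate_vasicek(r0, kappa, theta, sigma, T=1.0, n=252, seed=0):
    """Euler-Maruyama path of the Vasicek short-rate model
    dr_t = kappa * (theta - r_t) dt + sigma dW_t."""
    rng = np.random.default_rng(seed)
    dt = T / n
    rates = np.empty(n + 1)
    rates[0] = r0
    for i in range(n):
        dw = rng.normal(0.0, np.sqrt(dt))  # Brownian increment
        rates[i + 1] = rates[i] + kappa * (theta - rates[i]) * dt + sigma * dw
    return rates

path = simulate_vasicek(r0=0.03, kappa=0.5, theta=0.04, sigma=0.01)
```

The mean-reverting drift κ(θ − r) pulls the rate toward the long-run level θ, which is the feature that makes the model tractable; the CIR model of Cox et al. replaces the constant diffusion σ with σ√r to keep rates non-negative.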


Related MOOCs

Digital Signal Processing I

Basic signal processing concepts, Fourier analysis and filters. This module can be used as a starting point or a basic refresher in elementary DSP.

Digital Signal Processing II

Adaptive signal processing, A/D and D/A. This module provides the basic tools for adaptive filtering and a solid mathematical framework for sampling and quantization.

Digital Signal Processing III

Advanced topics: this module covers real-time audio processing (with examples on a hardware board), image processing, and communication system design.

Related concepts

Maximum likelihood estimation

In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of an assumed probability distribution, given some observed data. This is achieved by maximizing a likelihood function so that, under the assumed statistical model, the observed data is most probable. The point in the parameter space that maximizes the likelihood function is called the maximum likelihood estimate. The logic of maximum likelihood is both intuitive and flexible, and as such the method has become a dominant means of statistical inference.
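For i.i.d. Gaussian data, the maximizer of the likelihood function is available in closed form, which makes for a minimal sketch of the idea (synthetic data, illustrative values only):

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(loc=2.0, scale=0.5, size=10_000)  # "observed" sample

# Closed-form MLE for a Gaussian: sample mean and the 1/n variance
mu_hat = data.mean()                        # maximizes log-likelihood in mu
sigma2_hat = ((data - mu_hat) ** 2).mean()  # note 1/n, not the unbiased 1/(n-1)
```

The 1/n variance estimator is the textbook point that the MLE need not be unbiased, even though it is consistent and asymptotically efficient.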

Estimation theory

Estimation theory is a branch of statistics that deals with estimating the values of parameters based on measured empirical data that has a random component. The parameters describe an underlying physical setting in such a way that their value affects the distribution of the measured data. An estimator attempts to approximate the unknown parameters using the measurements.
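A minimal sketch of this setting: unknown parameters of a linear model describe the "physical" relationship, the measurements carry a random component, and an ordinary least-squares estimator approximates the parameters from the data (all values synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)
a_true, b_true = 1.5, -0.3           # unknown underlying parameters
x = np.linspace(0.0, 10.0, 200)
y = a_true * x + b_true + rng.normal(0.0, 0.5, size=x.size)  # noisy data

# Ordinary least-squares estimator of (a, b)
A = np.column_stack([x, np.ones_like(x)])
(a_hat, b_hat), *_ = np.linalg.lstsq(A, y, rcond=None)
```

With more measurements the sampling variability of the estimator shrinks, so (a_hat, b_hat) concentrates around the true parameters.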

Asset pricing

In financial economics, asset pricing refers to a formal treatment and development of two main pricing principles, outlined below, together with the resultant models. Many models have been developed for different situations, but they stem from either general equilibrium asset pricing or rational asset pricing, the latter corresponding to risk-neutral pricing.
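Risk-neutral pricing meshes naturally with the affine class: in the Vasicek model, the zero-coupon bond price is exponential-affine in the short rate, P(τ) = exp(A(τ) − B(τ)r), with the standard closed-form coefficients. A sketch (parameter values are illustrative, and κ, θ, σ are understood as risk-neutral parameters):

```python
import numpy as np

def vasicek_bond_price(r, tau, kappa, theta, sigma):
    """Zero-coupon bond price P = exp(A(tau) - B(tau) * r) in the
    Vasicek model, illustrating the exponential-affine structure."""
    B = (1.0 - np.exp(-kappa * tau)) / kappa
    A = (theta - sigma**2 / (2.0 * kappa**2)) * (B - tau) \
        - sigma**2 * B**2 / (4.0 * kappa)
    return np.exp(A - B * r)

price = vasicek_bond_price(r=0.03, tau=5.0, kappa=0.5, theta=0.04, sigma=0.01)
```

That bond prices reduce to an exponential of functions that are affine in the state is exactly the tractability the abstract highlights: A and B solve ordinary differential equations of Riccati type rather than a full pricing PDE.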

Related publications
We study the problem of learning unknown parameters of stochastic dynamical models from data. Often, these models are high dimensional and contain several scales and complex structures. One is then interested in obtaining a reduced, coarse-grained descript ...

We introduce the elliptical Ornstein-Uhlenbeck (OU) process, which is a generalisation of the well-known univariate OU process to bivariate time series. This process maps out elliptical stochastic oscillations over time in the complex plane, which are obse ...

Werner Alfons Hilda Van Geit, Oren Amsalem, Idan Segev

Many scientific systems are studied using computer codes that simulate the phenomena of interest. Computer simulation enables scientists to study a broad range of possible conditions, generating large quantities of data at a faster rate than the laboratory ...