# Separation of variables

Summary

In mathematics, separation of variables (also known as the Fourier method) is any of several methods for solving ordinary and partial differential equations, in which algebra allows one to rewrite an equation so that each of two variables occurs on a different side of the equation.
A differential equation for the unknown $f(x)$ will be separable if it can be written in the form

$$\frac{d}{dx} f(x) = g(x)\,h(f(x)),$$

where $g$ and $h$ are given functions. This is perhaps more transparent when written using $y = f(x)$ as:

$$\frac{dy}{dx} = g(x)\,h(y).$$

So now, as long as $h(y) \neq 0$, we can rearrange terms to obtain:

$$\frac{dy}{h(y)} = g(x)\,dx,$$

where the two variables $x$ and $y$ have been separated. Note that $dx$ (and $dy$) can be viewed, at a simple level, as just a convenient notation, which provides a handy mnemonic aid for assisting with manipulations. A formal definition of $dx$ as a differential (infinitesimal) is somewhat advanced.
Those who dislike Leibniz's notation may prefer to write this as

$$\frac{1}{h(y)}\frac{dy}{dx} = g(x),$$

but that fails to make it quite as obvious why this is called "separation of variables". Integrating both sides of the equation with respect to $x$, we have

$$\int \frac{1}{h(y)}\frac{dy}{dx}\,dx = \int g(x)\,dx,$$

or equivalently,

$$\int \frac{1}{h(y)}\,dy = \int g(x)\,dx,$$

because of the substitution rule for integrals.
If one can evaluate the two integrals, one can find a solution to the differential equation. Observe that this process effectively allows us to treat the derivative $\frac{dy}{dx}$ as a fraction that can be separated. This allows us to solve separable differential equations more conveniently, as demonstrated in the example below.
(Note that we do not need to use two constants of integration, as in

$$\int \frac{1}{h(y)}\,dy + C_1 = \int g(x)\,dx + C_2,$$

because a single constant $C = C_2 - C_1$ is equivalent.)
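As a sanity check on the method, take the separable equation $\frac{dy}{dx} = xy$ with $y(0) = 1$: separating gives $\int \frac{dy}{y} = \int x\,dx$, so $y = e^{x^2/2}$. The sketch below (the `rk4` helper is an illustrative name, not part of the original text) compares this closed form against a direct numerical integration:

```python
import math

def rk4(f, x0, y0, x1, n=1000):
    """Integrate dy/dx = f(x, y) from x0 to x1 with n classical RK4 steps."""
    h = (x1 - x0) / n
    x, y = x0, y0
    for _ in range(n):
        k1 = f(x, y)
        k2 = f(x + h / 2, y + h * k1 / 2)
        k3 = f(x + h / 2, y + h * k2 / 2)
        k4 = f(x + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        x += h
    return y

# Separable ODE dy/dx = x * y with y(0) = 1.
# Separation of variables gives y = exp(x**2 / 2).
numeric = rk4(lambda x, y: x * y, 0.0, 1.0, 1.0)
closed_form = math.exp(0.5)
print(abs(numeric - closed_form) < 1e-6)  # True
```

The agreement of the numerical solution with $e^{x^2/2}$ at $x = 1$ illustrates that the formal manipulation of $dy$ and $dx$ produces a genuine solution.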
Population growth is often modeled by the "logistic" differential equation

$$\frac{dP}{dt} = kP\left(1 - \frac{P}{K}\right),$$

where $P$ is the population with respect to time $t$, $k$ is the rate of growth, and $K$ is the carrying capacity of the environment.
Separation of variables now leads to

$$\int \frac{dP}{P\left(1 - \frac{P}{K}\right)} = \int k\,dt,$$

which is readily integrated using partial fractions on the left side, yielding

$$P(t) = \frac{K}{1 + Ae^{-kt}},$$

where $A$ is the constant of integration. We can find $A$ in terms of the initial population $P_0 = P(0)$ at $t = 0$. Noting $e^0 = 1$, we get

$$A = \frac{K - P_0}{P_0}.$$
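The closed-form logistic solution can likewise be checked numerically. A minimal sketch, assuming illustrative values $k = 1$, $K = 100$, $P_0 = 10$ (the function names here are hypothetical, chosen for this example):

```python
import math

def logistic_closed_form(t, k=1.0, K=100.0, P0=10.0):
    """P(t) = K / (1 + A e^{-k t}) with A = (K - P0) / P0 from P(0) = P0."""
    A = (K - P0) / P0
    return K / (1 + A * math.exp(-k * t))

def logistic_euler(t, k=1.0, K=100.0, P0=10.0, n=100_000):
    """Forward-Euler integration of dP/dt = k P (1 - P/K) from 0 to t."""
    h = t / n
    P = P0
    for _ in range(n):
        P += h * k * P * (1 - P / K)
    return P

# The two solutions agree to within the Euler discretization error.
print(abs(logistic_closed_form(3.0) - logistic_euler(3.0)) < 0.05)  # True
```

Note that $P(0) = K / (1 + (K - P_0)/P_0) = P_0$, confirming that the constant $A$ matches the initial condition, and $P(t) \to K$ as $t \to \infty$, as expected for a carrying capacity.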
Much like one can speak of a separable first-order ODE, one can speak of a separable second-order, third-order or nth-order ODE.

Related concepts (17)

Ordinary differential equation

In mathematics, an ordinary differential equation (ODE) is a differential equation (DE) dependent on only a single independent variable. As with any other DE, its unknowns consist of one or more functions, and the equation involves the derivatives of those functions. The term "ordinary" is used in contrast with partial differential equations, which may be with respect to more than one independent variable. A linear differential equation is a differential equation defined by a linear polynomial in the unknown function and its derivatives, that is, an equation of the form $a_0(x)y + a_1(x)y' + a_2(x)y'' + \cdots + a_n(x)y^{(n)} = b(x)$, where $a_0(x), \ldots, a_n(x)$ and $b(x)$ are given differentiable functions.

Eigenvalues and eigenvectors

In linear algebra, an eigenvector (/ˈaɪɡənˌvɛktər/) or characteristic vector of a linear transformation is a nonzero vector that changes at most by a constant factor when that linear transformation is applied to it. The corresponding eigenvalue, often represented by $\lambda$, is the multiplying factor. Geometrically, a transformation matrix rotates, stretches, or shears the vectors it acts upon. The eigenvectors for a linear transformation matrix are the set of vectors that are only stretched, with no rotation or shear.

Related MOOCs (2)

Neuronal Dynamics - Computational Neuroscience of Single Neurons

The activity of neurons in the brain and the code used by these neurons is described by mathematical neuron models at different levels of detail.

Related courses (24)

EE-548: Audio engineering

This lecture is oriented towards the study of audio engineering, with a special focus on room acoustics applications. The learning outcomes will be the techniques for microphone and loudspeaker design.

CH-250: Mathematical methods in chemistry

This course consists of two parts. The first part covers basic concepts of molecular symmetry and the application of group theory to describe it. The second part introduces Laplace transforms and Fourier transforms.

PHYS-216: Mathematical methods for physicists

This course complements the Analysis and Linear Algebra courses by providing further mathematical background and practice required for 3rd year physics courses, in particular electrodynamics and quantum mechanics.

Related lectures (189)

Separable Differential Equations

Covers separable differential equations of order 1, defining separable equations and providing examples.

Cauchy Problem: Existence and Uniqueness (MATH-301: Ordinary differential equations)

Explores the Cauchy problem, focusing on the existence and uniqueness of solutions.

MathDetour 1: Separation of time scales (MOOC: Neuronal Dynamics - Computational Neuroscience of Single Neurons)

Explores the concept of separation of time scales in computational neuroscience and the reduction of detail in two-dimensional neuron models.