Invariant Distributions: Markov Chains
Related lectures (32)
Markov Chains: Theory and Applications
Covers the theory and applications of Markov chains in modeling random phenomena and decision-making under uncertainty.
Markov Chains and Algorithm Applications
Covers Markov chains and their applications in algorithms, focusing on Markov Chain Monte Carlo sampling and the Metropolis-Hastings algorithm.
Stochastic Simulation: Theory of Markov Chains
Covers the theory of Markov chains, focusing on reversible chains and detailed balance.
Markov Chains: Ergodicity and Stationary Distribution
Explores ergodicity and stationary distribution in Markov chains, emphasizing convergence properties and unique distributions.
Limiting Distribution and Ergodic Theorem
Explores limiting distribution in Markov chains and the implications of ergodicity and aperiodicity on stationary distributions.
Ergodic Theorem: Proof and Applications
Explains the proof of the ergodic theorem and the concept of positive recurrence in Markov chains.
Recurrence and Transience: Markov Chains
Explores recurrence and transience in Markov chains, discussing the strong Markov property and state classifications.
Markov Chains: PageRank Algorithm
Explores the PageRank algorithm within Markov chains, emphasizing ergodicity and convergence for web page ranking.
MCMC Examples and Error Estimation
Covers Markov Chain Monte Carlo examples and error estimation methods.
Stochastic Processes: Time Reversal
Explores time reversal in stationary Markov chains and the concept of detailed balance conditions.
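To illustrate the invariant (stationary) distribution concept that this lecture and the related lectures above revolve around, here is a minimal sketch, not taken from any of the lectures: it uses a made-up 3-state transition matrix P and computes the distribution pi satisfying pi P = pi, then checks convergence to pi from an arbitrary starting distribution.

```python
# Minimal sketch: invariant distribution of a finite Markov chain.
# The transition matrix P below is a hypothetical example; rows sum to 1.
import numpy as np

P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

# The stationary distribution is the left eigenvector of P for eigenvalue 1,
# normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()

print("stationary distribution:", pi)
print("pi P == pi ?", np.allclose(pi @ P, pi))

# For an ergodic (irreducible, aperiodic) chain, iterating the transition
# matrix from any initial distribution converges to pi (the limiting view).
mu = np.array([1.0, 0.0, 0.0])
for _ in range(100):
    mu = mu @ P
print("after 100 steps:", mu)
```

The eigenvector computation and the power-iteration check are two views of the same fact: for an ergodic chain the invariant distribution is unique and is also the limiting distribution.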