Discrete-Time Markov Chains: Definitions
Related lectures (32)
Probability & Stochastic Processes
Covers applied probability, stochastic processes, Markov chains, rejection sampling, and Bayesian inference methods.
Continuous-Time Markov Chains: Definitions and State Probabilities
Covers the definitions and state probabilities of continuous-time Markov chains.
Markov Chains: Theory and Applications
Covers the theory and applications of Markov chains in modeling random phenomena and decision-making under uncertainty.
Lindblad equation
Covers the interpretation of the Lindblad equation and its unitary part in quantum gases.
Continuous-Time Markov Chains: Definitions and State Probabilities
Covers definitions and state probabilities of continuous-time Markov chains for communications.
NISQ and IBM Q
Explores NISQ devices and IBM Q, covering noisy quantum circuits, qubit technologies, and quantum algorithm development.
Discrete-Time Markov Chains: Absorbing Chains Examples
Explores examples of absorbing chains in discrete-time Markov chains, focusing on transition probabilities.
Markov Chains: Ergodicity and Stationary Distribution
Explores ergodicity and stationary distribution in Markov chains, emphasizing convergence properties and unique distributions.
Expected Number of Visits in State
Covers the criterion for recurrence in infinite chains based on the expected number of visits to a state.
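Several of the lectures above refer to stationary distributions and convergence for discrete-time Markov chains. The following is a minimal illustrative sketch, not taken from any of the listed lectures: it uses a hypothetical two-state transition matrix to compute a stationary distribution as a left eigenvector and to check convergence by powering the matrix.

import numpy as np

# Hypothetical two-state transition matrix (rows sum to 1);
# the values are illustrative only.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# A stationary distribution pi satisfies pi P = pi with sum(pi) = 1,
# i.e. pi is a left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
stationary = np.real(eigvecs[:, np.isclose(eigvals, 1.0)][:, 0])
stationary /= stationary.sum()
print("stationary distribution:", stationary)

# Sanity check: for a finite, irreducible, aperiodic (ergodic) chain,
# the rows of P^n converge to the stationary distribution.
print("P^50:\n", np.linalg.matrix_power(P, 50))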