Lecture

Optimal Power Flow and Chebyshev Introduction

Description

This lecture covers Optimal Power Flow (OPF) solutions, including semidefinite programming (SDP) relaxations, non-convex formulations, and gradient-descent methods. It then introduces Chebyshev's inequality and its applications to probability distributions and concentration inequalities. The instructor, Bahar Taskesen, discusses optimizing over probability distributions using SDP and derives Chebyshev's inequality. The lecture concludes with a numerical example that estimates system safety probabilities and uses SDP to bound probabilities.
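The summary above mentions Chebyshev's inequality as a tool for bounding probabilities. As a minimal illustrative sketch (not taken from the lecture itself), the snippet below checks the inequality P(|X - mu| >= k*sigma) <= 1/k^2 empirically on a uniform sample; the distribution and parameters are arbitrary choices for illustration.

```python
import random
import statistics

# Chebyshev's inequality: for any distribution with finite mean mu and
# standard deviation sigma, P(|X - mu| >= k*sigma) <= 1/k**2.
# We verify the bound empirically on a uniform(0, 1) sample
# (an arbitrary choice; the bound is distribution-free).

random.seed(0)
samples = [random.uniform(0.0, 1.0) for _ in range(100_000)]
mu = statistics.fmean(samples)
sigma = statistics.pstdev(samples)

k = 2.0
# Empirical fraction of samples at least k standard deviations from the mean.
tail = sum(abs(x - mu) >= k * sigma for x in samples) / len(samples)
bound = 1.0 / k**2

print(f"empirical tail probability: {tail:.4f}")
print(f"Chebyshev bound (1/k^2):    {bound:.4f}")
assert tail <= bound  # the bound must hold for any distribution
```

Chebyshev's bound is often loose in practice (here the empirical tail is far below 1/k^2), which is one motivation for the sharper, distribution-tailored bounds obtainable via SDP that the lecture discusses.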

This video is available exclusively on Mediaspace for a restricted audience. Please log in to Mediaspace to access it if you have the necessary permissions.

Watch on Mediaspace