This lecture covers topics related to Optimal Power Flow (OPF), including semidefinite programming (SDP), non-convex problems, and gradient descent methods. It also introduces Chebyshev's inequality and its applications to probability distributions and concentration inequalities. The instructor, Bahar Taskesen, discusses optimization over probability distributions using SDP and the derivation of Chebyshev's inequality. The lecture concludes with a numerical example that estimates a system safety probability and uses SDP to bound it.
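As a minimal illustration of the bounding idea mentioned above (not the lecture's SDP formulation), the following sketch applies Chebyshev's inequality, P(|X − μ| ≥ kσ) ≤ 1/k², to bound a deviation probability; the numbers and the "line flow" framing are illustrative assumptions, not taken from the lecture.

```python
def chebyshev_bound(mu: float, sigma: float, threshold: float) -> float:
    """Upper bound on P(|X - mu| >= threshold) for any distribution
    with mean mu and standard deviation sigma (Chebyshev's inequality)."""
    if threshold <= 0 or sigma <= 0:
        raise ValueError("threshold and sigma must be positive")
    k = threshold / sigma          # deviation measured in standard deviations
    return min(1.0, 1.0 / k ** 2)  # bound is vacuous (1.0) when k <= 1

# Hypothetical example: a quantity with mean 80 and std 5; bound the
# probability of deviating from the mean by 15 or more (k = 3).
print(chebyshev_bound(80.0, 5.0, 15.0))  # 1/9 ~= 0.111
```

SDP-based moment bounds, as discussed in the lecture, can tighten this distribution-free estimate by optimizing over all distributions matching the given moments.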
This video is available exclusively on MediaSpace for a restricted audience. Please log in to MediaSpace to access it if you have the necessary permissions.