This lecture covers the principles of parallel computing, focusing on the role of parallelism in achieving higher performance. It discusses the challenges of the multicore era, parallel execution models, and communication models such as shared memory, message passing, and data parallelism. The instructor introduces the OpenMP framework, showing how code can be parallelized with compiler directives. The lecture emphasizes the distinction between shared and private variables, synchronization mechanisms such as critical sections and barriers, and the fork-join execution model of OpenMP. It concludes by summarizing OpenMP as a tool for deriving concurrent code from serial code, highlighting its scalability and portability.
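The lecture's own examples are not reproduced in this summary. As a rough illustration of the ideas it mentions, the following is a minimal OpenMP sketch in C: a parallel loop created with a compiler directive, a shared accumulator updated inside a critical section, per-thread private variables, and an explicit barrier. The array name `data`, its size `N`, and the summation task are illustrative assumptions, not taken from the lecture.

```c
#include <stdio.h>
#include <omp.h>

#define N 1000000

int main(void) {
    static double data[N];
    double total = 0.0;          /* shared across all threads */

    /* Serial initialization (illustrative data, not from the lecture). */
    for (int i = 0; i < N; i++)
        data[i] = 1.0 / (i + 1);

    /* Fork a team of threads: 'data' and 'total' are shared,
       'local' and the loop index are private to each thread. */
    #pragma omp parallel
    {
        double local = 0.0;      /* per-thread partial sum (private) */

        /* The loop iterations are divided among the threads. */
        #pragma omp for
        for (int i = 0; i < N; i++)
            local += data[i];

        /* Critical section: one thread at a time updates 'total'. */
        #pragma omp critical
        total += local;

        /* Barrier: wait until every thread has contributed. */
        #pragma omp barrier

        /* One thread prints the result after the barrier. */
        #pragma omp single
        printf("sum = %f (computed by %d threads)\n",
               total, omp_get_num_threads());
    }   /* implicit join at the end of the parallel region */
    return 0;
}
```

In practice the same result is usually obtained more idiomatically with a `reduction(+:total)` clause on the loop directive; the explicit critical section and barrier are spelled out here only to mirror the synchronization concepts the summary lists.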