This lecture introduces the principles of parallel computing, motivated by the performance limits of single-core processors. It covers the basics of OpenMP, a high-level API for shared-memory programming, and walks through examples of turning serial code into concurrent code with OpenMP directives. Topics include the division of work, communication, synchronization, and execution models in parallel programming; the main communication models (shared memory, message passing, and data parallelism); shared vs. private variables; and synchronization techniques such as critical sections and barriers.