Covers the principles of synchronization in parallel computing, focusing on shared-memory synchronization and mechanisms such as locks and barriers.
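As a minimal sketch of the two primitives named above, the following POSIX-threads example uses a mutex to protect a shared counter and a barrier so no thread continues until all have updated it (the thread count and variable names are illustrative, not from the original material):

```c
#include <pthread.h>
#include <stdio.h>

#define NTHREADS 4

static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static pthread_barrier_t barrier;
static long counter = 0;

static void *worker(void *arg) {
    /* Lock: serialize access to the shared counter. */
    pthread_mutex_lock(&lock);
    counter++;
    pthread_mutex_unlock(&lock);

    /* Barrier: no thread proceeds until every thread has incremented. */
    pthread_barrier_wait(&barrier);
    return NULL;
}

int main(void) {
    pthread_t tid[NTHREADS];
    pthread_barrier_init(&barrier, NULL, NTHREADS);
    for (int i = 0; i < NTHREADS; i++)
        pthread_create(&tid[i], NULL, worker, NULL);
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(tid[i], NULL);
    printf("counter = %ld\n", counter);  /* always NTHREADS */
    pthread_barrier_destroy(&barrier);
    return 0;
}
```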
Explores memory consistency and cache coherence, contrasting sequential and weak consistency models, and emphasizes language-level memory models and data-race-free programming.
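A small data-race-free sketch using C11 atomics (the flag/payload names are illustrative): the default sequentially consistent atomic operations order the payload write before the flag write, so the consumer never observes a stale payload.

```c
#include <stdatomic.h>
#include <pthread.h>
#include <stdio.h>

static int payload;              /* ordinary (non-atomic) data */
static atomic_int ready = 0;     /* synchronization flag */

static void *producer(void *arg) {
    payload = 42;
    /* Sequentially consistent store: the payload write is visible
       before any thread can see ready == 1. */
    atomic_store(&ready, 1);
    return NULL;
}

static void *consumer(void *arg) {
    /* The atomic load prevents the payload read from being
       reordered above the flag check. */
    while (atomic_load(&ready) == 0)
        ;
    printf("payload = %d\n", payload);  /* prints 42; no data race */
    return NULL;
}

int main(void) {
    pthread_t p, c;
    pthread_create(&c, NULL, consumer, NULL);
    pthread_create(&p, NULL, producer, NULL);
    pthread_join(p, NULL);
    pthread_join(c, NULL);
    return 0;
}
```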
Explores transactional memory and how hardware support can simplify concurrency control in software, emphasizing the benefits of hardware speculation and declarative concurrency.
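One way to express the declarative style is GCC's experimental transactional-memory extension (compiled with `-fgnu-tm`); this is a sketch of the idea rather than the specific system discussed in the section, and the account/transfer names are made up. The programmer states only that the block is atomic; the runtime may execute it with hardware speculation or fall back to a software implementation.

```c
/* Compile with: gcc -fgnu-tm transfer.c */
#include <stdio.h>

static long balance_a = 100, balance_b = 0;

void transfer(long amount) {
    /* The block commits atomically: either both updates become
       visible to other threads or neither does. */
    __transaction_atomic {
        balance_a -= amount;
        balance_b += amount;
    }
}

int main(void) {
    transfer(40);
    printf("a = %ld, b = %ld\n", balance_a, balance_b);
    return 0;
}
```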
Covers the basics of parallel programming, including concurrency, forms of parallelism, synchronization, and programming models such as PThreads and OpenMP.
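For the OpenMP model, a minimal data-parallel loop might look like the following (array size and variable names are illustrative); the `parallel for` directive splits iterations across threads and the `reduction` clause synchronizes the shared accumulator.

```c
/* Compile with: gcc -fopenmp sum.c */
#include <omp.h>
#include <stdio.h>

#define N 1000000

int main(void) {
    static double a[N];
    double sum = 0.0;

    /* Iterations are divided among threads; each thread keeps a
       private partial sum that is combined at the end. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++) {
        a[i] = i * 0.5;
        sum += a[i];
    }

    printf("sum = %f (max threads: %d)\n", sum, omp_get_max_threads());
    return 0;
}
```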