Lecture

Privacy-preserving data publishing: K-anonymity and l-Diversity

Description

This lecture covers k-anonymity, database sanitization, and l-diversity in the context of privacy-preserving data publishing. It discusses the challenges and weaknesses of k-anonymity and introduces l-diversity as a remedy. The presentation then examines the limitations of l-diversity itself, highlighting the importance of considering the overall distribution and semantics of sensitive values. Real-life examples, such as the Netflix dataset release, illustrate the failures of naive de-identification and the risks posed by sparse high-dimensional data. The lecture concludes by examining the case of Airbnb's data privacy efforts and the potential vulnerabilities in their approach.
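As a minimal sketch of the two metrics discussed above: a table is k-anonymous if every combination of quasi-identifier values (e.g. generalized ZIP code and age) appears in at least k records, and it is l-diverse if every such equivalence class contains at least l distinct sensitive values. The toy records, column choices, and function names below are illustrative assumptions, not material from the lecture.

```python
from collections import Counter, defaultdict

# Hypothetical sanitized records: (quasi-identifiers, sensitive value).
# Quasi-identifiers here are a generalized ZIP code and an age range.
records = [
    (("4767*", "20-29"), "Heart disease"),
    (("4767*", "20-29"), "Viral infection"),
    (("4767*", "20-29"), "Cancer"),
    (("4760*", "20-29"), "Heart disease"),
    (("4760*", "20-29"), "Heart disease"),
    (("4760*", "20-29"), "Cancer"),
]

def k_anonymity(records):
    """Size of the smallest equivalence class over the quasi-identifiers."""
    counts = Counter(qi for qi, _ in records)
    return min(counts.values())

def l_diversity(records):
    """Smallest number of distinct sensitive values in any equivalence class."""
    groups = defaultdict(set)
    for qi, sensitive in records:
        groups[qi].add(sensitive)
    return min(len(values) for values in groups.values())

print(k_anonymity(records))  # each quasi-identifier combination occurs 3 times
print(l_diversity(records))  # one class has only 2 distinct diagnoses
```

Note that the second check captures the weakness motivating l-diversity: a table can be 3-anonymous while one equivalence class still carries only two distinct sensitive values, and this simple count ignores the distribution and semantics of those values, which is exactly the gap the lecture highlights.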

Related lectures (76)
Privacy: Threat Modeling and Differential Privacy
Explores the significance of privacy, risks of targeted advertising, and the concept of differential privacy.
Privacy technologies and data protection
Explores privacy importance, PETS, anti-surveillance technologies, and the SwissCovid case study.
Privacy-preserving data publishing
Explores privacy risks in data publishing, failed de-identification attempts, and the use of synthetic data for privacy protection.
Privacy Technologies and Data Protection
Explores privacy technologies, data protection, surveillance risks, and Privacy Enhancing Technologies for social and institutional privacy.
De-anonymization: Risks and Techniques
Explores risks and techniques in de-anonymizing data, including flaws in methods and real-world examples of failed attempts.
