The Web has become a central medium and a valuable source for information fusion applications. However, user-generated resources are often plagued by inaccuracies and misinformation due to the inherent openness and uncertainty of the Web. While finding objective data is non-trivial, assessing its credibility with high confidence is even harder because of conflicting information across Web sources. In this work, we consider the novel setting of fusing factual data from the Web with a credibility guarantee and maximal recall: not only should as much information as possible be extracted, but its credibility must also satisfy a threshold requirement. To this end, we formulate the problem of instantiating a maximal set of factual information whose precision exceeds a pre-defined threshold. Our approach is a learning process that optimizes the parameters of a probabilistic model capturing the relationships between data sources, their contents, and the underlying factual information. The model searches for the best parameters automatically, without any training data. Upon convergence, the parameters are used to instantiate as much factual information as possible with a precision guarantee. Evaluations on real-world datasets show that our approach outperforms the baselines by up to 6 times.
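To make the setting concrete, here is a minimal sketch of the general idea, not the paper's actual model: an EM-style fixed point that jointly estimates source reliability and fact correctness without training data, then keeps only the facts whose estimated probability clears a precision threshold. The function name, the uniform prior, the initial accuracy of 0.8, and the iteration count are all illustrative assumptions.

```python
# Illustrative sketch only: a simple unsupervised truth-discovery loop
# with a credibility threshold, NOT the paper's probabilistic model.
from collections import defaultdict

def fuse_with_guarantee(claims, threshold=0.9, prior=0.5, iters=50):
    """claims: iterable of (source, fact) pairs.
    Returns the facts whose estimated correctness probability
    is at least `threshold` (the precision guarantee)."""
    facts_by_source = defaultdict(set)
    sources_by_fact = defaultdict(set)
    for src, fact in claims:
        facts_by_source[src].add(fact)
        sources_by_fact[fact].add(src)

    accuracy = {src: 0.8 for src in facts_by_source}  # initial guess (assumption)
    belief = {}
    for _ in range(iters):
        # E-step: a fact's correctness follows from the reliability
        # of the sources asserting it (naive-Bayes-style odds update).
        for fact, srcs in sources_by_fact.items():
            odds = prior / (1.0 - prior)
            for src in srcs:
                a = min(max(accuracy[src], 1e-6), 1 - 1e-6)  # clamp extremes
                odds *= a / (1.0 - a)
            belief[fact] = odds / (1.0 + odds)
        # M-step: a source's reliability is the mean belief of its claims.
        for src, facts in facts_by_source.items():
            accuracy[src] = sum(belief[f] for f in facts) / len(facts)

    return {f for f, p in belief.items() if p >= threshold}

claims = [("s1", "Paris is the capital of France"),
          ("s2", "Paris is the capital of France"),
          ("s3", "Lyon is the capital of France")]
# Only the well-corroborated fact survives the 0.9 threshold.
print(fuse_with_guarantee(claims, threshold=0.9))
```

In this toy run, the fact asserted by two sources converges to a belief near 1 and is retained, while the conflicting single-source fact stays at 0.8 and is dropped; raising the threshold trades recall for precision, which is the tension the paper's formulation addresses.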