Handling probabilistic integrity constraints in pay-as-you-go reconciliation of data models
Related publications (44)
CLEF-HIPE-2020 (Identifying Historical People, Places and other Entities) is an evaluation campaign on named entity processing in historical newspapers in French, German and English, which was organized in the context of the impresso project and run as a CL ...
Independent modeling of various modules of an information system (IS), and consequently database subschemas, may result in formal or semantic conflicts between the modules being modeled. Such conflicts may cause collisions between the integrated database s ...
The machine tool data model of STEP-NC (ISO 14649) was conceived as a necessary extension to the original STEP-NC set of standards to make efficient control possible. The intention of this paper is to describe the background to the data model as well as re ...
This article investigates the evolution of data quality issues from traditional structured data managed in relational databases to Big Data. In particular, the paper examines the nature of the relationship between Data Quality and several research coordina ...
Industry and academia are continuously becoming more data-driven and data-intensive, relying on the analysis of a wide variety of datasets to gain insights. At the same time, data variety increases continuously across multiple axes. First, data comes in mu ...
The cities in which we live are constantly evolving. The active management of this evolution is referred to as urban planning. The corresponding development process can go in many directions, resulting in a large number of potential future scenarios of a city ...
The article investigates the potential role of conceptual modeling for policymaking. It argues that the use of conceptual schemas may provide an effective understanding of public sector information assets, and how they might be used to satisfy the needs of ...
In 2020, EPFL Library conducted a study about Tools and Metadata Standards practice in EPFL School of Life Sciences. By standard, we mean: - terminological resources (vocabularies, terminologies, classifications, thesauri), - formats and data models / sche ...
Recent years have seen an exponential increase in the amount of data available in all sciences and application domains. Macroecology is part of this "Big Data" trend, with a strong rise in the volume of data that we are using for our research. Here, we sum ...
Conceptual models such as database schemas, ontologies or process models have been established as a means for effective engineering of information systems. Yet, for complex systems, conceptual models are created by a variety of stakeholders, which calls fo ...