Publication

Retroactive Packet Sampling for Traffic Receipts

Abstract

Is it possible to design a packet-sampling algorithm that prevents the network node that performs the sampling from treating the sampled packets preferentially? We study this problem in the context of designing a "network transparency" system. In this system, networks emit receipts for a small sample of the packets they observe, and a monitor collects these receipts to estimate each network's loss and delay performance. Sampling is a good building block for this system, because it enables a solution that is flexible and combines low resource cost with quantifiable accuracy. The challenge is cheating resistance: when a network's performance is assessed based on the conditions experienced by a small traffic sample, the network has a strong incentive to treat the sampled packets better than the rest. We contribute a sampling algorithm that is provably robust to such prioritization attacks, enables network performance estimation with quantifiable accuracy, and requires minimal resources. We confirm our analysis using real traffic traces.
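The abstract does not spell out the mechanism itself, so the sketch below only illustrates the general shape of such a system: packets are selected by a keyed hash whose key is disclosed only after forwarding (which is what would make the sampling retroactive and hard to game), and a monitor estimates loss by matching receipts collected at a network's ingress and egress. The selection rule, the 1% sampling probability, and the receipt-matching step are illustrative assumptions, not the algorithm from the publication.

```python
import hashlib
import hmac

# Illustrative parameters (assumptions, not from the paper): a 1% sampling
# probability and a secret key disclosed only after packets have been forwarded,
# so a network cannot tell at forwarding time which packets will be sampled.
SAMPLING_PROB = 0.01
HASH_SPACE = 2 ** 32

def is_sampled(packet_bytes: bytes, key: bytes) -> bool:
    """A packet is sampled iff its keyed hash falls below a threshold."""
    digest = hmac.new(key, packet_bytes, hashlib.sha256).digest()
    return int.from_bytes(digest[:4], "big") < SAMPLING_PROB * HASH_SPACE

def estimate_loss(ingress_receipts: set, egress_receipts: set) -> float:
    """Estimate a network's loss rate from receipts for the same sampled
    packets observed at its ingress and egress."""
    if not ingress_receipts:
        return 0.0
    return len(ingress_receipts - egress_receipts) / len(ingress_receipts)
```

In a design along these lines, a network that wanted to prioritize the sampled packets would have to guess the key before it is revealed; treating the sample well is then no easier than treating all traffic well, which is the cheating-resistance property the abstract describes.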

Related concepts (32)
Sampling (statistics)
In statistics, quality assurance, and survey methodology, sampling is the selection of a subset or a statistical sample (termed sample for short) of individuals from within a statistical population to estimate characteristics of the whole population. Statisticians attempt to collect samples that are representative of the population. Sampling has lower costs and faster data collection compared to recording data from the entire population, and thus, it can provide insights in cases where it is infeasible to measure an entire population.
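As a toy illustration of this idea (with synthetic, made-up values), the snippet below estimates a population mean from a 1% sample rather than measuring all 100,000 elements.

```python
import random
import statistics

random.seed(0)
# Synthetic population of 100,000 measurements (illustrative values only).
population = [random.gauss(20, 5) for _ in range(100_000)]

# Measuring a 1% random sample is far cheaper than measuring everything,
# yet its mean closely tracks the population mean.
sample = random.sample(population, k=1_000)
print(f"population mean = {statistics.mean(population):.2f}")
print(f"sample estimate = {statistics.mean(sample):.2f}")
```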
Cluster sampling
In statistics, cluster sampling is a sampling plan used when mutually homogeneous yet internally heterogeneous groupings are evident in a statistical population. It is often used in marketing research. In this sampling plan, the total population is divided into these groups (known as clusters) and a simple random sample of the groups is selected. The elements in each cluster are then sampled. If all elements in each sampled cluster are sampled, then this is referred to as a "one-stage" cluster sampling plan.
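A minimal sketch of such a one-stage plan, using made-up data: whole clusters are drawn at random, and every element inside each selected cluster is then measured.

```python
import random
import statistics

random.seed(0)
# Hypothetical population: 100 clusters (e.g. city blocks) of 200 observations each.
clusters = {b: [random.gauss(50 + b % 5, 10) for _ in range(200)] for b in range(100)}

# One-stage cluster sampling: randomly select 10 whole clusters,
# then keep every element of each selected cluster.
selected = random.sample(sorted(clusters), k=10)
observations = [x for b in selected for x in clusters[b]]
print(f"estimated mean from 10 clusters: {statistics.mean(observations):.2f}")
```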
Simple random sample
In statistics, a simple random sample (or SRS) is a subset of individuals (a sample) chosen from a larger set (a population) in which each individual is chosen randomly, all with the same probability. It is a process of selecting a sample in a random way. In SRS, each subset of k individuals has the same probability of being chosen for the sample as any other subset of k individuals. A simple random sample is an unbiased sampling technique. Simple random sampling is a basic type of sampling and can be a component of other more complex sampling methods.
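In Python, for example, `random.sample` draws such a sample: every subset of size k is equally likely to be returned (the population below is just an illustrative list of labels).

```python
import random

population = list(range(1, 101))       # 100 labelled individuals
srs = random.sample(population, k=10)  # each 10-element subset is equally probable
print(sorted(srs))
```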
Related publications (72)

Comparison of two methods for bioaerosol sampling and characterization in a low-biomass chamber environment

Dusan Licina, Shen Yang, Marouane Merizak, Akila Muthalagu

Bioaerosols are emitted from various sources into the indoor environment and can positively and negatively impact human health. Humans are the major source of bioaerosol emissions indoors, specifically for bacteria. However, efficient sampling to guarantee ...
PERGAMON-ELSEVIER SCIENCE LTD, 2023

Center-aware Adversarial Augmentation for Single Domain Generalization

Mathieu Salzmann, Zhiye Wang

Domain generalization (DG) aims to learn a model from multiple training (i.e., source) domains that can generalize well to the unseen test (i.e., target) data coming from a different distribution. Single domain generalization (SingleDG) has recently emerge ...
IEEE COMPUTER SOC, 2023

Epigenome-wide DNA methylation in externalizing behaviours: A review and combined analysis

Maria del Carmen Sandi Perez, Mandy Meijer

DNA methylation (DNAm) is one of the most frequently studied epigenetic mechanisms facilitating the interplay of genomic and environmental factors, which can contribute to externalizing behaviours and related psychiatric disorders. Previous epigenome-wide ...
PERGAMON-ELSEVIER SCIENCE LTD, 2023
Related MOOCs (10)
Path Integral Methods in Atomistic Modelling
The course provides an introduction to the use of path integral methods in atomistic simulations. The path integral formalism allows to introduce quantum mechanical effects on the equilibrium and (ap
Digital Signal Processing I
Basic signal processing concepts, Fourier analysis and filters. This module can be used as a starting point or a basic refresher in elementary DSP.
