
CADA: Collaborative Auditing for Distributed Aggregation

Type of publication Proceedings (peer-reviewed)
Publication date 2012
Authors Valerio José, Felber Pascal, Rajman Martin, Rivière Etienne
Project MistNet: An Experimental Peer-to-peer Platform for the Cloud


Title of proceedings 9th European Dependable Computing Conference (EDCC)
Place Sibiu, Romania


The aggregation of distributions, each recording the number of occurrences of every element in a set, is an operation that lies at the heart of several large-scale distributed applications. Examples include popularity measurement, recommendation systems, and trust management. These applications typically span multiple administrative domains that do not trust each other and are sensitive to biases in the aggregated distribution: the results can only be trusted if the inserted values were neither altered nor forged, and if the nodes collecting the insertions do not arbitrarily modify the aggregation results. To increase the level of trust that can be granted to such applications, there must be a disincentive for servers to bias the aggregation results. In this paper we present the CADA auditing mechanisms, which let aggregation servers collaboratively and periodically audit one another through probabilistic tests over server-local state. CADA differs from existing work on accountability in that it leverages the nature of the operation performed by the node, rather than a general, application-oblivious model of the computation. The effectiveness of CADA is demonstrated by an experimental evaluation of its ability to detect malevolent behaviors using lightweight auditing oracles.
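To illustrate the setting the abstract describes, the following is a minimal, hypothetical sketch of distribution aggregation together with a probabilistic spot-check audit. It is not CADA's actual protocol (the paper's oracles operate on server-local state across administrative domains); the function names `aggregate` and `audit` and the sampling strategy are illustrative assumptions only.

```python
from collections import Counter
import random

def aggregate(distributions):
    """Merge per-source occurrence distributions (element -> count)
    into one global distribution, as an honest server would."""
    total = Counter()
    for d in distributions:
        total.update(d)
    return total

def audit(reported, distributions, sample_size=2, rng=None):
    """Probabilistic audit sketch: recompute the counts for a random
    sample of elements and compare them against the aggregate the
    server reported. Returns True if no discrepancy is found in the
    sample; a biased server is caught with some probability per audit."""
    rng = rng or random.Random()
    elements = list(reported)
    sample = rng.sample(elements, min(sample_size, len(elements)))
    expected = aggregate(distributions)
    return all(reported[e] == expected[e] for e in sample)
```

Because each audit checks only a sample, a single run may miss a bias, but repeated periodic audits make sustained tampering detectable with high probability while keeping per-audit cost low, which mirrors the "lightweight auditing oracles" idea in the abstract.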