TY - GEN
T1 - On the asymptotic scalability of the consensus algorithm
AU - Yildiz, Mehmet E.
AU - Scaglione, Anna
PY - 2007/12/1
Y1 - 2007/12/1
N2 - Average consensus algorithms are gossip protocols for averaging the original measurements taken at different sensors. Without any communication rate restrictions, the algorithms ideally allow every node state to converge to the initial average after a number of iterations. Noting that brute-force quantization is highly suboptimal given the rich temporal and spatial correlation of the messages exchanged, in our previous work we proposed two source coding methods, predictive coding and Wyner-Ziv coding, which achieve convergence with vanishing quantization rates in the case of block coding. Both methods ideally require complete knowledge of the network parameters and topology, as well as processing and storage of past state values. Requiring such knowledge of the network parameters is not practical, especially considering that average consensus algorithms are meant to be decentralized. In this study we show that, as the node density increases or in homogeneously distributed networks, the encoder and decoder parameters become independent of the network size and of the specific node locations. We also lower bound the error performance of the predictive coding scheme in terms of the eigenvalues of the connectivity matrix and the initial state covariance matrix. We show the relation between the MSE behavior of the algorithm, the network connectivity, and the quantization rate.
AB - Average consensus algorithms are gossip protocols for averaging the original measurements taken at different sensors. Without any communication rate restrictions, the algorithms ideally allow every node state to converge to the initial average after a number of iterations. Noting that brute-force quantization is highly suboptimal given the rich temporal and spatial correlation of the messages exchanged, in our previous work we proposed two source coding methods, predictive coding and Wyner-Ziv coding, which achieve convergence with vanishing quantization rates in the case of block coding. Both methods ideally require complete knowledge of the network parameters and topology, as well as processing and storage of past state values. Requiring such knowledge of the network parameters is not practical, especially considering that average consensus algorithms are meant to be decentralized. In this study we show that, as the node density increases or in homogeneously distributed networks, the encoder and decoder parameters become independent of the network size and of the specific node locations. We also lower bound the error performance of the predictive coding scheme in terms of the eigenvalues of the connectivity matrix and the initial state covariance matrix. We show the relation between the MSE behavior of the algorithm, the network connectivity, and the quantization rate.
KW - Communication systems
KW - Distributed algorithms
KW - Networks
UR - http://www.scopus.com/inward/record.url?scp=47849100425&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=47849100425&partnerID=8YFLogxK
U2 - 10.1109/SSP.2007.4301338
DO - 10.1109/SSP.2007.4301338
M3 - Conference contribution
AN - SCOPUS:47849100425
SN - 142441198X
SN - 9781424411986
T3 - IEEE Workshop on Statistical Signal Processing Proceedings
SP - 645
EP - 649
BT - 2007 IEEE/SP 14th Workshop on Statistical Signal Processing, SSP 2007, Proceedings
T2 - 2007 IEEE/SP 14th Workshop on Statistical Signal Processing, SSP 2007
Y2 - 26 August 2007 through 29 August 2007
ER -