{"title":"随机化协议的子线性消息边界","authors":"John E. Augustine, A. R. Molla, Gopal Pandurangan","doi":"10.1145/3212734.3212751","DOIUrl":null,"url":null,"abstract":"This paper focuses on understanding the message complexity of randomized agreement in synchronous distributed networks. We focus on the so-called implicit agreement problem where each node starts with an input value (0 or 1) and at the end one or more nodes should decide on a common input value which should be equal to some node's input value (there can be undecided nodes). Implicit agreement is a generalization of the fundamental agreement and leader election problems. We present sublinear (in n, where n is the number of nodes) algorithms and lower bounds on the message complexity of implicit agreement in fully-connected (i.e., complete) networks. Specifically our main results are: We show that for any implicit agreement algorithm that succeeds with probability at least 1 - ε, for some suitably small constant ε > 0, needs at least Ω(n0.5) messages with constant probability. This bound holds regardless of the number of rounds used and applies to both LOCAL and CONGEST models. This lower bound is essentially tight for complete networks, as there exists a randomized agreement algorithm that uses only Õ (n0.5) messages1 with high probability2 and runs inO(1) rounds and succeeds with high probability. Both the upper and lower bounds assume that nodes have access to (only) private coins. In contrast to the above bounds, if nodes have access to an unbiased global (shared) coin, we present a randomized algorithm which, with high probability, achieves implicit agreement, and uses Õ (n0.4) messages in expectation and runs in O(1) rounds (deterministically). This algorithm works in the CONGEST model as well. Our result shows the power of a global coin in significantly improving (by a polynomial factor) the message complexity of agreement. As another contrast, we show that the same benefit does not apply to leader election, i.e., even with access to a global coin, Ω(n0.5) messages (in expectation) are needed for any leader election algorithm that succeeds with probability at least 1 - ε, for a small constant ε > 0. We extend our results to a natural generalization of agreement called as subset agreement where a given (non-empty) subset of nodes should agree on a common value. We show that subset agreement on a subset of size k nodes can be accomplished by a randomized algorithm that succeeds with high probability, and uses (in expecation) Õ (min{kn0.5,n}) (using only private coins) and Õ (min{kn0.4,n}) messages (using global coin) respectively","PeriodicalId":198284,"journal":{"name":"Proceedings of the 2018 ACM Symposium on Principles of Distributed Computing","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2018-07-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Sublinear Message Bounds for Randomized Agreement\",\"authors\":\"John E. Augustine, A. R. Molla, Gopal Pandurangan\",\"doi\":\"10.1145/3212734.3212751\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper focuses on understanding the message complexity of randomized agreement in synchronous distributed networks. We focus on the so-called implicit agreement problem where each node starts with an input value (0 or 1) and at the end one or more nodes should decide on a common input value which should be equal to some node's input value (there can be undecided nodes). 
Implicit agreement is a generalization of the fundamental agreement and leader election problems. We present sublinear (in n, where n is the number of nodes) algorithms and lower bounds on the message complexity of implicit agreement in fully-connected (i.e., complete) networks. Specifically our main results are: We show that for any implicit agreement algorithm that succeeds with probability at least 1 - ε, for some suitably small constant ε > 0, needs at least Ω(n0.5) messages with constant probability. This bound holds regardless of the number of rounds used and applies to both LOCAL and CONGEST models. This lower bound is essentially tight for complete networks, as there exists a randomized agreement algorithm that uses only Õ (n0.5) messages1 with high probability2 and runs inO(1) rounds and succeeds with high probability. Both the upper and lower bounds assume that nodes have access to (only) private coins. In contrast to the above bounds, if nodes have access to an unbiased global (shared) coin, we present a randomized algorithm which, with high probability, achieves implicit agreement, and uses Õ (n0.4) messages in expectation and runs in O(1) rounds (deterministically). This algorithm works in the CONGEST model as well. Our result shows the power of a global coin in significantly improving (by a polynomial factor) the message complexity of agreement. As another contrast, we show that the same benefit does not apply to leader election, i.e., even with access to a global coin, Ω(n0.5) messages (in expectation) are needed for any leader election algorithm that succeeds with probability at least 1 - ε, for a small constant ε > 0. We extend our results to a natural generalization of agreement called as subset agreement where a given (non-empty) subset of nodes should agree on a common value. We show that subset agreement on a subset of size k nodes can be accomplished by a randomized algorithm that succeeds with high probability, and uses (in expecation) Õ (min{kn0.5,n}) (using only private coins) and Õ (min{kn0.4,n}) messages (using global coin) respectively\",\"PeriodicalId\":198284,\"journal\":{\"name\":\"Proceedings of the 2018 ACM Symposium on Principles of Distributed Computing\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-07-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2018 ACM Symposium on Principles of Distributed Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3212734.3212751\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2018 ACM Symposium on Principles of Distributed Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3212734.3212751","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
This paper focuses on understanding the message complexity of randomized agreement in synchronous distributed networks. We study the so-called implicit agreement problem, where each node starts with an input value (0 or 1) and, at the end, one or more nodes must decide on a common value that equals some node's input value (the remaining nodes may stay undecided). Implicit agreement is a generalization of the fundamental agreement and leader election problems. We present sublinear (in n, where n is the number of nodes) algorithms and lower bounds on the message complexity of implicit agreement in fully-connected (i.e., complete) networks. Specifically, our main results are:

- Any implicit agreement algorithm that succeeds with probability at least 1 - ε, for some suitably small constant ε > 0, needs at least Ω(n^0.5) messages with constant probability. This bound holds regardless of the number of rounds used and applies to both the LOCAL and CONGEST models. The lower bound is essentially tight for complete networks, as there exists a randomized agreement algorithm that uses only Õ(n^0.5) messages with high probability, runs in O(1) rounds, and succeeds with high probability. Both the upper and lower bounds assume that nodes have access to (only) private coins.
- In contrast to the above bounds, if nodes have access to an unbiased global (shared) coin, we present a randomized algorithm which, with high probability, achieves implicit agreement, uses Õ(n^0.4) messages in expectation, and runs in O(1) rounds (deterministically). This algorithm works in the CONGEST model as well. Our result shows the power of a global coin in significantly improving (by a polynomial factor) the message complexity of agreement.
- As another contrast, we show that the same benefit does not apply to leader election: even with access to a global coin, Ω(n^0.5) messages (in expectation) are needed for any leader election algorithm that succeeds with probability at least 1 - ε, for a small constant ε > 0.
- We extend our results to a natural generalization of agreement called subset agreement, where a given (non-empty) subset of nodes should agree on a common value. Subset agreement on a subset of k nodes can be accomplished by a randomized algorithm that succeeds with high probability and uses, in expectation, Õ(min{k·n^0.5, n}) messages using only private coins and Õ(min{k·n^0.4, n}) messages using a global coin, respectively.
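The abstract does not spell out the algorithms themselves, but the flavor of an Õ(n^0.5)-message, O(1)-round, private-coin scheme can be conveyed with a toy simulation. The sketch below is an illustration only and not the paper's algorithm: nodes self-nominate as candidates with probability roughly c·ln(n)/n, each candidate pushes its (id, input) pair to about sqrt(n·ln n) randomly sampled nodes, sampled nodes report back the smallest candidate id they heard from, and every candidate decides on that candidate's input. With samples of size Θ(sqrt(n log n)), any two candidates' samples intersect with high probability (a birthday-paradox argument), so all candidates learn the globally smallest candidate, and the expected message count is Θ(log n)·Θ(sqrt(n log n)) = Õ(n^0.5). All names, constants, and the message accounting here are assumptions made for illustration.

```python
import math
import random

def simulate_implicit_agreement(n, inputs, seed=None):
    """Toy simulation of a sampling-based implicit-agreement protocol.

    Illustrative sketch only (not the algorithm from the paper):
    - each node becomes a candidate with probability ~ c*ln(n)/n,
      so Theta(log n) candidates arise with high probability;
    - every candidate sends its (id, input) to ~ sqrt(n*ln n) random nodes;
    - any two such samples intersect w.h.p. (birthday paradox), so sampled
      nodes can report back the smallest candidate id they have heard;
    - all candidates then decide on the input of the smallest candidate id.
    Total messages are O~(sqrt(n)) in expectation and only two rounds are used.
    """
    rng = random.Random(seed)
    c = 3.0
    p_candidate = min(1.0, c * math.log(n) / n)
    sample_size = min(n, int(2 * math.sqrt(n * math.log(n))) + 1)

    candidates = [v for v in range(n) if rng.random() < p_candidate]
    if not candidates:                      # fails only with polynomially small probability
        return {}, 0

    messages = 0
    heard = {v: [] for v in range(n)}       # (candidate_id, candidate_input) pairs seen by node v
    samples = {}

    # Round 1: each candidate contacts a random sample of nodes.
    for cand in candidates:
        sample = rng.sample(range(n), sample_size)
        samples[cand] = sample
        for v in sample:
            heard[v].append((cand, inputs[cand]))
            messages += 1

    # Round 2: each sampled node replies with the smallest candidate id it heard of.
    best_seen = {cand: (cand, inputs[cand]) for cand in candidates}
    for cand in candidates:
        for v in samples[cand]:
            report = min(heard[v])          # (smallest id, that candidate's input)
            messages += 1
            best_seen[cand] = min(best_seen[cand], report)

    # Decision: every candidate adopts the input of the smallest id it learned;
    # non-candidates stay undecided, which implicit agreement allows.
    decisions = {cand: best_seen[cand][1] for cand in candidates}
    return decisions, messages

if __name__ == "__main__":
    n = 10_000
    inputs = [random.randint(0, 1) for _ in range(n)]
    decisions, msgs = simulate_implicit_agreement(n, inputs, seed=1)
    print(f"deciders={len(decisions)}, decided values={set(decisions.values())}, "
          f"messages={msgs}, n={n}, sqrt(n)={int(math.sqrt(n))}")
```

Running the sketch for n = 10,000 typically shows a few dozen deciding nodes, a single common decided value, and a message count of a few tens of thousands, i.e., sqrt(n) up to polylogarithmic factors, which is consistent with the Õ(n^0.5) private-coin upper bound stated above.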