DC Field | Value | Language |
---|---|---|
dc.contributor.author | Jang, Hyeryung | ko |
dc.contributor.author | Song, Hyungseok | ko |
dc.contributor.author | Yi, Yung | ko |
dc.date.accessioned | 2022-06-26T01:01:28Z | - |
dc.date.available | 2022-06-26T01:01:28Z | - |
dc.date.created | 2022-02-06 | - |
dc.date.issued | 2022-06 | - |
dc.identifier.citation | IEEE-ACM TRANSACTIONS ON NETWORKING, v.30, no.3, pp.1382 - 1394 | - |
dc.identifier.issn | 1063-6692 | - |
dc.identifier.uri | http://hdl.handle.net/10203/297074 | - |
dc.description.abstract | In this paper, we consider the problem of learning a tree graph structure that represents the statistical data dependency among nodes, given a set of data samples generated by those nodes; this structure provides the basis for probabilistic inference tasks. Inference on the data graph includes marginal inference and maximum a posteriori (MAP) estimation, and belief propagation (BP) is a commonly used algorithm that computes the marginal distributions of nodes via message-passing, incurring a non-negligible communication cost. Because the learned data-dependency structure and the physical connectivity graph often differ substantially, there is an inevitable trade-off between inference accuracy and message-passing cost. We formalize this trade-off as an optimization problem whose output is a data dependency graph that jointly accounts for learning accuracy and message-passing cost. We focus on two popular implementations of BP, ASYNC-BP and SYNC-BP, which have different message-passing mechanisms and cost structures. For ASYNC-BP, we propose an optimal polynomial-time learning algorithm, motivated by finding a maximum weight spanning tree of a complete graph. For SYNC-BP, we prove that the problem is NP-hard and propose a greedy heuristic. For both BP implementations, we use the large deviation principle to quantify how the probability that the learned cost-efficient data graph differs from the ideal one decays as the number of data samples grows, which provides a guideline on how many samples are needed to achieve a given trade-off. We validate our theoretical findings through extensive simulations, which confirm a good match with our analysis. | - |
dc.language | English | - |
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | - |
dc.title | On Cost-Efficient Learning of Data Dependency | - |
dc.type | Article | - |
dc.identifier.wosid | 000745465600001 | - |
dc.identifier.scopusid | 2-s2.0-85123358934 | - |
dc.type.rims | ART | - |
dc.citation.volume | 30 | - |
dc.citation.issue | 3 | - |
dc.citation.beginningpage | 1382 | - |
dc.citation.endingpage | 1394 | - |
dc.citation.publicationname | IEEE-ACM TRANSACTIONS ON NETWORKING | - |
dc.identifier.doi | 10.1109/TNET.2022.3141128 | - |
dc.contributor.localauthor | Yi, Yung | - |
dc.contributor.nonIdAuthor | Jang, Hyeryung | - |
dc.description.isOpenAccess | N | - |
dc.type.journalArticle | Article | - |
dc.subject.keywordAuthor | Costs | - |
dc.subject.keywordAuthor | Distributed databases | - |
dc.subject.keywordAuthor | Inference algorithms | - |
dc.subject.keywordAuthor | Graphical models | - |
dc.subject.keywordAuthor | Task analysis | - |
dc.subject.keywordAuthor | Data models | - |
dc.subject.keywordAuthor | Tree graphs | - |
dc.subject.keywordAuthor | Graph structure learning | - |
dc.subject.keywordAuthor | distributed inference | - |
dc.subject.keywordAuthor | sample complexity | - |
dc.subject.keywordAuthor | large deviation principle | - |
dc.subject.keywordAuthor | belief propagation | - |
dc.subject.keywordPlus | BELIEF-PROPAGATION | - |
dc.subject.keywordPlus | DISTRIBUTED INFERENCE | - |
dc.subject.keywordPlus | TRACKING | - |
dc.subject.keywordPlus | PRODUCT | - |
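The abstract notes that the ASYNC-BP learning algorithm is motivated by finding a maximum weight spanning tree of a complete graph. The classical accuracy-only instance of this idea is the Chow-Liu procedure, which weights each pair of nodes by their empirical mutual information and keeps the heaviest spanning tree. A minimal sketch in Python follows; it omits the paper's message-passing cost terms, and all function names are illustrative, not taken from the paper:

```python
import itertools
import math
from collections import Counter

def empirical_mutual_information(xs, ys):
    """Empirical mutual information (in nats) between two discrete sample columns."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))  # joint counts
    px = Counter(xs)            # marginal counts
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log( p(x,y) / (p(x) p(y)) ), rewritten with raw counts
        mi += (c / n) * math.log(c * n / (px[x] * py[y]))
    return mi

def max_weight_spanning_tree(weights, n):
    """Kruskal's algorithm on a complete graph over nodes 0..n-1.

    weights: dict mapping (i, j) with i < j to an edge weight.
    Returns the list of edges in a maximum weight spanning tree.
    """
    parent = list(range(n))  # union-find forest

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    tree = []
    for (i, j), _ in sorted(weights.items(), key=lambda kv: -kv[1]):
        ri, rj = find(i), find(j)
        if ri != rj:          # adding the edge does not create a cycle
            parent[ri] = rj
            tree.append((i, j))
    return tree

def chow_liu_tree(samples):
    """samples: list of tuples, one tuple of discrete values per observation."""
    n = len(samples[0])
    cols = list(zip(*samples))  # one column of samples per variable
    weights = {(i, j): empirical_mutual_information(cols[i], cols[j])
               for i, j in itertools.combinations(range(n), 2)}
    return max_weight_spanning_tree(weights, n)
```

The paper's cost-aware variant would adjust the edge weights (or constrain the tree) using the message-passing cost structure of the chosen BP implementation; this sketch shows only the spanning-tree core.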