Exact inference in Bayesian networks with nodes having a large parent set is not tractable using standard techniques such as the junction tree method or variable elimination. However, in many applications the conditional probability tables of these nodes have a certain local structure that can be exploited to make exact inference tractable. In this paper we combine the CP tensor decomposition of probability tables with probabilistic inference using weighted model counting. The motivation for this combination is to exploit not only the local structure of some conditional probability tables but also other structural information potentially present in the Bayesian network, such as determinism or context-specific independence. We illustrate the proposed combination on BN2T networks – two-layered Bayesian networks with conditional probability tables representing noisy threshold models.
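To give a concrete sense of what a CP decomposition of a conditional probability table looks like, below is a minimal Python/NumPy sketch for a noisy-OR node with two parents (noisy-OR is a simple relative of the noisy threshold models studied in the paper, not the paper's exact construction; the parameter values q1, q2 are hypothetical). The CPT, viewed as a 3-way tensor, is written exactly as a sum of two rank-1 terms, so the number of CP components stays constant while a full table over n parents would have 2^(n+1) entries.

```python
import numpy as np

# Hypothetical noisy-OR parameters: q_i is the probability that an
# active cause X_i = 1 fails to turn the effect Y on.
q1, q2 = 0.3, 0.2

# Full CPT as a 3-way tensor T[y, x1, x2] = P(Y = y | X1 = x1, X2 = x2).
T = np.empty((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        p_y0 = (q1 ** x1) * (q2 ** x2)   # Y stays off only if every active cause fails
        T[0, x1, x2] = p_y0
        T[1, x1, x2] = 1.0 - p_y0

# Rank-2 CP decomposition: T = a (x) u1 (x) u2  +  b (x) 1 (x) 1,
# where (x) denotes the outer product of the per-variable factor vectors.
a = np.array([1.0, -1.0])       # factor over Y
b = np.array([0.0, 1.0])        # factor over Y
u1 = np.array([1.0, q1])        # factor over X1: encodes q1 ** x1
u2 = np.array([1.0, q2])        # factor over X2: encodes q2 ** x2
ones = np.ones(2)

T_cp = (np.einsum('i,j,k->ijk', a, u1, u2)
        + np.einsum('i,j,k->ijk', b, ones, ones))

assert np.allclose(T, T_cp)     # the two rank-1 terms reproduce the CPT exactly
```

In an inference scheme such as weighted model counting, the rank-1 factors (rather than the full table) can be encoded, which is what keeps nodes with many parents tractable.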