
Conditioning graphs: practical structures for inference in Bayesian networks

dc.contributor.advisor: Horsch, Michael C.
dc.contributor.committeeMember: Santos, Eugene Jr.
dc.contributor.committeeMember: Neufeld, Eric
dc.contributor.committeeMember: Keil, J. Mark
dc.contributor.committeeMember: Grassmann, Winfried K.
dc.contributor.committeeMember: Bickis, Mikelis G.
dc.creator: Grant, Kevin John
dc.date: 2007
dc.description.abstract: Probability is a useful tool for reasoning under uncertainty. Bayesian networks offer a compact representation of a probabilistic problem, exploiting independence amongst variables to factorize the joint probability into much smaller local probability distributions.

The standard approach to probabilistic inference in Bayesian networks is to compile the graph into a join tree and perform computation over this secondary structure. While join trees are among the most time-efficient methods of inference in Bayesian networks, they are not always appropriate for certain applications. The memory requirements of a join tree can be prohibitively large, and the algorithms for computing over join trees are large and involved, making them difficult to port to other systems or to be understood by general programmers without Bayesian network expertise.

This thesis proposes a different method for probabilistic inference in Bayesian networks. We present a data structure called a conditioning graph, which is a runtime representation of Bayesian network inference. The structure mitigates many of the problems of join-tree inference. For example, conditioning graphs require much less space to store and compute over. The algorithm for calculating probabilities from a conditioning graph is small and basic, making it portable to virtually any architecture. Moreover, the details of Bayesian network inference are compiled away during the construction of the conditioning graph, leaving an intuitive structure that is easy to understand and implement without any Bayesian network expertise.

In addition to the conditioning graph architecture, we present several improvements to the model that maintain its small and simple style while reducing the runtime required to compute over it.
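The abstract's claim that the query algorithm is "small and basic" can be illustrated with a minimal sketch of recursive conditioning over an elimination-tree-like structure. The `Node` layout, field names, and `query` function below are illustrative assumptions, not the thesis's actual data structure: internal nodes condition on (sum over) a variable and multiply their children's results; leaves look up an entry in a local conditional probability table.

```python
# Minimal sketch (assumed layout, not the thesis's exact structure):
# internal nodes sum over their variable and multiply child results;
# leaves hold a conditional probability table (CPT).

class Node:
    def __init__(self, var=None, domain=(), children=(), cpt=None, cpt_vars=()):
        self.var = var            # variable conditioned on here (None at a leaf)
        self.domain = domain      # values the variable can take
        self.children = children  # subtrees whose results are multiplied
        self.cpt = cpt            # leaf only: dict from value tuples to probabilities
        self.cpt_vars = cpt_vars  # variables indexing the leaf's CPT

def query(node, context):
    """Probability of the evidence in `context` (dict var -> value) under this subtree."""
    if node.cpt is not None:              # leaf: look up the local distribution
        return node.cpt[tuple(context[v] for v in node.cpt_vars)]
    if node.var in context:               # variable fixed by evidence: just recurse
        p = 1.0
        for child in node.children:
            p *= query(child, context)
        return p
    total = 0.0                           # otherwise condition: sum over its values
    for value in node.domain:
        context[node.var] = value
        p = 1.0
        for child in node.children:
            p *= query(child, context)
        total += p
    del context[node.var]                 # restore the context on the way out
    return total

# Tiny example network A -> B (numbers are arbitrary illustrations):
leaf_a = Node(cpt={(0,): 0.6, (1,): 0.4}, cpt_vars=('A',))
leaf_b = Node(cpt={(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8},
              cpt_vars=('A', 'B'))
node_b = Node(var='B', domain=(0, 1), children=(leaf_b,))
root = Node(var='A', domain=(0, 1), children=(leaf_a, node_b))
```

With no evidence the query sums to 1.0, and `query(root, {'B': 1})` returns P(B=1) = 0.6·0.1 + 0.4·0.8 = 0.38, so the entire inference engine is a single short recursive function.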
We present two heuristics for choosing variable orderings that result in shallower elimination trees, reducing the overall complexity of computing over conditioning graphs. We also demonstrate several compile-time and runtime extensions to the algorithm that can produce substantial speedup while adding only a small space constant to the implementation. We show how to cache intermediate values in conditioning graphs during probabilistic computation, which allows conditioning graphs to perform at the same speed as standard methods by avoiding duplicate computation, at the price of more memory; these methods also conform to the basic style of the original algorithm. In addition, we demonstrate a novel technique for reducing the amount of memory required for caching.

We demonstrate empirically the compactness, portability, and ease of use of conditioning graphs. We also show that the optimizations of conditioning graphs allow competitive behaviour with standard methods in many circumstances, while still preserving their small and simple style. Finally, we show that the memory required under caching can be quite modest, meaning that conditioning graphs can be competitive with standard methods in terms of time while using a fraction of the memory.
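The caching idea described above — avoiding duplicate computation by storing intermediate subtree values — can be sketched as memoization keyed on the instantiation of the ancestor variables a subtree depends on. This is a self-contained illustrative sketch under that assumption; the `cache_vars` bookkeeping stands in for the thesis's actual cache-domain machinery, and a fresh cache is assumed per query.

```python
# Sketch of cached recursive computation over an elimination tree.
# `cache_vars` (an assumed field) lists the ancestor variables a subtree's
# value depends on; subtree results are memoized keyed on their values.

class Node:
    def __init__(self, var=None, domain=(), children=(), cpt=None,
                 cpt_vars=(), cache_vars=()):
        self.var = var                # variable conditioned on here (None at a leaf)
        self.domain = domain
        self.children = children
        self.cpt = cpt                # leaf only: dict from value tuples to probabilities
        self.cpt_vars = cpt_vars
        self.cache_vars = cache_vars  # ancestor variables this subtree depends on

def query(node, context, cache):
    """Like plain recursive conditioning, but memoizes subtree results."""
    if node.cpt is not None:
        return node.cpt[tuple(context[v] for v in node.cpt_vars)]
    key = (id(node), tuple(context.get(v) for v in node.cache_vars))
    if key in cache:                  # duplicate subcomputation: reuse the stored value
        return cache[key]
    fixed = node.var in context       # evidence fixes the variable to one value
    values = [context[node.var]] if fixed else node.domain
    total = 0.0
    for value in values:
        context[node.var] = value
        p = 1.0
        for child in node.children:
            p *= query(child, context, cache)
        total += p
    if not fixed:
        del context[node.var]
    cache[key] = total                # trade memory for avoided recomputation
    return total

# Same tiny A -> B example as an illustration:
leaf_a = Node(cpt={(0,): 0.6, (1,): 0.4}, cpt_vars=('A',))
leaf_b = Node(cpt={(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8},
              cpt_vars=('A', 'B'))
node_b = Node(var='B', domain=(0, 1), children=(leaf_b,), cache_vars=('A',))
root = Node(var='A', domain=(0, 1), children=(leaf_a, node_b), cache_vars=())
```

In a larger network, a subtree reached under the same instantiation of its `cache_vars` is computed once and reused, which is the time-for-memory trade the abstract describes; bounding which nodes get cache entries is what limits the memory cost.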
dc.subject: Graphical Models
dc.subject: Recursive Conditioning
dc.subject: Bayesian Networks
dc.title: Conditioning graphs: practical structures for inference in Bayesian networks
dc.type.material: text Science Science University of Saskatchewan Doctor of Philosophy (Ph.D.)

