Conditioning graphs: practical structures for inference in Bayesian networks
Grant, Kevin John
Probability is a useful tool for reasoning under uncertainty. Bayesian networks offer a compact representation of a probabilistic problem, exploiting independence amongst variables to factorize the joint probability into much smaller local probability distributions. The standard approach to probabilistic inference in Bayesian networks is to compile the graph into a jointree and perform computation over this secondary structure. While jointrees are among the most time-efficient methods of inference in Bayesian networks, they are not always appropriate for certain applications. The memory requirements of a jointree can be prohibitively large, and the algorithms for computing over jointrees are large and involved, making them difficult to port to other systems or to understand by general programmers without Bayesian network expertise. This thesis proposes a different method for probabilistic inference in Bayesian networks. We present a data structure called a conditioning graph, which is a runtime representation of Bayesian network inference. The structure mitigates many of the problems of jointree inference. For example, conditioning graphs require much less space to store and compute over. The algorithm for calculating probabilities from a conditioning graph is small and basic, making it portable to virtually any architecture. And the details of Bayesian network inference are compiled away during the construction of the conditioning graph, leaving an intuitive structure that is easy to understand and implement without any Bayesian network expertise. In addition to the conditioning graph architecture, we present several improvements to the model that maintain its small and simple style while reducing the runtime required for computing over it. We present two heuristics for choosing variable orderings that result in shallower elimination trees, reducing the overall complexity of computing over conditioning graphs.
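The factorization mentioned above can be sketched with a toy example. The following is a minimal illustration (not from the thesis), assuming a hypothetical chain network A → B → C over binary variables, where the joint factors as P(A, B, C) = P(A) · P(B | A) · P(C | B), so each local distribution stays small:

```python
# Hypothetical CPTs over binary variables (0/1); the names and numbers
# here are illustrative assumptions, not taken from the thesis.
p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}  # p_b_given_a[a][b]
p_c_given_b = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}  # p_c_given_b[b][c]

def joint(a, b, c):
    """Joint probability computed from the factored form."""
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

def marginal_c(c):
    """P(C = c) by summing the factored joint over A and B."""
    return sum(joint(a, b, c) for a in (0, 1) for b in (0, 1))

print(marginal_c(0))  # -> 0.65
```

Storing three small tables instead of one eight-entry joint is the space saving that factorization buys; inference methods such as jointrees and conditioning graphs differ in how they organize the sums and products over these factors.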
We also demonstrate several compile-time and runtime extensions to the algorithm that can produce substantial speedup while adding only a small space constant to the implementation. We also show how to cache intermediate values in conditioning graphs during probabilistic computation, which allows conditioning graphs to perform at the same speed as standard methods by avoiding duplicate computation, at the price of more memory. The methods presented conform to the basic style of the original algorithm. We demonstrate a novel technique for reducing the amount of memory required for caching. We demonstrate empirically the compactness, portability, and ease of use of conditioning graphs. We also show that the optimizations of conditioning graphs allow competitive behaviour with standard methods in many circumstances, while preserving their small and simple style. Finally, we show that the memory required under caching can be quite modest, meaning that conditioning graphs can be competitive with standard methods in terms of time while using a fraction of the memory.
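The time/memory trade-off of caching intermediate values can be illustrated with a hedged sketch. This is not the thesis's caching scheme; it assumes a hypothetical chain network X0 → X1 → … with one shared transition table, and uses plain memoization to show how storing each sub-result once removes duplicate computation:

```python
from functools import lru_cache

# Assumed chain network of binary variables with the same CPT at each step;
# the tables below are illustrative, not from the thesis.
P_X0 = {0: 0.5, 1: 0.5}
P_NEXT = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}  # P_NEXT[prev][cur]

@lru_cache(maxsize=None)
def marginal(i, x):
    """P(X_i = x) by recursive summation; the cache stores each (i, x) once."""
    if i == 0:
        return P_X0[x]
    return sum(marginal(i - 1, prev) * P_NEXT[prev][x] for prev in (0, 1))

# Without the cache, marginal(i, x) branches into two recursive calls,
# recomputing the same sub-marginals about 2^i times down a deep chain;
# with the cache, each (i, x) pair is computed exactly once.
print(round(marginal(50, 0) + marginal(50, 1), 6))  # -> 1.0
```

The cache here costs O(n) stored entries for an n-variable chain, which mirrors the abstract's point: a modest amount of memory can eliminate duplicate work entirely.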
Degree: Doctor of Philosophy (Ph.D.)
Supervisor: Horsch, Michael C.
Committee: Santos, Eugene Jr.; Neufeld, Eric; Keil, J. Mark; Grassmann, Winfried K.; Bickis, Mikelis G.
Copyright Date: January 2007