Parallelizing A Convergent Approximate Inference Method

Tech Report Number
576


Abstract

Probabilistic inference in graphical models is a prevalent task in statistics and artificial intelligence. The ability to perform this inference task efficiently is critical in large-scale applications. Advances in parallel computing suggest that dramatic speedups can be achieved by appropriately mapping current inference algorithms onto parallel frameworks. Parallel exact inference methods still suffer from worst-case exponential complexity, while approximate inference methods have been parallelized with good speedup. In this report, we focus on a variant of the Belief Propagation algorithm that has better convergence properties and is provably convergent under certain conditions. We show that this method is amenable to coarse-grained parallelization and propose techniques to parallelize it optimally without sacrificing convergence. Experiments on a shared-memory system demonstrate that near-ideal speedup is achieved with reasonable scalability. This report is based on a paper by Ming Su, submitted to UAI 2010.
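
The parallelization idea can be made concrete with a small illustration. The sketch below is not the report's algorithm (which is a provably convergent BP variant); it shows only the generic ingredient that convergence-preserving parallel schemes build on: synchronous, double-buffered sum-product message updates on a toy binary chain MRF, with the edge sweep parallelized via OpenMP. All sizes, potentials, and names here are invented for illustration.

    // Minimal sketch (hypothetical, not the report's method): synchronous,
    // double-buffered belief propagation on a binary chain MRF, with the
    // per-iteration edge sweep parallelized over OpenMP threads.
    #include <omp.h>
    #include <algorithm>
    #include <array>
    #include <cmath>
    #include <cstdio>
    #include <vector>

    int main() {
        const int N = 1000;        // chain length (arbitrary for this sketch)
        const int K = 2;           // binary variables
        const double tol = 1e-8;

        // Unary potentials phi[i][k]; weak evidence at node 0.
        std::vector<std::array<double, 2>> phi(N, {1.0, 1.0});
        phi[0] = {2.0, 0.5};
        // Shared attractive pairwise potential psi[k][k'].
        const double psi[2][2] = {{1.2, 0.8}, {0.8, 1.2}};

        // Directed edges of the chain: edge 2i is i -> i+1, edge 2i+1 is i+1 -> i.
        const int E = 2 * (N - 1);
        auto src = [](int e) { return (e % 2 == 0) ? e / 2 : e / 2 + 1; };
        auto dst = [](int e) { return (e % 2 == 0) ? e / 2 + 1 : e / 2; };

        // Double buffering: every thread reads only old_m and writes disjoint
        // entries of new_m, so the parallel sweep computes exactly the same
        // synchronous update as a sequential sweep would.
        std::vector<std::array<double, 2>> old_m(E, {0.5, 0.5}), new_m(E);

        double delta = 1.0;
        while (delta > tol) {
            delta = 0.0;
            #pragma omp parallel for reduction(max : delta)
            for (int e = 0; e < E; ++e) {
                const int i = src(e), j = dst(e);
                double out[2], z = 0.0;
                for (int kj = 0; kj < K; ++kj) {
                    out[kj] = 0.0;
                    for (int ki = 0; ki < K; ++ki) {
                        // Product of messages into i from neighbors other than j.
                        // (The O(E) scan is for brevity; real code would use a
                        // precomputed adjacency list.)
                        double in = 1.0;
                        for (int e2 = 0; e2 < E; ++e2)
                            if (dst(e2) == i && src(e2) != j) in *= old_m[e2][ki];
                        out[kj] += phi[i][ki] * psi[ki][kj] * in;
                    }
                    z += out[kj];
                }
                for (int kj = 0; kj < K; ++kj) {
                    new_m[e][kj] = out[kj] / z;  // normalize for stability
                    delta = std::max(delta, std::fabs(new_m[e][kj] - old_m[e][kj]));
                }
            }
            old_m.swap(new_m);  // publish this iteration's messages
        }
        std::printf("converged; message 0->1 = (%.4f, %.4f)\n",
                    old_m[0][0], old_m[0][1]);
        return 0;
    }

Because each thread reads only the previous iteration's messages and writes disjoint entries of the new buffer, the parallel sweep computes exactly the same fixed-point iteration as its sequential counterpart, so the parallelism by itself does not alter convergence behavior. The report addresses the harder question of doing this at a coarse granularity for a convergent BP variant without sacrificing its convergence guarantee.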


File
tr576.pdf (302.75 KB)
Published Date