MITS: Enhanced Tree Search Reasoning for LLMs via Pointwise Mutual Information
Jiaxi Li, Yucheng Shi, Jin Lu, Ninghao Liu

Abstract
Tree search has become a representative framework for test-time reasoning with large language models (LLMs), exemplified by methods such as Tree-of-Thought and Monte Carlo Tree Search that explore multiple reasoning paths. However, it remains difficult to provide instant and reliable quantitative assessments of intermediate reasoning step quality, and extensive path exploration is computationally costly. To address this, we propose Mutual Information Tree Search (MITS), a novel framework that guides reasoning with information-theoretic principles. MITS introduces an effective scoring function based on pointwise mutual information (PMI), which enables step-wise evaluation of reasoning paths and search tree expansion via beam search without expensive look-ahead simulations, achieving superior reasoning performance while maintaining computational efficiency. The framework is complemented by an entropy-based dynamic sampling strategy that adaptively allocates computational resources to uncertain reasoning steps where exploration is most beneficial. For final prediction, MITS employs a weighted voting scheme that combines PMI scores with prediction consensus. Through comprehensive experiments on diverse reasoning benchmarks, MITS consistently surpasses baseline methods, establishing a principled and efficient framework for LLM reasoning.
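To make the abstract's scoring and voting components more concrete, the sketch below illustrates (not the authors' implementation) how a PMI-based step score and a PMI-weighted vote over candidate answers might be wired together. It assumes PMI is estimated as the difference between the conditional and marginal log-probabilities an LLM assigns to a reasoning step, and that `weighted_vote` aggregates exponentiated path-level PMI scores per answer; both choices are assumptions for illustration only.

```python
import math


def pmi_score(logp_step_given_context: float, logp_step_marginal: float) -> float:
    """Pointwise mutual information between a candidate step and its context:
    PMI = log p(step | context) - log p(step).

    Both arguments are log-probabilities of the step text (e.g., summed
    token log-probs returned by an LLM when scoring a continuation).
    This estimator is an assumption; the paper may define PMI differently."""
    return logp_step_given_context - logp_step_marginal


def weighted_vote(candidates: list[tuple[str, float]]) -> str:
    """Combine prediction consensus with PMI scores: each candidate answer
    accumulates the exponentiated PMI scores of the reasoning paths that
    reach it, and the answer with the largest total is returned."""
    totals: dict[str, float] = {}
    for answer, path_pmi in candidates:
        totals[answer] = totals.get(answer, 0.0) + math.exp(path_pmi)
    return max(totals, key=totals.get)


# Toy usage: three sampled reasoning paths, two of which agree on "42".
# Consensus and PMI weight jointly decide the final prediction.
print(weighted_vote([("42", 1.3), ("42", 0.2), ("17", 1.9)]))
```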