"NeuroLogic A*esque Decoding: Constrained Text Generation With Lookahead Heuristics", 2021-12-16:
The dominant paradigm for neural text generation is left-to-right decoding from autoregressive language models. However, constrained or controllable generation under complex lexical constraints requires foresight to plan ahead for feasible future paths.
Drawing inspiration from the A* search algorithm, we propose NeuroLogic A*esque, a decoding algorithm that incorporates heuristic estimates of future cost. We develop lookahead heuristics that are efficient for large-scale language models (e.g., GPT-2), making our method a drop-in replacement for common techniques such as beam search and top-k sampling. To enable constrained generation, we build on NeuroLogic decoding (Lu et al., 2021), combining its flexibility in incorporating logical constraints with A*esque estimates of future constraint satisfaction.
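The core idea can be sketched in a toy setting: candidates are expanded as in ordinary beam search, but each candidate's selection score adds a heuristic bonus estimated by a cheap greedy rollout that checks whether a lexical constraint is still reachable. The bigram table, the bonus weight `lam`, and the rollout depth below are illustrative assumptions, not the paper's actual implementation (which scores lookaheads with a large LM).

```python
import math

# Toy next-token distribution standing in for a real LM
# (hypothetical data; a real system would query GPT-2 or similar).
BIGRAMS = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.3, "sat": 0.2},
    "a":   {"dog": 0.6, "cat": 0.4},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"ran": 0.6, "sat": 0.4},
    "sat": {"</s>": 1.0},
    "ran": {"</s>": 1.0},
}

def greedy_lookahead(token, depth):
    """Greedily roll out up to `depth` future tokens from `token`."""
    future = []
    for _ in range(depth):
        nxt = BIGRAMS.get(token)
        if not nxt:
            break
        token = max(nxt, key=nxt.get)
        future.append(token)
    return future

def decode(constraint, beam_size=2, lam=2.0, depth=3, max_len=6):
    """Beam search whose selection score adds an A*esque lookahead
    bonus when the lexical `constraint` is satisfied or reachable."""
    # Each beam entry: (selection score, true log-prob, token sequence).
    beams = [(0.0, 0.0, ["<s>"])]
    for _ in range(max_len):
        candidates = []
        for _, logp, seq in beams:
            last = seq[-1]
            if last == "</s>":
                # Finished hypotheses are ranked by true log-prob.
                candidates.append((logp, logp, seq))
                continue
            for tok, p in BIGRAMS.get(last, {}).items():
                new_seq = seq + [tok]
                new_logp = logp + math.log(p)
                # Heuristic: bonus if the constraint word is already in
                # the prefix or appears on a greedy rollout from here.
                future = greedy_lookahead(tok, depth)
                satisfied = constraint in new_seq or constraint in future
                score = new_logp + (lam if satisfied else 0.0)
                # Bonus steers selection only; true log-prob is kept apart.
                candidates.append((score, new_logp, new_seq))
        candidates.sort(key=lambda c: c[0], reverse=True)
        beams = candidates[:beam_size]
    return beams[0][2][1:]  # drop the <s> marker
```

With `constraint="cat"`, the lookahead bonus steers the beam toward prefixes from which "cat" remains reachable, rather than pruning them early on log-probability alone; keeping the true log-prob separate from the selection score avoids double-counting the bonus across steps.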
Our approach outperforms competitive baselines on 5 generation tasks, and achieves new state-of-the-art performance on table-to-text generation, constrained machine translation, and keyword-constrained generation. The improvements are particularly notable on tasks that require complex constraint satisfaction or in few-shot or zero-shot settings.
NeuroLogic A*esque illustrates the power of decoding for improving and enabling new capabilities of large-scale language models.