Software accompanying the paper Constraining Linear-chain CRFs to Regular Languages



Sean Papay, Roman Klinger, Sebastian Padó


In structured prediction, a major challenge for models is to represent the interdependencies within their output structures.  For the common case where outputs are structured as a sequence, linear-chain conditional random fields (CRFs) are a widely used model class which can learn local dependencies in output sequences.  However, the CRF's Markov assumption makes it impossible for these models to capture nonlocal dependencies, and standard CRFs are unable to respect nonlocal
constraints of the data (such as global arity constraints on output labels).  We present a generalization of CRFs that can enforce a broad class of constraints, including nonlocal ones, by specifying the space of possible output structures as a regular language L. The resulting regular-constrained CRF (RegCCRF) has the same formal properties as a standard CRF, but assigns zero probability to all label sequences not in L. Notably, RegCCRFs can incorporate their constraints during training, while related models only enforce constraints during decoding.
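The core idea of decoding under a regular-language constraint can be illustrated with a product construction: run Viterbi over pairs of (label, DFA state), and only accept paths whose DFA run ends in an accepting state. The sketch below is illustrative only, with an invented toy label set, DFA, and scores; it shows just the decoding side of the construction, whereas the RegCCRF described above also incorporates the constraint during training.

```python
import numpy as np

LABELS = ["O", "B", "I"]  # toy label set (not from the released code)

# Toy DFA accepting exactly the label sequences containing at least one "B":
# state 0 = no "B" seen yet, state 1 = "B" seen (accepting).
DFA_START = 0
DFA_ACCEPT = {1}

def dfa_step(state, label):
    return 1 if label == "B" else state

def constrained_viterbi(emissions, transitions):
    """Viterbi decoding over the product of the label lattice and DFA states.

    emissions: (T, L) array of per-position label scores
    transitions: (L, L) array of label-to-label transition scores
    Returns the highest-scoring label sequence whose DFA run accepts.
    """
    T, L = emissions.shape
    # score maps (label, dfa_state) -> best path score at the current position
    score = {}
    back = [dict() for _ in range(T)]
    for l in range(L):
        q = dfa_step(DFA_START, LABELS[l])
        score[(l, q)] = emissions[0, l]
    for t in range(1, T):
        new_score = {}
        for (lp, qp), s in score.items():
            for l in range(L):
                q = dfa_step(qp, LABELS[l])
                cand = s + transitions[lp, l] + emissions[t, l]
                if cand > new_score.get((l, q), -np.inf):
                    new_score[(l, q)] = cand
                    back[t][(l, q)] = (lp, qp)
        score = new_score
    # Only product states whose DFA component accepts are valid endpoints.
    finals = [(s, k) for k, s in score.items() if k[1] in DFA_ACCEPT]
    _, (l, q) = max(finals)
    path = [l]
    for t in range(T - 1, 0, -1):
        l, q = back[t][(l, q)]
        path.append(l)
    return [LABELS[i] for i in reversed(path)]

# Usage: the unconstrained optimum is "O O O", but that sequence is not in
# the language, so decoding instead places a "B" where it is cheapest.
emissions = np.array([[2.0, 0.0, 0.0],
                      [2.0, 1.0, 0.0],
                      [2.0, 0.0, 0.0]])
transitions = np.zeros((3, 3))
print(constrained_viterbi(emissions, transitions))  # → ['O', 'B', 'O']
```

Because invalid label sequences simply have no corresponding product state at the end of decoding, they receive zero probability, mirroring the formal property stated above.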


Sean Papay, Roman Klinger, Sebastian Padó: "Constraining Linear-chain CRFs to Regular Languages".

