Symbolic Mathematics

In a recent work (PolySimp, ICLR 2021 MathAI Workshop) with Navin Goyal and Vishesh Agarwal, we explored Transformers' abilities to perform multi-step reasoning in well-defined, purely symbolic tasks such as step-wise polynomial simplification. A polynomial can be written in a simple normal form: a sum of monomials ordered lexicographically. For a polynomial not in this normal form, a sequence of simplification steps is applied to reach the fully simplified (i.e., normal-form) polynomial. We propose a synthetic polynomial dataset generation algorithm that produces polynomials with unique proof steps.
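The normal form described above can be sketched in a few lines of Python. This is an illustrative representation only (the paper's dataset format may differ): a monomial is a (coefficient, exponent-tuple) pair, and normalization merges like terms and orders them lexicographically by exponent vector.

```python
from collections import defaultdict

def normalize(terms):
    """Combine like monomials and sort them lexicographically.

    `terms` is a list of (coefficient, exponent_tuple) pairs, e.g.
    3*x^2*y over variables (x, y) is represented as (3, (2, 1)).
    """
    combined = defaultdict(int)
    for coeff, exps in terms:      # step 1: merge duplicate monomials
        combined[exps] += coeff
    return sorted(                 # step 2: lexicographic ordering
        ((c, e) for e, c in combined.items() if c != 0),
        key=lambda t: t[1],
        reverse=True,
    )

# 2xy + x^2 + 3xy  ->  x^2 + 5xy
poly = [(2, (1, 1)), (1, (2, 0)), (3, (1, 1))]
print(normalize(poly))  # [(1, (2, 0)), (5, (1, 1))]
```

A step-wise simplification proof would apply such merge-and-reorder operations one at a time until the polynomial is in this canonical form.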

By varying coefficient configurations, input representation, and proof granularity, and through extensive hyper-parameter tuning, we observe that Transformers consistently struggle with numeric multiplication. We explore two ways to mitigate this: Curriculum Learning and a Symbolic Calculator approach (where numeric operations are offloaded to a calculator). Both approaches provide significant gains over the vanilla Transformer-based baseline.
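The calculator idea can be sketched as follows. This is a hedged illustration, not the paper's exact interface: assume the model emits a hypothetical placeholder token such as `<mul a b>` instead of computing the product itself, and an external calculator resolves it before the next proof step.

```python
import re

def resolve_calculator_calls(step):
    """Replace hypothetical '<mul a b>' tokens with their numeric result.

    The token format is an assumption made for illustration; the actual
    model-calculator interface in the paper may differ.
    """
    return re.sub(
        r"<mul (\d+) (\d+)>",
        lambda m: str(int(m.group(1)) * int(m.group(2))),
        step,
    )

print(resolve_calculator_calls("<mul 12 7>*x^2"))  # 84*x^2
```

The point of the offloading is that the Transformer only has to learn *when* a multiplication is needed, not how to carry out the arithmetic.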

Somak Aditya
Assistant Professor

My research interests include integrating knowledge and enabling higher-order reasoning in AI.

Publications

Analyzing the Nuances of Transformers' Polynomial Simplification Abilities. In ICLR 2021 MathAI Workshop (2021).
