Inductive Natural Language Rules from Natural Language Facts
Language Models as Inductive Reasoners
Inductive reasoning is a core component of human intelligence. However, formal logic languages pose systematic problems for inductive reasoning, such as an inability to handle raw inputs like natural language, sensitivity to mislabeled data, and an incapacity to handle ambiguous inputs. To this end, we propose a new task, inducing natural language rules from natural language facts, and create a dataset termed DEER containing 1.2k rule-fact pairs for the task, where both rules and facts are written in natural language. Moreover, we provide the first comprehensive analysis of how well pretrained language models can induce natural language "rules" from natural language facts. We also propose a new framework for this task, drawing insights from the philosophy literature, which surpasses baselines in both automatic and human evaluations, as shown in the experiment section.
Our proposed framework (CoLM) for the task of inductive reasoning with natural language representations.
The rule proposer is a generative model that, given the input facts and a desired rule template, generates a large number of rule candidates.
The deductive consistency evaluator, indiscriminate confirmation handler, generalization checker, and triviality detector are classification models that filter out improper rules according to four requirements that induced rules must satisfy in inductive reasoning.
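The generate-then-filter pipeline described in the caption can be sketched as follows. This is a minimal illustrative sketch only: the proposer and the four filter functions below are stand-ins for the LM-based components of CoLM, and all names, templates, and filtering logic are assumptions rather than the paper's actual models.

```python
# Hypothetical sketch of CoLM's generate-then-filter structure:
# a rule proposer produces candidates, then four filters prune them.

def rule_proposer(facts, template, n=4):
    """Stand-in for the generative rule proposer (an LM in the paper):
    here it simply instantiates the template with each observed fact."""
    return [template.format(x=fact) for fact in facts[:n]]

# Each filter below would be a trained classification model in CoLM;
# here they are trivial placeholders that illustrate the interface.

def deductive_consistency(rule, facts):
    # Would check that the rule, applied deductively, agrees with the facts.
    return "then" in rule

def indiscriminate_confirmation(rule, facts):
    # Would reject rules so weak that any observation "confirms" them.
    return len(rule.split()) > 3

def generalization(rule, facts):
    # Would check that the rule covers unseen cases, not just the input facts.
    return True

def triviality(rule, facts):
    # Rejects vacuous rules of the form "If X, then X."
    head, _, body = rule.partition(", then ")
    return head.removeprefix("If ").strip() != body.rstrip(".").strip()

FILTERS = [deductive_consistency, indiscriminate_confirmation,
           generalization, triviality]

def induce_rules(facts, template):
    """Propose candidate rules, then keep only those passing every filter."""
    candidates = rule_proposer(facts, template)
    return [r for r in candidates if all(f(r, facts) for f in FILTERS)]

facts = ["a robin can fly", "a sparrow can fly"]
rules = induce_rules(facts, "If an animal is a bird, then {x}.")
print(rules)
```

The design point the caption makes is separation of concerns: the proposer is free to over-generate candidates, and correctness is enforced afterwards by independent verifiers, each targeting one requirement of a good induced rule.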