A Retrospective on
Mutual Bootstrapping

Ellen Riloff, Rosie Jones

This retrospective article discusses the mutual bootstrapping technique for weakly supervised learning of extraction patterns and semantic lexicons from unstructured text, originally published at AAAI '99 (Riloff and Jones 1999). We present an overview of the mutual bootstrapping approach, describe related research that has followed from the original work, and discuss lessons learned about bootstrapped learning for natural language processing.

When we were invited to write a retrospective article about our 1999 AAAI conference paper on mutual bootstrapping (Riloff and Jones 1999), our first reaction was hesitation because, well, that algorithm seems old and clunky now. But upon reflection, we realized that this early work shaped a great deal of subsequent work on bootstrapped learning for natural language processing, both by ourselves and by others. So our second reaction was enthusiasm for the opportunity to think about the path from 1999 to 2018 and to share the lessons we've learned about bootstrapped learning along the way.

This article begins with a brief history of the related research that preceded and inspired the mutual bootstrapping work, to situate it in the context of its time. We then describe the general ideas and approach behind the mutual bootstrapping algorithm. Next, we survey several lines of subsequent research that share similar themes: multiview learning, bootstrapped lexicon induction, and bootstrapped pattern learning. Finally, we discuss some of the general lessons we have learned about bootstrapping techniques for natural language processing (NLP), to offer guidance to researchers and practitioners who may be interested in exploring these techniques in their own work.