Transformers, contextualism, and polysemy
Ergo: An Open Access Journal of Philosophy (forthcoming)

Abstract

The transformer architecture, introduced by Vaswani et al. (2017), is at the heart of the remarkable recent progress in the development of language models, including widely used chatbots such as ChatGPT and Claude. In this paper, I argue that we can extract from the way the transformer architecture works a theory of the relationship between context and meaning. I call this the transformer theory, and I argue that it is novel with regard to two related philosophical debates: the contextualism debate regarding the extent of context-sensitivity across natural language, and the polysemy debate regarding how polysemy should be captured within an account of word meaning.
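The core mechanism the abstract alludes to, self-attention, computes a representation of each word as a weighted mixture of the other words in its context, so the same word type receives different vectors in different sentences. The toy sketch below illustrates this context-sensitivity with random four-dimensional "embeddings" and a single attention step; the vocabulary, dimensions, and identity projections are illustrative assumptions, not anything from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # Simplified single-head self-attention with identity query/key/value
    # projections: each row of X is a toy word embedding.
    scores = X @ X.T / np.sqrt(X.shape[1])
    return softmax(scores, axis=-1) @ X

# Hypothetical toy vocabulary with random static embeddings.
rng = np.random.default_rng(0)
vocab = {w: rng.normal(size=4) for w in ["the", "river", "money", "bank"]}

sent1 = np.stack([vocab[w] for w in ["the", "river", "bank"]])
sent2 = np.stack([vocab[w] for w in ["the", "money", "bank"]])

# Contextualized vectors for the final token "bank" in each sentence.
bank_river = self_attention(sent1)[-1]
bank_money = self_attention(sent2)[-1]

# The same word type gets different vectors in different contexts.
print(np.allclose(bank_river, bank_money))  # False
```

Although drastically simplified (a real transformer adds learned projections, multiple heads, positional information, and many layers), the example captures the feature the paper builds on: the representation of "bank" is not fixed by the lexicon alone but modulated by its linguistic context.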

Author's Profile

Jumbly Grindrod
University of Reading
