Added by: Chris Blake-Turner, Contributed by: Cailin O'Connor
Abstract: Vague predicates, those that exhibit borderline cases, pose a persistent problem for philosophers and logicians. Although they are ubiquitous in natural language, when used in a logical context, vague predicates lead to contradiction. This paper addresses a question that is intimately related to this problem: given their inherent imprecision, why do vague predicates arise in the first place? I discuss a variation of the signaling game where the state space is treated as contiguous, i.e., endowed with a metric that captures a similarity relation over states. This added structure is manifested in payoffs that reward approximate coordination between sender and receiver as well as perfect coordination. I evolve these games using a variation of Herrnstein reinforcement learning that better reflects the generalizing learning strategies real-world actors use in situations where states of the world are similar. In these simulations, signaling can develop very quickly, and the signals are vague in much the way ordinary language predicates are vague: they each exclusively apply to certain items, but for some transition period both signals apply to varying degrees. Moreover, I show that under certain parameter values, in particular when state spaces are large and time is limited, learning generalization of this sort yields strategies with higher payoffs than standard Herrnstein reinforcement learning. These models may then help explain why the phenomenon of vagueness arises in natural language: the learning strategies that allow actors to quickly and effectively develop signaling conventions in contiguous state spaces make it unavoidable.
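The model described in the abstract can be illustrated with a minimal sketch: states on a line, a payoff that rewards approximate as well as perfect coordination, and reinforcement that generalizes to similar states. This is not the paper's exact model; the number of states and signals, the payoff tolerance, and the generalization decay used here are all illustrative assumptions.

```python
import random

random.seed(0)

N_STATES = 10    # states arranged on a line; similarity = metric distance
N_SIGNALS = 2
N_ACTS = N_STATES

def payoff(state, act, tolerance=3.0):
    """Reward approximate as well as perfect coordination (illustrative form)."""
    return max(0.0, 1.0 - abs(state - act) / tolerance)

def choose(weights):
    """Roulette-wheel choice proportional to accumulated weights (Herrnstein)."""
    total = sum(weights)
    r = random.uniform(0, total)
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(weights) - 1

def generalization(d, width=2, decay=0.5):
    """Reinforcement spills over to nearby states/acts, discounted by distance."""
    return decay ** d if d <= width else 0.0

# Urn weights for sender (state -> signal) and receiver (signal -> act),
# all initialized to 1 so there is no initial bias.
sender = [[1.0] * N_SIGNALS for _ in range(N_STATES)]
receiver = [[1.0] * N_ACTS for _ in range(N_SIGNALS)]

payoffs = []
for t in range(5000):
    state = random.randrange(N_STATES)
    signal = choose(sender[state])
    act = choose(receiver[signal])
    p = payoff(state, act)
    payoffs.append(p)
    # Generalized reinforcement: neighbors of the realized state (and act)
    # are also reinforced, discounted by similarity distance.
    for s in range(N_STATES):
        sender[s][signal] += p * generalization(abs(s - state))
    for a in range(N_ACTS):
        receiver[signal][a] += p * generalization(abs(a - act))

early = sum(payoffs[:500]) / 500
late = sum(payoffs[-500:]) / 500
```

In runs of this sketch, each signal comes to be used mainly for one region of the state space, with a boundary region where both signals are sent with intermediate probability, mirroring the vague "transition period" the abstract describes.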
O'Connor, Cailin. 2013. "The Evolution of Vagueness." Erkenntnis (S4): 1–21.