This page collects work by Mark C. Wilson and coauthors on collective decision-making without monetary transfers. This area includes topics such as single-winner voting rules and allocation (fair division) of resources. The motivation is to find "better" methods for making such decisions, where "better" is usually defined in terms of social welfare outcomes. It has been shown over the last several decades that every reasonable voting rule is susceptible to various "paradoxes" (undesirable behaviour, such as being strategically manipulable). It therefore seems sensible to quantify the extent to which such behaviour can occur, which brings in probability theory. In earlier papers we adopted this viewpoint, refining the methodology and (in my opinion) being more rigorous than some previous authors. More recently, we realized that since dictatorship rules out these paradoxes, minimizing their occurrence is not a good goal in itself. Dictatorships are undesirable because they typically lead to low social welfare, so we now focus on maximizing welfare. This raises some difficulties: interpersonal comparisons of utility can no longer be avoided, and different strategic behaviours lead to different answers. The space of voting rules is vast, and there is little reason to suspect that we have found all the good ones. Some papers here explore this space in more detail.
Publications
-
Distance rationalization of anonymous and homogeneous voting rules
(with Benjamin Hadjibeyli), submitted, 21pp. The concept of distance rationalizability of voting rules has been explored in recent years by several authors. All previous work has dealt with a definition in terms of preference profiles. However, most voting rules in common use are anonymous, and most are also homogeneous. In this case there is a much more succinct representation (using the voting simplex) of the rule. This representation has been widely used in the voting literature, but not in the context of distance rationalizability.
We first define distance rationalizability in this new framework and explain in detail the connection to the original definition. In doing so we unify, correct, and extend previous work. The simplex interpretation yields a natural connection to areas of continuous mathematics not seen before in the voting literature, namely Wasserstein spaces and Minkowski spaces. We prove some positive and some negative results about the decisiveness of distance rationalizable rules on the simplex. The positive results connect with the recently developed theory of hyperplane rules, while the negative ones exploit the fact that the $\ell^1$-norm is not strictly convex.
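The negative results above exploit a simple geometric fact about the $\ell^1$-norm. A minimal numerical illustration (not code from the paper): two distinct unit vectors can have a midpoint that is still a unit vector, which is exactly the failure of strict convexity.

```python
# The l^1 unit ball has flat faces: the midpoint of two distinct points on
# the unit sphere can remain on the sphere, so the norm is not strictly
# convex. Illustrative sketch only.

def l1(v):
    # l^1 norm: sum of absolute values of the coordinates
    return sum(abs(t) for t in v)

x = (1.0, 0.0)
y = (0.0, 1.0)
mid = tuple((a + b) / 2 for a, b in zip(x, y))

print(l1(x), l1(y), l1(mid))  # all 1.0: the midpoint stays on the unit sphere
```

By contrast, for a strictly convex norm such as $\ell^2$, the midpoint of two distinct unit vectors always has norm strictly less than 1.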
-
Iterated regret minimization in voting games
(with Miranda Emery), Proceedings of COMSOC 2014 (12pp). The game-theoretic solution concept Iterated Regret Minimization (IRM) was introduced recently by Halpern and Pass. We give the first application of IRM to simultaneous voting games. We study positional scoring rules in detail and give theoretical results demonstrating the bias of IRM toward sincere voting. We present comprehensive simulation results on the effect on social welfare of IRM compared to both sincere and optimal voting. The results fit into a broader research theme of the welfare consequences of strategic voting.
-
Asymptotics of the minimum manipulating coalition size for positional voting rules under IC behaviour by Geoffrey Pritchard and Mark C. Wilson.
We consider the problem of manipulation of elections using positional voting rules under Impartial Culture voter behaviour. We consider both the logical possibility of coalitional manipulation, and the number of voters that must be recruited to form a manipulating coalition. It is shown that the manipulation problem may be well approximated by a very simple linear program in two variables. This permits a comparative analysis of the asymptotic (large-population) manipulability of the various rules. It is seen that the manipulation resistance of positional rules with 5 or 6 (or more) candidates is quite different from the more commonly analyzed 3- and 4-candidate cases.
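The Impartial Culture (IC) model underlying this analysis is easy to simulate: each voter's ranking is drawn uniformly and independently from all orderings of the candidates. A hypothetical sketch (the function names and the choice of a 101-voter, 3-candidate election are illustrative, not from the paper):

```python
import random
from itertools import permutations
from collections import Counter

def random_ic_profile(n_voters, candidates="abc", rng=random):
    # IC behaviour: each ballot is an independent uniform draw from all
    # strict orders of the candidates.
    orders = list(permutations(candidates))
    return [rng.choice(orders) for _ in range(n_voters)]

def positional_scores(profile, weights):
    # weights[i] = points awarded to the candidate ranked in position i
    scores = Counter()
    for ranking in profile:
        for pos, cand in enumerate(ranking):
            scores[cand] += weights[pos]
    return scores

random.seed(1)
profile = random_ic_profile(101)
print(positional_scores(profile, (1, 0, 0)))  # plurality scores
print(positional_scores(profile, (2, 1, 0)))  # Borda scores
```

Repeated draws of such profiles are the basis for estimating how often a manipulating coalition of a given size exists under a given positional rule.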
-
Probability calculations under the IAC hypothesis by Mark C. Wilson and Geoffrey Pritchard.
We show how powerful algorithms recently developed for counting lattice points and computing volumes of convex polyhedra can be used to compute probabilities of a wide variety of events of interest in social choice theory. Several illustrative examples are given.
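Under IAC every voting situation (the multiset of ballots) is equally likely, so probabilities reduce to ratios of lattice-point counts in polyhedra. As a minimal sanity check of this viewpoint (an illustrative sketch, not the algorithms discussed in the paper), the total number of situations with n voters over the 6 strict orders of 3 candidates is the Ehrhart count of a dilated 5-simplex, C(n+5, 5):

```python
from math import comb

def count_situations(n, n_orders=6):
    # Brute-force lattice-point count of {x in Z^k : x >= 0, sum(x) = n},
    # i.e. the number of anonymous voting situations with n voters.
    def rec(remaining, slots):
        if slots == 1:
            return 1
        return sum(rec(remaining - k, slots - 1) for k in range(remaining + 1))
    return rec(n, n_orders)

n = 7
print(count_situations(n), comb(n + 5, 5))  # both 792
```

The specialized algorithms (e.g. for Ehrhart quasi-polynomials) replace this exponential enumeration with exact symbolic counts, which is what makes nontrivial IAC probabilities computable.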
-
Exact results on manipulability of positional voting rules by Geoffrey Pritchard and Mark C. Wilson.
We consider 3-candidate elections under a general scoring rule and derive precise conditions for a given voting situation to be strategically manipulable by a given coalition of voters. We present an algorithm that makes use of these conditions to compute the minimum size M of a manipulating coalition for a given voting situation.
The algorithm works for any voter preference model; here we present numerical results for IC and for IAC, for a selection of scoring rules, and for numbers of voters up to 150. A full description of the distribution of M is obtained, generalizing all previous work on the topic.
The results show interesting phenomena and suggest several conjectures. In particular, we see that rules "between plurality and Borda" behave very differently from those "between Borda and antiplurality".
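The parametrization behind phrases like "between plurality and Borda" can be made concrete: up to affine rescaling, a 3-candidate scoring rule is a vector (1, s, 0) with 0 ≤ s ≤ 1, where s = 0 is plurality, s = 1/2 is Borda, and s = 1 is antiplurality. A small sketch with a hypothetical voting situation (the numbers are illustrative, not data from the paper):

```python
from itertools import permutations

ORDERS = list(permutations("abc"))

def winner(situation, s):
    # situation: dict mapping each strict order to its number of voters;
    # scoring vector (1, s, 0); ties broken alphabetically.
    weights = (1.0, s, 0.0)
    scores = {c: 0.0 for c in "abc"}
    for order, count in situation.items():
        for pos, cand in enumerate(order):
            scores[cand] += weights[pos] * count
    return max(sorted(scores), key=scores.get)

situation = {o: 0 for o in ORDERS}
situation[("a", "b", "c")] = 4
situation[("b", "c", "a")] = 3
situation[("c", "b", "a")] = 2

print(winner(situation, 0.0))  # plurality: "a" (most first places)
print(winner(situation, 0.5))  # Borda: "b" (broad second-place support)
```

Even this tiny example shows how the winner, and hence the incentives of a would-be manipulating coalition, can change as s moves along the plurality-to-antiplurality segment.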
Software
Data sources
Last updated: 2015-09-18