Hi, I'm Gabriel Ziegler, a Lecturer in Economics at the University of Edinburgh.
Research themes: Information Economics, Game Theory
Current Contact Details:
University of Edinburgh
School of Economics
31 Buccleuch Place, Office 2.14
Edinburgh, EH8 9JT, UK
Email: ziegler{AT}ed.ac.uk
Working Papers
Binary Classifiers as Dilations
Abstract: The performance of binary classifiers is often measured against a reference classifier when the true class is not readily observable. Reference classifiers are themselves commonly imperfect; a prominent example is diagnostic tests whose performance is measured against an established reference test. An imperfect reference generally leads to partial identification of the relevant performance measures, which induces ambiguity in the interpretation of a predicted class. Seidenfeld and Wasserman (1993) define dilation as an extreme notion of non-informativeness when information is ambiguous. This paper characterizes precise conditions under which dilation arises for binary classifiers with an imperfect reference and develops procedures for statistical inference based on methods for subvector inference in moment inequality models. We illustrate the usefulness of our approach through two case studies: radiologists' assessments of chest CT scans for COVID-19 in Ai et al. (2020), and the use of artificial intelligence to detect COVID-19 in Mei et al. (2020).
Working Paper: Link
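For readers curious how dilation can be checked numerically, here is a small illustrative sketch (hypothetical, not the paper's inference procedure): given a finite credal set of joint distributions over a binary state and a binary signal, it tests whether every signal realization strictly widens the probability interval for the event {state = 1}, using the classic two-coin example in which the signal coin's dependence on the state coin is completely unknown.

```python
def dilates(credal_set, event_state=1):
    """Check whether every signal realization dilates the event {state == event_state}.

    credal_set: list of joint pmfs, each a dict mapping (state, signal) -> prob.
    Dilation: for *every* signal y, the posterior probability interval for the
    event strictly contains the prior probability interval.
    """
    priors = [sum(p for (s, _), p in P.items() if s == event_state)
              for P in credal_set]
    lo, hi = min(priors), max(priors)
    signals = {y for P in credal_set for (_, y) in P}
    for y in signals:
        posteriors = []
        for P in credal_set:
            p_y = sum(p for (_, yy), p in P.items() if yy == y)
            if p_y > 0:
                p_joint = sum(p for (s, yy), p in P.items()
                              if s == event_state and yy == y)
                posteriors.append(p_joint / p_y)
        if not (min(posteriors) < lo and max(posteriors) > hi):
            return False
    return True

# Two-coin example: the state coin is fair, but the signal coin's dependence
# on it is unknown (perfect correlation vs. perfect anti-correlation).
P_corr = {(1, 1): 0.5, (0, 0): 0.5, (1, 0): 0.0, (0, 1): 0.0}
P_anti = {(1, 0): 0.5, (0, 1): 0.5, (1, 1): 0.0, (0, 0): 0.0}
result = dilates([P_corr, P_anti])   # True: the signal dilates the prior
```

Here the prior probability of state 1 is exactly 0.5 under every measure, yet after either signal realization the posterior interval is all of [0, 1]: observing the signal makes beliefs strictly less precise.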
Abstract: The Efficiency-Adjusted Deferred Acceptance (EADA) mechanism corrects the Pareto-inefficiency of the celebrated Deferred Acceptance (DA) algorithm by assigning every student to a weakly more preferred school. However, it remains uncertain which and how many students do not see an improvement in their DA placement under EADA. We show that, despite its advantages, EADA does not benefit students assigned to their worst-ranked schools or those who remain unmatched under DA. Additionally, it limits the placement improvement of marginalized students, thereby maintaining school segregation. The placement of worst-off students under EADA can be exceptionally poor, even though significantly more egalitarian allocations are possible. Lastly, we provide a bound on the expected number of unimproved students using a random market approach valid for small markets. Our findings shed light on why EADA fails to mitigate the inequality produced by DA in empirical evaluations.
Working Paper: arXiv
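The Deferred Acceptance mechanism that EADA refines is easy to state in code. The sketch below is a minimal student-proposing DA with unit school capacities (names and the example are illustrative, not taken from the paper); the example is the textbook case in which DA's outcome is Pareto-inefficient for students.

```python
def deferred_acceptance(student_prefs, school_prefs):
    """Student-proposing Deferred Acceptance with unit school capacities.

    student_prefs: dict student -> list of schools, most preferred first.
    school_prefs:  dict school  -> priority order over students, highest first.
    Returns dict student -> school for matched students.
    """
    rank = {sch: {st: i for i, st in enumerate(order)}
            for sch, order in school_prefs.items()}
    next_proposal = {st: 0 for st in student_prefs}
    held = {sch: None for sch in school_prefs}      # tentatively held student
    free = list(student_prefs)
    while free:
        st = free.pop()
        if next_proposal[st] >= len(student_prefs[st]):
            continue                                 # preference list exhausted
        sch = student_prefs[st][next_proposal[st]]
        next_proposal[st] += 1
        incumbent = held[sch]
        if incumbent is None:
            held[sch] = st
        elif rank[sch][st] < rank[sch][incumbent]:   # school prefers newcomer
            held[sch] = st
            free.append(incumbent)
        else:
            free.append(st)                          # rejected, tries next school
    return {st: sch for sch, st in held.items() if st is not None}

# Classic example of DA's Pareto inefficiency: i1 and i2 would both prefer
# to swap their DA assignments, the kind of loss EADA is designed to remove.
students = {"i1": ["B", "A", "C"], "i2": ["A", "B", "C"], "i3": ["A", "B", "C"]}
schools = {"A": ["i1", "i3", "i2"], "B": ["i2", "i1", "i3"], "C": ["i1", "i2", "i3"]}
matching = deferred_acceptance(students, schools)   # {'i1': 'A', 'i2': 'B', 'i3': 'C'}
```

In the example, i1 ends at A and i2 at B even though each ranks the other's school first, so swapping would make both strictly better off.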
Adversarial Bilateral Information Design
Abstract: Information provision is a significant component of business-to-business interaction, and it is often conducted bilaterally. With multiple receivers, this precludes commitment to a grand information structure. Consequently, in a strategic situation, each receiver needs to reason about what information the other receivers receive. Because the information provider does not know this reasoning process, a robustness requirement arises: the provider seeks an information structure that performs well no matter how the receivers actually reason. In this paper, I provide a general method for studying how to optimally provide information under these constraints. The main result is a representation theorem, which makes the problem tractable by reformulating it as a linear program in belief space. Furthermore, I provide novel bounds on the dependence among receivers' beliefs, which yield even more tractability in some special cases.
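The belief-space linear-programming idea has a well-known single-receiver benchmark: concavification in the sense of Kamenica and Gentzkow. The sketch below illustrates only that simpler benchmark, not the paper's multi-receiver program; with two states, an optimal splitting of the prior uses at most two posteriors, so the program can be solved by enumerating pairs from a finite grid of candidate posteriors (all names and numbers are hypothetical).

```python
def concavify(prior, posteriors, v):
    """Best sender value from splitting `prior` into a lottery over `posteriors`
    (Bayes plausibility: the chosen posteriors must average back to the prior).
    With two states, at most two posteriors are needed, so enumerate pairs.
    """
    best = v(prior)                               # providing no information
    for lo in posteriors:
        for hi in posteriors:
            if lo <= prior <= hi and lo < hi:
                t = (hi - prior) / (hi - lo)      # weight placed on `lo`
                best = max(best, t * v(lo) + (1 - t) * v(hi))
    return best

# Hypothetical persuasion problem: the receiver acts (value 1 to the sender)
# only if the posterior on the "good" state is at least 0.5; the prior is 0.3.
value = lambda mu: 1.0 if mu >= 0.5 else 0.0
grid = [i / 100 for i in range(101)]
optimum = concavify(0.3, grid, value)    # 0.6: split the prior onto 0 and 0.5
```

Without information the sender gets 0; pooling the prior 0.3 onto posteriors 0 and 0.5 puts weight 0.6 on the persuasive posterior, which is the LP's optimum here.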
Publications
Optimism and Pessimism in Strategic Interactions under Ignorance (with Pierfrancesco Guarino)
Games and Economic Behavior 136 (2022), pp. 559–585
Abstract: We study players interacting under the veil of ignorance, who have—coarse—beliefs represented as subsets of opponents' actions. We analyze when these players follow maxmin or maxmax decision criteria, which we identify with pessimistic or optimistic attitudes, respectively. Explicitly formalizing these attitudes and how players reason interactively under ignorance, we characterize the behavioral implications related to common belief in these events: while optimism is related to Point Rationalizability, a new algorithm—Wald Rationalizability—captures pessimism. Our characterizations allow us to uncover novel results: (i) regarding optimism, we relate it to wishful thinking à la Yildiz (2007) and prove that dropping the (implicit) "belief-implies-truth" assumption reverses an existence failure described therein; (ii) we shed light on the notion of rationality in ordinal games; (iii) we clarify the conceptual underpinnings behind a discontinuity in Rationalizability hinted at in the analysis of Weinstein (2016).
Paper: Published version or arXiv preprint
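The maxmin and maxmax criteria under ignorance reduce to simple arithmetic once a player's coarse belief is just a set of opponent actions. A minimal illustrative sketch (the game and names are hypothetical, not from the paper):

```python
def optimal_actions(payoff, conjecture, criterion):
    """Actions that are optimal under ignorance, given a coarse belief.

    payoff:     dict (own_action, opp_action) -> payoff
    conjecture: set of opponent actions the player deems possible
    criterion:  min for the pessimist (maxmin, Wald) or max for the optimist
    """
    own_actions = {a for a, _ in payoff}
    value = {a: criterion(payoff[a, b] for b in conjecture) for a in own_actions}
    top = max(value.values())
    return {a for a, u in value.items() if u == top}

# Hypothetical 2x2 game, row player's payoffs, full ignorance about the column.
payoff = {("safe", "L"): 2, ("safe", "R"): 2,
          ("risky", "L"): 4, ("risky", "R"): 0}
pessimist = optimal_actions(payoff, {"L", "R"}, min)   # {'safe'}
optimist = optimal_actions(payoff, {"L", "R"}, max)    # {'risky'}
```

The pessimist guards against the worst case and plays safe; the optimist bets on the best case and takes the risk, mirroring the Wald versus point-belief distinction in the paper.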
Informational Robustness of Common Belief in Rationality
Games and Economic Behavior 132 (2022), pp. 592–597
Abstract: In this note, I explore the implications of informational robustness under the assumption of common belief in rationality, that is, predictions for incomplete-information games that are valid across all possible information structures. First, I address this question from a global perspective and then generalize the analysis to allow for localized informational robustness.
Paper: Published version or arXiv preprint
Strategic Cautiousness as an Expression of Robustness to Ambiguity (with Peio Zuazo-Garin)
Games and Economic Behavior 119 (2020), pp. 197–215
Abstract: Economic predictions often hinge on two intuitive premises: agents rule out the possibility of others choosing unreasonable strategies ('strategic reasoning'), and prefer strategies that hedge against unexpected behavior ('cautiousness'). These two premises conflict, and this conflict undermines the compatibility of usual economic predictions with reasoning-based foundations. This paper proposes a new take on this classical tension by interpreting cautiousness as robustness to ambiguity. We formalize this via a model of incomplete preferences, where (i) each player's strategic uncertainty is represented by a possibly non-singleton set of beliefs and (ii) a rational player chooses a strategy that is a best-reply to every belief in this set. We show that the interplay between these two features precludes the conflict between strategic reasoning and cautiousness and therefore solves the inclusion-exclusion problem raised by Samuelson (1992). Notably, our approach provides a simple foundation for the iterated elimination of weakly dominated strategies.
Paper: Published version or Preprint
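The elimination procedure that the paper provides a foundation for can be sketched mechanically. The version below is a simplification for illustration only: it eliminates pure strategies weakly dominated by another pure strategy, whereas the full procedure allows mixed dominators; the two-player game used is hypothetical.

```python
def iterated_weak_dominance(payoffs):
    """Iteratively eliminate pure strategies weakly dominated by another pure
    strategy in a two-player game (a simplification of the full procedure,
    which also allows mixed-strategy dominators).

    payoffs: dict player -> dict (own_strategy, other_strategy) -> payoff.
    Returns dict player -> surviving strategy set.
    """
    strat = {p: {a for a, _ in table} for p, table in payoffs.items()}
    players = list(payoffs)
    changed = True
    while changed:
        changed = False
        for p in players:
            other = players[1 - players.index(p)]
            table = payoffs[p]
            dominated = set()
            for a in strat[p]:
                for b in strat[p] - {a} - dominated:
                    if (all(table[b, c] >= table[a, c] for c in strat[other])
                            and any(table[b, c] > table[a, c]
                                    for c in strat[other])):
                        dominated.add(a)        # b weakly dominates a
                        break
            if dominated:
                strat[p] -= dominated
                changed = True
    return strat

# Hypothetical game: T weakly dominates B for the row player; once B is gone,
# L strictly dominates R for the column player.
payoffs = {
    "row": {("T", "L"): 1, ("T", "R"): 1, ("B", "L"): 1, ("B", "R"): 0},
    "col": {("L", "T"): 1, ("L", "B"): 0, ("R", "T"): 0, ("R", "B"): 1},
}
result = iterated_weak_dominance(payoffs)   # {'row': {'T'}, 'col': {'L'}}
```

Note the order-dependence that makes weak dominance delicate: which strategies survive can depend on the elimination sequence, which is part of why a reasoning-based foundation is non-trivial.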
Older and Dormant Working Papers & Miscellany
Abstract: We study the behavioral implications of Rationality and Common Strong Belief in Rationality (RCSBR) with contextual assumptions allowing players to entertain misaligned beliefs, i.e., players can hold beliefs concerning their opponents’ beliefs where there is no opponent holding those very beliefs. Taking the analysts’ perspective, we distinguish the infinite hierarchies of beliefs actually held by players (“real types”) from those that are a byproduct of players’ hierarchies (“imaginary types”) by introducing the notion of separating type structure. We characterize the behavioral implications of RCSBR for the real types across all separating type structures via a family of subsets of Full Strong Best-Reply Sets of Battigalli & Friedenberg (2012). By allowing misalignment, in dynamic games we can obtain behavioral predictions inconsistent with RCSBR (in the standard framework), contrary to the case of belief-based analyses for static games—a difference due to the dichotomy “non-monotonic vs. monotonic” reasoning.
Working Paper: arXiv
Binary Classification Tests, Imperfect Standards, and Ambiguous Information
Covid Economics 66 (2021), pp. 1–36
Superseded by Binary Classifiers as Dilations (see above).
Abstract: New binary classification tests are often evaluated relative to another, established test. For example, rapid Antigen tests for the detection of SARS-CoV-2 are assessed relative to the more established PCR tests. In this paper, I argue that the new test (i.e., the Antigen test) provides ambiguous information and therefore allows for a phenomenon called dilation, an extreme form of non-informativeness. As an example, I present hypothetical test data satisfying the WHO's minimum quality requirements for rapid Antigen tests that nevertheless lead to dilation. The ambiguity in the information arises from a missing data problem due to the imperfection of the established test: the joint distribution of true infection status and test results is not observed. Using results from copula theory, I construct the (usually non-singleton) set of all possible joint distributions, which allows me to assess the new test's informativeness. This analysis leads to a simple sufficient condition ensuring that a new test is not a dilation. Finally, I illustrate my approach with applications to actual test data. Rapid Antigen tests satisfy my sufficient condition easily and are therefore informative. Other approaches, such as chest CT scans, may be more problematic.
Working Paper: arXiv or Covid Economics
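The copula-theoretic step has a simple finite-dimensional core: the Fréchet–Hoeffding inequalities bound any joint probability by its marginals. The sketch below is a minimal hypothetical illustration of those bounds, not the paper's full construction; the numbers are made up.

```python
def frechet_bounds(p, q):
    """Frechet-Hoeffding bounds on P(A and B) given marginals P(A)=p, P(B)=q."""
    return max(0.0, p + q - 1.0), min(p, q)

def conditional_bounds(p, q):
    """Implied bounds on P(A | B), obtained by dividing the joint bounds by q."""
    lo, hi = frechet_bounds(p, q)
    return lo / q, hi / q

# Hypothetical numbers: P(new test positive) = 0.3, P(truly infected) = 0.4.
joint = frechet_bounds(0.3, 0.4)       # (0.0, 0.3)
sens = conditional_bounds(0.3, 0.4)    # (0.0, 0.75): P(positive | infected)
```

Because the joint distribution is unobserved, the implied sensitivity interval here spans everything from a useless to a fairly accurate test, which is exactly the kind of ambiguity that can produce dilation.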
How many people are infected? A case study on SARS-CoV-2 prevalence in Austria
Abstract: Using recent data from voluntary mass testing, I provide credible bounds on the prevalence of SARS-CoV-2 in Austrian counties in early December 2020. When estimating prevalence, a natural missing data problem arises: no test results are generated for non-tested people. In addition, tests are not perfectly predictive of the underlying infection. This is particularly relevant for mass SARS-CoV-2 testing, which is conducted with rapid Antigen tests that are known to be somewhat imprecise. Using insights from the literature on partial identification, I propose a framework that addresses both issues at once and use it to study different selection assumptions for the Austrian data. Whereas weak monotone selection assumptions provide limited identification power, reasonably stronger assumptions reduce the uncertainty about prevalence significantly.
Working Paper: arXiv
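The worst-case logic behind such bounds is short to write down. The sketch below is an illustrative Manski-style bound combining the two issues (missing data on the untested, imperfect tests among the tested); all inputs are hypothetical assumptions, not the paper's Austrian estimates.

```python
def prevalence_bounds(N, n_tested, n_positive,
                      ppv_lo, ppv_hi, fomr_lo, fomr_hi):
    """Worst-case (Manski-style) bounds on population prevalence.

    Combines two sources of uncertainty:
      * missing data: the untested may all be uninfected (lower bound)
        or all be infected (upper bound);
      * test imperfection: the infection share among positives (PPV) and
        among negatives (false-omission rate) is only bounded.
    All bound inputs are illustrative assumptions, not estimates.
    """
    n_negative = n_tested - n_positive
    n_untested = N - n_tested
    lower = (n_positive * ppv_lo + n_negative * fomr_lo) / N
    upper = (n_positive * ppv_hi + n_negative * fomr_hi + n_untested) / N
    return lower, upper

# Hypothetical county: 1000 residents, 600 tested, 30 positives.
bounds = prevalence_bounds(1000, 600, 30,
                           ppv_lo=0.8, ppv_hi=1.0,
                           fomr_lo=0.0, fomr_hi=0.05)
```

With these weak assumptions the interval is wide, roughly [0.02, 0.46], because the 400 untested residents dominate the upper bound; stronger selection assumptions about who chooses to get tested are what shrink it.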