Authors:

(1) Lorenzo Catani, International Iberian Nanotechnology Laboratory, Av. Mestre Jose Veiga s/n, 4715-330 Braga, Portugal ([email protected]);

(2) Thomas D. Galley, Institute for Quantum Optics and Quantum Information, Austrian Academy of Sciences, Boltzmanngasse 3, A-1090 Vienna, Austria and Vienna Center for Quantum Science and Technology (VCQ), Faculty of Physics, University of Vienna, Vienna, Austria ([email protected]);

(3) Tomas Gonda, Institute for Theoretical Physics, University of Innsbruck, Austria ([email protected]).

Abstract and 1. Introduction

  2. Operational theories, ontological models and contextuality

  3. Contextuality for general probabilistic theories

    3.1 GPT systems

    3.2 Operational theory associated to a GPT system

    3.3 Simulations of GPT systems

    3.4 Properties of univalent simulations

  4. Hierarchy of contextuality and 4.1 Motivation and the resource theory

    4.2 Contextuality of composite systems

    4.3 Quantifying contextuality via the classical excess

    4.4 Parity oblivious multiplexing success probability with free classical resources as a measure of contextuality

  5. Discussion

    5.1 Contextuality and information erasure

    5.2 Relation with previous works on contextuality and GPTs

  6. Conclusion, Acknowledgments, and References

A Physicality of the Holevo projection

Abstract

In this work we present a hierarchy of generalized contextuality. It refines the traditional binary distinction between contextual and non-contextual theories, and facilitates their comparison based on how contextual they are. Our approach focuses on the contextuality of prepare-and-measure scenarios, described by general probabilistic theories (GPTs). To motivate the hierarchy, we define it as the resource ordering of a novel resource theory of GPT-contextuality. The building blocks of its free operations are classical systems and GPT-embeddings. The latter are simulations of one GPT by another, which preserve the operational equivalences and thus cannot generate contextuality. Non-contextual theories can be recovered as least elements in the hierarchy. We then define a new contextuality monotone, called classical excess, given by the minimal error of embedding a GPT within an infinite classical system. In addition, we show that the optimal success probability in the parity oblivious multiplexing game also defines a monotone in our resource theory. We end with a discussion of a potential interpretation of the non-free operations of the resource theory of GPT-contextuality as expressing a kind of information erasure.

1 Introduction

Generalized (non)contextuality. A crucial research question in the foundations of quantum theory is to identify those features of quantum theory that constitute a true departure from the classical worldview. Addressing this question requires one to first establish a good notion of classicality, one that adequately captures the classical worldview. We believe that a good notion of classicality should satisfy the following desiderata (see also [1]): (1) it endorses a principle that defines a clear boundary between aspects that pose interpretational issues and those that do not, (2) it has a broad range of applicability, (3) it is empirically testable, and (4) its violation constitutes a resource for practical applications, in particular in quantum information processing.[1] Motivated by these desiderata, a leading notion of classicality is generalized noncontextuality [2].[2] A non-contextual theory is one that is compatible with a classical realist explanation of its operational predictions, that is, it admits of a non-contextual ontological model. In such a model, any two experimental procedures that the theory predicts to be operationally indistinguishable also have the same ontological representation (see Section 2 for a precise definition).

Noncontextuality satisfies (1), in that it is an instance of a methodological principle inspired by Leibniz’s principle of the identity of indiscernibles [3] (also formulated as a no-fine-tuning principle [4]). A violation of such a principle would indeed entail an interpretational problem, as it would attribute a conspiratorial connotation to the realist explanation of the theory: why should experimental procedures that the theory predicts to be indistinguishable in principle be represented by different distributions in the ontological model? Noncontextuality also satisfies (2), as it applies to a wide range of scenarios, including prepare-and-measure experiments on a single system, unlike Bell’s local causality, another leading notion of classicality. Moreover, in situations where these other notions apply, it coincides with notions of classicality such as non-negativity of quasi-probability representations [5, 6] and Bell’s local causality [7]. It satisfies (3), as witnessed by the experiments performed to test quantum violations of generalized noncontextuality [8, 9]. Finally, desideratum (4) is supported by the numerous works showing that contextuality is a resource for information-processing tasks [10–21].

Hierarchy of contextuality. In relation to the last point, contextuality can be used to witness and characterize the advantage of using quantum physical systems in practical applications. In such contexts, it is important to know not just whether a theory is contextual, but also to quantify how contextual it is. Our article aims to address this by introducing a hierarchy of contextuality, in which not all contextual phenomena are equivalent. In particular, it allows us to make more fine-grained distinctions and to quantify the amount of contextuality present in a theory.

In our case, the resource objects are physical theories rather than states, since (non)contextuality is a property of theories. However, as contextuality can be witnessed by individual systems within a theory, we restrict our investigation to individual systems. That is, we identify each resource object with a system in a general probabilistic theory (GPT) [27–30], such as classical probability theory, quantum theory, or a subtheory thereof, like the stabilizer subtheory [31]. A GPT system specifies all possible probabilistic behaviours of the physical system and thus characterizes its information-theoretic properties.

Since our resources are physical theories rather than states of a physical system, the transformations that we consider cannot be standard physical operations, as is the case in traditional resource theories of quantum states. Our operations of interest are simulations: transformations between theories that faithfully encode the information of one theory within another. Simulations that preserve indistinguishability were introduced in [32] as univalent simulations. Since noncontextuality of an ontological model means that the model corresponds to a univalent simulation, one can show [32] that a GPT system is non-contextual if and only if it can be simulated in a univalent way by a classical GPT system, i.e., one whose states form a simplex of probability distributions on a finite sample space. Classical GPT systems thus cannot generate contextuality under univalent simulations. Therefore, we propose the following hierarchy of (generalized) contextuality:

A GPT system B is said to be at least as contextual as a GPT system A if there exists a univalent simulation of A by a composite of B and a classical GPT system.

This hierarchy is the resource ordering of a resource theory whose objects are GPT systems and whose free operations are univalent simulations with free access to classical systems. In Section 5.1, we discuss a possible interpretation of (non-univalent) simulations as expressing a particular kind of information erasure.

Contextuality monotones. As is common in resource theories, the hierarchy of contextuality is not a total order: there are GPT systems such that neither is at least as contextual as the other. It is not even a partial order, because there exist distinct GPT systems that are equivalent; for example, this is the case for all non-contextual systems, such as the classical bit and the classical trit. The hierarchy is therefore given by a preorder, and it cannot be fully represented by a single numerical value (the “amount of contextuality”) assigned to each GPT system. However, in order to capture certain aspects of the hierarchy, it is useful to define quantities that are order-preserving assignments of a number to each resource object. These are called resource monotones.

We define a new contextuality monotone, called the classical excess. It expresses the minimal error of a univalent simulation of a given GPT system by any classical system. In addition, we show that the optimal success probability for the parity-oblivious multiplexing (POM) protocol [6] with free classical systems is also a monotone.
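To give a concrete feel for the POM game mentioned above, the following sketch brute-forces the standard two-bit version: Alice receives two uniformly random bits and may send one classical bit, subject to the constraint that the message reveals nothing about the parity of her input; Bob must guess a requested bit. The enumeration below covers deterministic strategies only (the function and variable names are ours, not from the paper), and recovers the known classical bound of 3/4.

```python
from itertools import product

# x = (x0, x1) uniform; Alice sends one classical bit m = E(x).
# Parity obliviousness: the distribution of m must be the same
# within the even-parity and odd-parity input classes.
inputs = list(product([0, 1], repeat=2))

def parity_oblivious(E):
    # Collect the multiset of messages produced in each parity class.
    dist = {0: [], 1: []}
    for x in inputs:
        dist[x[0] ^ x[1]].append(E[x])
    return sorted(dist[0]) == sorted(dist[1])

best = 0.0
for E_vals in product([0, 1], repeat=4):          # encodings E: x -> m
    E = dict(zip(inputs, E_vals))
    if not parity_oblivious(E):
        continue
    for D_vals in product([0, 1], repeat=4):      # decodings D: (m, y) -> guess
        D = dict(zip(product([0, 1], repeat=2), D_vals))
        # Average success over uniform inputs x and uniform requests y.
        wins = sum(D[(E[x], y)] == x[y] for x in inputs for y in (0, 1))
        best = max(best, wins / 8.0)

print(best)  # prints 0.75, the classical bound
```

Shared randomness over such deterministic parity-oblivious strategies cannot improve this average, whereas the quantum protocol of [6] exceeds it; this gap is what makes the POM success probability a candidate contextuality monotone.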

Previous works on the matter. A resource-theoretic perspective on contextuality has been developed in several previous works. Most of these focus on the Kochen–Specker notion of contextuality [33]. Despite being related to the generalized notion of contextuality (Kochen–Specker noncontextuality is the conjunction of measurement noncontextuality and outcome determinism for sharp measurements), Kochen–Specker contextuality favors certain frameworks that are not appropriate for developing a resource theory of generalized noncontextuality. The work of Abramsky, Barbosa, and Mansfield [34] uses a framework whose main objects are empirical models: tables of data specifying probability distributions over the joint outcomes of sets of compatible measurements. This framework is further developed in [35] and [36], and is based on the sheaf-theoretic approach to contextuality introduced in [37]. Existing quantifications of Kochen–Specker contextuality are based on the memory cost [38], the ratio of contextual assignments [39], the relative entropy and contextual cost [40], the contextual robustness [41], the contextual fraction [34], and the rank of contextuality [42]. A review of several of these approaches towards a resource theory of Kochen–Specker contextuality is presented in [43].

The first work on a resource theory of generalized contextuality in prepare-and-measure scenarios was presented by Duarte and Amaral in [44]. They use the generalized-noncontextual polytope of prepare-and-measure statistics defined in [45] to motivate the set of free operations, and then define monotones based on known resource quantifiers for contextuality and nonlocality. As an application of such a resource theory, [46] uses it to simplify and robustify proofs of contextuality.

On the use of general probabilistic theories. Unlike Duarte and Amaral, we use the framework of GPTs. A GPT system consists of a collection of states, a collection of effects, and a probability assigned to each pair of a state and an effect. The so-called simplex-embeddability condition [47] states that a GPT system is non-contextual if and only if it can be embedded within a classical GPT system. This makes the GPT framework a suitable one for studying generalized contextuality. The simplex-embeddability condition thus characterizes the qualitative divide between contextual and noncontextual GPT systems. Our hierarchy of contextuality can be seen as a refinement thereof: it makes distinctions between contextual systems based on how (and how much) they violate simplex embeddability. In practice, this is achieved by employing the notion of simulations of GPT systems by other GPT systems, as defined in [32]. We dedicate Section 5.2 to the discussion of the relation with other works that study contextuality in GPTs, namely [32, 44, 47–52].
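The structure just described (states, effects, and a probability for each pairing) can be sketched in a few lines of code. The following is a minimal illustration under our own conventions, not an implementation from the paper: states and effects are vectors, and the probability rule is the linear pairing between them, here instantiated for a classical bit, whose state space is the 1-simplex of distributions over two outcomes.

```python
import numpy as np

class GPTSystem:
    """A GPT system: a set of states, a set of effects, and a pairing
    that assigns a probability to each (effect, state) pair."""
    def __init__(self, states, effects):
        self.states = [np.asarray(s, dtype=float) for s in states]
        self.effects = [np.asarray(e, dtype=float) for e in effects]

    def prob(self, effect, state):
        # Probabilities are given by the linear pairing p(e, s) = e . s
        return float(np.dot(self.effects[effect], self.states[state]))

# A classical bit: states are probability distributions over {0, 1}
# (the 1-simplex), effects are the two outcome indicators.
cbit = GPTSystem(
    states=[[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]],
    effects=[[1.0, 0.0], [0.0, 1.0]],
)
print(cbit.prob(0, 2))  # outcome 0 on the uniform state: prints 0.5
```

A quantum system fits the same template with density matrices as states and POVM elements as effects, paired by the trace rule; simplex embeddability asks whether such a system's states and effects can be linearly mapped into those of a (possibly larger) classical system like `cbit` while preserving all probabilities.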

Structure of the paper. In Section 2, we recall the standard treatment of generalized noncontextuality in the framework of operational theories and ontological models. This is connected to the language of general probabilistic theories in Section 3. There, we also describe (univalent) simulations of GPT systems and the excess measure. The hierarchy of contextuality is introduced in Section 4, where we prove that the classical excess is a resource monotone (Theorem 26) and construct a monotone from the parity-oblivious-multiplexing protocol. In Section 5, we discuss a possible interpretation of the non-free operations as involving information erasure and the relation to previous works on contextuality and GPTs. We conclude with a summary of the results and an outline of possible future research directions in Section 6.

2 Operational theories, ontological models and contextuality

In this work we consider prepare-and-measure scenarios associated with a single system. An operational theory associated with a prepare-and-measure scenario is defined by a list of possible preparations, a list of possible measurements, and the probabilities P(k|P, M) of obtaining the outcome k of the measurement M given that the system is prepared with the preparation P. An ontological model of such an operational theory posits a set Λ of ontic states. Each preparation P is represented by a probability distribution (epistemic state) µP over Λ, and each measurement outcome [k|M] is represented by a response function ξk|M : Λ → [0, 1], such that the model reproduces the operational statistics: P(k|P, M) = Σλ ξk|M(λ) µP(λ).

An ontological model is preparation noncontextual if operationally equivalent preparation procedures are represented by identical probability distributions in the ontological model [2]. More formally, two preparation procedures P and P′ are operationally equivalent if they provide the same operational statistics for all possible measurements, i.e., ∀M : P(k|P, M) = P(k|P′, M). In this case, we write P ≃ P′. An ontological model is preparation non-contextual if any two such preparations are represented by the same epistemic states:

P ≃ P′ ⟹ ∀λ : µP(λ) = µP′(λ).

Similarly, two measurement outcomes [k|M] and [k′|M′] are operationally equivalent if they give the same statistics for all possible preparations: P(k|M, P) = P(k′|M′, P) for all preparations P. In this case, we write [k|M] ≃ [k′|M′]. An ontological model is measurement noncontextual if any two such measurement outcomes are represented by the same response functions:

[k|M] ≃ [k′|M′] ⟹ ∀λ : ξk|M(λ) = ξk′|M′(λ).

An operational theory is termed preparation noncontextual (resp. measurement noncontextual) if there exists a preparation noncontextual (resp. measurement noncontextual) ontological model for the theory, while it is termed preparation contextual (resp. measurement contextual) if it does not admit of a preparation noncontextual (resp. measurement noncontextual) ontological model. It was first proven in [2] that quantum theory is preparation contextual and measurement noncontextual when outcome determinism is not assumed.
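The definitions above can be made concrete with a toy example (the names and the particular model are ours, chosen for illustration). Below, an ontological model over two ontic states reproduces the statistics of a classical bit; two differently labelled preparations of the uniform mixture are operationally equivalent, and the model represents them by the same epistemic state, so it is preparation noncontextual on that pair.

```python
import numpy as np

# Ontic state space Λ = {0, 1}.
# µ: each preparation -> epistemic state (distribution over λ).
# Pa and Pb are two procedures preparing the same even mixture.
mu = {"P0": np.array([1.0, 0.0]),
      "P1": np.array([0.0, 1.0]),
      "Pa": np.array([0.5, 0.5]),
      "Pb": np.array([0.5, 0.5])}

# ξ: response functions ξ(k|M, λ); row k, column λ.
# M0 reads the bit; M1 is a fair coin flip that ignores the system.
xi = {"M0": np.array([[1.0, 0.0], [0.0, 1.0]]),
      "M1": np.array([[0.5, 0.5], [0.5, 0.5]])}

def P(k, prep, meas):
    # Operational statistics of the model: Σ_λ ξ(k|M, λ) µ_P(λ)
    return float(xi[meas][k] @ mu[prep])

def operationally_equivalent(p1, p2):
    # P ≃ P′ iff the statistics agree for every measurement and outcome.
    return all(np.isclose(P(k, p1, m), P(k, p2, m))
               for m in xi for k in range(2))

# Pa ≃ Pb, and the model assigns them identical epistemic states.
print(operationally_equivalent("Pa", "Pb"),
      np.allclose(mu["Pa"], mu["Pb"]))  # prints: True True
```

A preparation-contextual model of the same statistics would instead assign distinct distributions to "Pa" and "Pb" despite their operational equivalence; the content of [2] is that quantum theory forces this for some operationally equivalent preparations.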

We note that the notion of generalized contextuality also extends to transformations [2]. In the present work, however, we are concerned only with prepare-and-measure scenarios, and as such we ignore transformation contextuality.

This paper is available on arXiv under a CC BY 4.0 DEED license.

[1] This last desideratum is motivated by the belief that identifying the true non-classicality of quantum theory will ultimately provide the answer to the question about the origin of the alleged quantum computational speed-up.

[2] In what follows we will often omit “generalized” and just talk of “(non)contextuality”.

[3] A preorder is a relation that is both reflexive and transitive. It is a partial order if it is also anti-symmetric. If, in addition, any two elements are related, then it is a total order.