The Quantum Certification Conference (QUACC), to be held on November 6-8, 2023 at the Center for Theoretical Physics PAS, Warsaw (Poland), aims to attract researchers working on quantum certification in the broadest sense, to inspire lively scientific discussions, and to foster new collaborations. During the conference, attendees will have an opportunity to contribute a short talk (20 + 5 min) or a poster, as well as to listen to invited talks (45 + 5 min) by some of the most prominent researchers in the field.
QUACC is part of the QuantERA project “Verification of Quantum Technologies, Applications and Systems” (veriqtas.cft.edu.pl), and so it also provides an opportunity for the research groups forming the VERIqTAS consortium to meet.
The conference fee is 50€.
Participants can apply to the Organizers for a fee reduction or a fee waiver. Participants from CFT (employees and PhD students) do not have to cover the fee.
Please contact us at quacc2023@cft.edu.pl in case you need any assistance.
Warning!
We have confirmed email scams targeting participants of QUACC 2023. Please ignore emails from "ops@travellerpoint dot org" or from any other travel agency. Do not reply to these emails and do not click any links they contain.
Organizers (LOC):
Remigiusz Augusiak (Center for Theoretical Physics PAS, Warsaw)
Owidiusz Makuta (Center for Theoretical Physics PAS, Warsaw)
Alexandre Orthey (Center for Theoretical Physics PAS, Warsaw)
Scientific Committee:
Antonio Acín (ICFO, Castelldefels)
Remigiusz Augusiak (Center for Theoretical Physics PAS, Warsaw)
Omar Fawzi (Inria, Lyon)
Laura Mančinska (University of Copenhagen)
Miguel Navascués (IQOQI, Vienna)
Stefano Pironio (Free University of Brussels)
Information causality was initially proposed as a physical principle aimed at deriving the predictions of quantum mechanics for the types of correlations observed in Bell experiments. In the same work, information causality was famously shown to imply the Uffink inequality, which approximates the set of quantum correlations and recovers Tsirelson's bound on the Clauser-Horne-Shimony-Holt (CHSH) inequality. This result has seen only limited generalizations, owing to the difficulty of deducing the implications of the information causality principle for the set of nonlocal correlations. In this paper, we present a simple technique for obtaining polynomial inequalities from information causality that bound the set of physical correlations in any Bell scenario. To demonstrate our method, we derive a family of inequalities that non-trivially constrain the set of nonlocal correlations in Bell scenarios with binary outcomes and an equal number of measurement settings. Finally, we propose an improved statement of the information causality principle, obtain tighter constraints for the simplest Bell scenario that go beyond the Uffink inequality, and recover a part of the boundary of the quantum set.
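For context, the implication mentioned in the abstract can be written out explicitly in standard notation (the symbols E_{xy} for the two-party correlators are our choice, not taken from the abstract):

```latex
% Uffink's quadratic inequality for the correlators E_{xy} = \langle A_x B_y \rangle:
\begin{align*}
  \left(E_{00} + E_{11}\right)^2 + \left(E_{01} - E_{10}\right)^2 &\le 4 .
\end{align*}
% By the Cauchy--Schwarz inequality, u + v \le \sqrt{2}\,\sqrt{u^2 + v^2},
% so a symmetry-equivalent form of the CHSH expression obeys Tsirelson's bound:
\begin{align*}
  S = E_{00} + E_{11} + E_{01} - E_{10}
    &\le \sqrt{2}\,\sqrt{\left(E_{00}+E_{11}\right)^2 + \left(E_{01}-E_{10}\right)^2}
     \le 2\sqrt{2} .
\end{align*}
```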
In this talk, based on joint work with Michael Walter, I want to take a deeper dive into the Huang-Kueng-Preskill shadow tomography protocol. I will address two problems: (1) whether it is statistically disadvantageous to repeat randomly sampled circuits, and (2) whether the median-of-means estimator can be replaced with a regular mean estimator without losing exponential concentration (which is required for the seemingly magical logarithmic sample complexity of shadow tomography). We will see that in both cases the answer depends strongly on the underlying gate set, even when that gate set is already a 3-design. In particular, in both cases the Clifford group performs poorly, while fully Haar-random gates perform well. We also consider efficiently constructible circuit families that interpolate between these two behaviours. On the technical side, we lean strongly on Weingarten calculus and its recently developed Clifford counterpart. We give upper and lower bounds on the moments of stabilizer-state estimators, which might be of independent interest.
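As a side illustration of the estimator under discussion (not of the shadow tomography protocol itself), here is a minimal median-of-means sketch; the data, outliers, and group count are arbitrary choices for demonstration:

```python
import random
import statistics

def median_of_means(samples, num_groups):
    """Split samples into num_groups equal groups, average each group,
    and return the median of the group means (robust to rare outliers)."""
    k = len(samples) // num_groups
    group_means = [
        sum(samples[g * k:(g + 1) * k]) / k
        for g in range(num_groups)
    ]
    return statistics.median(group_means)

random.seed(0)
# Gaussian data with true mean 1.0, plus a few gross outliers.
data = [random.gauss(1.0, 1.0) for _ in range(3000)] + [1e6] * 3
random.shuffle(data)

est = median_of_means(data, 11)
```

The point of the median step: at most three of the eleven groups can be contaminated by the three outliers, so the median of the group means stays near 1.0, whereas the plain mean is dragged to roughly 1000.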
Device-independent (DI) cryptography claims to provide secure communication without any trust in the devices that the parties are using. It is a beautiful concept, but hardly a practical one. Current experimental realizations are very slow and have limited range, which makes them useless in real-life applications, and no developments are in sight that would change this in the foreseeable future. One possible solution lies in semi-DI cryptography, which makes some assumptions about the devices used. These can include a bound on the dimension of the Hilbert space of the communicated system, or trust in some of the devices used. However, such assumptions are very difficult to justify. In this presentation I propose two different DI protocols whose assumptions are easy to justify in practice. These solutions combine the security of full DI with the greater ease of implementation of semi-DI.
The ability to manipulate large nonclassical quantum systems is a key requirement for quantum technologies. As most experimental quantum platforms are still far from any useful practical quantum advantage, e.g. for quantum computing, certifying this ability is an important benchmark for assessing the progress of these technologies. This can be done using the nonlocal nature of quantum correlations, which makes it possible to certify an untrusted experimental apparatus from its input/output behaviour in a device-independent way. It first requires introducing the concept of Genuine Multipartite Nonlocality (GMNL) of size n, which designates systems whose nonlocality cannot be understood as obtained from many states composed of n − 1 (or fewer) constituents.
The first historical definition of GMNL, proposed by Svetlichny, is ill-suited to assessing the large nonclassical nature of quantum systems, as it predicts that maximally GMNL states can be obtained from bipartite sources only. A more appropriate redefinition of the concept, called LOSR-GMNL, was proposed recently [arXiv:2105.09381]. However, it is not satisfactory in all experimental situations, as it cannot (by design) capture potential communication between the subsystems that could occur in some realistic experimental systems (e.g., many-body systems), which Svetlichny's definition does capture, albeit in a naïve way.
In this talk, I will propose a new alternative redefinition solving this issue, called Communication-Genuine Multipartite Nonlocality of length t (C-GMNL). It is based on a model inspired by synchronous distributed computing that involves t communication steps along a graph. I will then show that (i) the GHZ state is maximally nonlocal according to the C-GMNL definition, (ii) the cluster state is trivial under the C-GMNL definition, but (iii) the cluster state is maximally difficult under the LOSR-GMNL definition. Hence, some complicated LOSR-GMNL states become trivial when a small amount of communication is allowed.
Based on joint work in preparation with Xavier Coiteux-Roy, Owidiusz Makuta, Fionnuala Curran and Remigiusz Augusiak.
Authentication of quantum sources is a crucial task in building reliable and efficient protocols for quantum-information processing. Recent years have seen steady progress on the verification of quantum devices in the scenario with fully characterized measurement devices. When it comes to the scenario with uncharacterized measurements, the so-called black-box scenario, practical verification methods are still rather scarce. The development of self-testing methods is an important step forward, but so far these results have been used for reliable verification only by considering the asymptotic behavior of large, independent and identically distributed (IID) samples of a quantum resource. Such strong assumptions deprive the verification procedure of its truly device-independent character. In this paper, we develop a systematic approach to device-independent verification of quantum states free of IID assumptions in the finite-copy regime. Remarkably, we show that device-independent verification can be performed with optimal sample efficiency. Finally, for the case of independent copies, we develop a device-independent protocol for quantum state certification: a protocol in which a fragment of the resource copies is measured to warrant that the rest of the copies are close to some target state.
Many problems in quantum information theory can be formulated as optimizations over the sequential outcomes of dynamical systems subject to unpredictable external influences. Such problems include many-body entanglement detection through adaptive measurements, computing the maximum average score of a preparation game over a continuous set of target states, and limiting the behavior of a (quantum) finite-state automaton. In this talk, I will show how to formulate tractable relaxations of this class of optimization problems. To illustrate their performance, we use them to compute the probability that a finite-state automaton outputs a given sequence of bits and to develop a new many-body entanglement detection protocol. As we further show, the maximum score of a sequential problem in the limit of infinitely many time steps is in general incomputable. Nonetheless, we provide general heuristics to bound this quantity and show that they provide useful estimates in relevant scenarios.
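One of the tasks named above, computing the probability that a finite-state automaton outputs a given bit sequence, has a simple classical (non-quantum) analogue that can be sketched in a few lines; the two-state automaton and its matrices below are invented purely for illustration:

```python
import numpy as np

# Hypothetical two-state probabilistic automaton: T[b][j, i] is the
# probability of moving from state i to state j while emitting bit b.
# The columns of T[0] + T[1] sum to one, so the overall transition
# matrix is stochastic.
T = {
    0: np.array([[0.5, 0.2],
                 [0.1, 0.3]]),
    1: np.array([[0.3, 0.1],
                 [0.1, 0.4]]),
}

def output_probability(bits, initial=(1.0, 0.0)):
    """Probability that the automaton emits the given bit sequence:
    propagate the state distribution through one matrix per bit and
    sum the surviving weight at the end."""
    dist = np.array(initial)
    for b in bits:
        dist = T[b] @ dist
    return float(dist.sum())

p = output_probability([0, 1, 1])
```

Summing `output_probability` over all bit strings of a fixed length returns 1, which is a quick sanity check that the pair of matrices defines a valid automaton.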
A ubiquitous problem in physics is understanding the ground-state properties of classical and quantum many-body systems. It is also one of the main applications of first-generation quantum computing devices, such as quantum optimisers or simulators. Classically, since an exact solution quickly becomes too costly as the system size increases, variational approaches are often employed as a scalable alternative: the energy is minimised over a subset of all possible states, and various physical quantities are then computed on the solution state. Strictly speaking, however, all that these methods provide are provable upper bounds on the ground-state energy. Relaxations of the ground-state problem based on semidefinite programming represent a complementary approach, providing lower bounds on the ground-state energy but, again, no provable bound on any other relevant quantity. We first discuss how these relaxations can be useful to benchmark the performance of quantum optimisers. After that, we show how relaxations, when assisted with an energy upper bound, can be used to derive certifiable bounds on the value of any physical parameter at the ground state, such as correlation functions of arbitrary degree or structure factors. We illustrate the approach in paradigmatic examples of 1D and 2D spin-1/2 systems.
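The gap between a variational upper bound and the true ground-state energy can already be seen in a two-qubit toy model (the Hamiltonian and the product-state ansatz below are illustrative choices, not taken from the talk):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Two-qubit antiferromagnetic Heisenberg Hamiltonian.
H = np.kron(X, X) + np.kron(Y, Y) + np.kron(Z, Z)

# Exact ground-state energy: the singlet gives -3.
exact = float(np.linalg.eigvalsh(H)[0])

def product_energy(ta, tb):
    """Energy of the real product ansatz |a> x |b>; always an upper bound."""
    a = np.array([np.cos(ta), np.sin(ta)])
    b = np.array([np.cos(tb), np.sin(tb)])
    psi = np.kron(a, b).astype(complex)
    return float((psi.conj() @ H @ psi).real)

# Crude grid search over the product-state ansatz.
thetas = np.linspace(0.0, np.pi, 181)
upper = min(product_energy(ta, tb) for ta in thetas for tb in thetas)
```

Here the product-state minimum is -1 while the exact ground-state energy is -3: the variational method certifies only the upper bound, which is exactly the limitation the abstract describes.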
In this work we study the phenomenon of self-testing from first principles, aiming to place this versatile concept on a rigorous mathematical footing. Self-testing allows a classical verifier to infer a quantum mechanical description of untrusted quantum devices that she interacts with in a black-box manner. Somewhat contrary to the black-box paradigm, existing self-testing results tend to presuppose conditions that constrain the operation of the untrusted devices. A common assumption is that these devices perform a projective measurement of a pure quantum state. Naturally, in the absence of any prior knowledge it would be appropriate to model these devices as measuring a mixed state using POVM measurements, since the purifying/dilating spaces could be held by the environment or an adversary.
We prove a general theorem that allows these assumptions to be removed, thereby promoting most existing self-testing results to their assumption-free variants. On the other hand, we pinpoint situations where the assumptions cannot be lifted without loss of generality. As a key (counter)example we identify a quantum correlation which is a self-test only if certain assumptions are made. Remarkably, this is also the first example of a correlation that cannot be implemented using projective measurements on a bipartite state of full Schmidt rank. Finally, we compare existing self-testing definitions, establishing many equivalences as well as identifying subtle differences.
We consider the problem of reliable information transmission between parties connected by a classical noisy communication channel. We explore the gains in communication capacity that can be achieved by making use of shared entanglement and non-signalling correlations between the parties.
Based on https://arxiv.org/abs/1508.04095, https://arxiv.org/abs/2206.10968 and https://arxiv.org/abs/2310.05515.
We consider non-commutative optimization problems where a subset of the variables are the solutions of a system of ordinary differential equations on operators. These problems appear in quantum information theory when, e.g., we wish to extrapolate an experimental time series, or mitigate the error of a noisy intermediate-scale quantum device. We show that any such problem can be relaxed to a non-commutative polynomial optimization (NPO) problem, which in turn can be solved using standard techniques. We derive sufficient conditions to guarantee that the NPO relaxation is equivalent to the original problem. Finally, we present numerical results that demonstrate the power of this new method.
Semidefinite programming relaxation methods have been introduced for polynomial optimization in both the fully commutative and the fully non-commutative case. I will first recall how to generically implement polynomial equality constraints in such methods by performing computations modulo the ideal generated by the equality constraints, using Groebner bases. In principle, commutation relations between subsets of operators, such as [A,B] = AB − BA = 0, can also be dealt with in this way. However, one faces two problems: (i) even for such simple constraints, the associated Groebner basis might be infinite, and (ii) even when it is finite, a naive Groebner basis reduction is typically time-consuming. I will then present an alternative approach based on a new representation of monomials that takes partial commutation relations into account from the beginning, and which generalizes the fully commutative and fully non-commutative cases to the entire spectrum of possibilities in between.
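In the fully commutative case, the "computation modulo the ideal" that the abstract alludes to can be sketched with SymPy (the ideal below is a toy example chosen for illustration):

```python
from sympy import expand, groebner, symbols

x, y = symbols('x y')

# Toy equality constraint x**2 = 1; the ideal it generates has the
# Groebner basis {x**2 - 1} in lexicographic order.
G = groebner([x**2 - 1], x, y, order='lex')

# Reduce a polynomial modulo the ideal:
#   x**3 + y = x * (x**2 - 1) + (x + y),
# so its canonical representative is x + y.
quotients, remainder = G.reduce(x**3 + y)
```

In an SDP relaxation this reduction shrinks the monomial basis: every occurrence of x**2 can be replaced by 1 before the moment matrix is built, which is the mechanism the talk extends to partial commutation relations.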