Reblog of a post by Ernesto Galvão, first published at Quantum Rio.

—————————————————

Today we had the first day of the VI Quantum Information Workshop in Paraty, a delightful little colonial town in Rio de Janeiro state. This is the sixth instalment of this popular event, marking ten years since the first one. After last week's School, this week's Workshop features shorter research talks and a poster session (coming up on Tuesday).

The first morning session was dedicated to quantum optical experiments. Gabriela Lemos (IIP, Natal) had the honor of giving the first talk, in which she revisited the Zou-Wang-Mandel setup with two aligned PDC crystals, to discuss conceptual issues that arise when using a single photon (instead of the original intense beam). Her work also explores how the interference fringes observed may explain the low visibility of previous experiments, and help map the transverse momentum correlations of each source.

Next, Antonio Zelaquett Khoury (UFF) showed how classical laser beams in the paraxial approximation can be described using complex vector spaces, giving rise to non-separability akin to quantum entanglement. These non-separable vortex beams violate CHSH-type and GHZ inequalities, in an analogue of the corresponding quantum mechanical predictions. Instead of physically separate objects, the measurements here are made on different degrees of freedom of the same beam. Zelaquett also described how an analogue of quantum teleportation can be used to transfer states from polarization to the transverse structure of the classical beam.
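To get a feel for the kind of violation involved, here is a minimal numerical sketch (not Zelaquett's actual experiment): a maximally non-separable "Bell-like" state over two two-dimensional degrees of freedom (say, polarization and transverse mode), measured with the standard CHSH settings that reach the Tsirelson bound 2√2.

```python
import numpy as np

# Pauli observables acting on the two degrees of freedom
# (e.g. polarization and transverse mode of a single beam).
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Maximally non-separable ("Bell-like") state (|H,0> + |V,1>)/sqrt(2).
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

def corr(a, b):
    """Expectation value of the joint observable a (x) b in the state psi."""
    return np.real(psi.conj() @ np.kron(a, b) @ psi)

# Standard CHSH measurement settings achieving the Tsirelson bound.
A0, A1 = Z, X
B0 = (Z + X) / np.sqrt(2)
B1 = (Z - X) / np.sqrt(2)

S = corr(A0, B0) + corr(A0, B1) + corr(A1, B0) - corr(A1, B1)
print(S)  # ~2.828, i.e. 2*sqrt(2) > 2: the CHSH bound is violated
```

The local (separable) bound is |S| ≤ 2, so the value 2√2 certifies non-separability of the state, whether the two tensor factors are separate particles or two degrees of freedom of one beam.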

Werner Vogel was the third speaker, and described a series of papers analysing non-classicality of various states of light, using quasi-probability distributions such as the Glauber-Sudarshan P function, and a regularized version of it, proposed by Vogel himself. He described ingenious ways to use unbalanced homodyne correlation measurements to directly probe this non-classicality, finishing with an example of states which have no quantum discord or entanglement, but which are nevertheless non-classical according to this criterion.
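A simple signature of this kind of non-classicality (not Vogel's regularized P-function method, just a textbook consequence of it) is sub-Poissonian photon statistics: any state with a nonnegative P function has Mandel Q ≥ 0, so Q < 0 certifies that the P function fails to be a valid probability density.

```python
import math
import numpy as np

def mandel_q(probs):
    """Mandel Q = (var(n) - <n>) / <n> from a photon-number distribution p(n).
    A state with a nonnegative (classical) P function always has Q >= 0,
    so Q < 0 certifies nonclassicality (sub-Poissonian light)."""
    n = np.arange(len(probs))
    mean = float(np.sum(n * probs))
    var = float(np.sum(n**2 * probs)) - mean**2
    return (var - mean) / mean

# Coherent light: Poissonian statistics, Q = 0 (the classical boundary).
mean_n = 2.0
poisson = np.array([math.exp(-mean_n) * mean_n**k / math.factorial(k)
                    for k in range(40)])

# Single-photon Fock state: Q = -1, maximally sub-Poissonian.
fock1 = np.array([0.0, 1.0])

print(mandel_q(poisson))  # ~0.0
print(mandel_q(fock1))    # -1.0: no nonnegative P function can reproduce this
```

Note this is only a sufficient criterion: some nonclassical states have Q ≥ 0, which is why regularized P functions and correlation measurements of the kind Vogel described are needed in general.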

The next two talks reported either experimental results or proposals for experiments. Both involve quantum dots, those nifty artificial atoms that can be made to order using semiconductors. Pierre-Louis de Assis (UNICAMP) described an interesting hybrid optomechanical system, in which light, a mechanical oscillator and a quantum dot interact in non-trivial and controllable ways. His nanowires contain quantum dots at the bottom, and when they vibrate, the mechanical strain changes the absorption-emission properties of the dots. This can be used to accurately measure the position of the dots even when they are deeply immersed in the substrate, which may be useful in applications such as indistinguishable single-photon sources.

Emil Denning (Technical University of Denmark) described charged quantum dots. When excited using a single-photon source (and under a certain magnetic field configuration), they may deterministically create either Bell pairs or GHZ states of three photons, potentially useful for measurement-based quantum computation protocols.

The last talk before lunch was by Francesco Tacchino (Pavia), who described quantum models for the (admittedly, very complex) dynamics of the Q cycle, responsible for electron and proton transfer through biological membranes. The techniques are typical of quantum optics (master equations and the like), but interestingly they also apply to these biological systems. In future work, he intends to investigate the possible role quantum effects play in these systems, a hot topic, with earlier results on simpler systems involved in energy transfer in photosynthesis.

After scattering around Paraty for lunch and physics discussions, it was time for the afternoon session.

Gustavo Lima described a new class of Bell-type inequalities capable of detecting, in a device-independent way, whether measurements with more than two outcomes were employed. These inequalities were first proposed by Vertesi and Bene, who worked with Gustavo to obtain an inequality which is more robust to noise, and hence implementable in the lab. The setup consists of ultra-bright PDC sources of entangled photons, and special polarizers and other equipment to prevent losses, as a high visibility is required. Gustavo also described an application for the setup, which is to create more than one random bit per entangled pair measured.

The next talk, by Jonatan Bohr Brask (Genève), also addressed quantum random number generators. The simplest QRNGs require knowledge about the system: dark counts, detection efficiency etc. We'd like to obtain random numbers without this information, and without trusting the manufacturers of the devices. Completely device-independent tests, however, require Bell inequality violation with high detection efficiency, which is hard to implement. The goal is to find a good trade-off between how much trust we need, and the random number generation rate.

In 2014 they used a nonlinear dimension witness to generate partially device-independent random numbers. More recently, they proposed a second approach using unambiguous state discrimination, which was the main topic of this talk. The intuition is relatively simple: we have states which are not perfectly distinguishable, and experimentally implement an unambiguous discrimination protocol with near-optimal inconclusive rate. Then the occurrence of inconclusive outcomes must be random; otherwise, the discrimination rate would exceed the allowed bound. Open question for theorists: can this approach be extended to quantum key distribution?
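The bound in question is easy to see in the simplest case. A minimal sketch (with a hypothetical pair of signal states, not the states used in Brask's experiment): for two pure states with equal priors, a measurement can identify each state with zero error, but must then return "inconclusive" with probability at least the overlap |⟨ψ0|ψ1⟩| (the Ivanovic-Dieks-Peres bound).

```python
import numpy as np

theta = np.pi / 8  # hypothetical half-angle between the two signal states
psi0 = np.array([np.cos(theta), np.sin(theta)])
psi1 = np.array([np.cos(theta), -np.sin(theta)])
s = abs(psi0 @ psi1)  # state overlap |<psi0|psi1>|

def perp(v):
    """Vector orthogonal to v in 2D."""
    return np.array([-v[1], v[0]])

# Optimal unambiguous-discrimination POVM for equal priors:
c = 1.0 / (1.0 + s)
E0 = c * np.outer(perp(psi1), perp(psi1))  # "it was psi0": never fires on psi1
E1 = c * np.outer(perp(psi0), perp(psi0))  # "it was psi1": never fires on psi0
Einc = np.eye(2) - E0 - E1                 # inconclusive outcome

# Conclusive outcomes are error-free; the price is the inconclusive rate.
assert abs(psi1 @ E0 @ psi1) < 1e-12
assert abs(psi0 @ E1 @ psi0) < 1e-12

p_inc = psi0 @ Einc @ psi0
print(p_inc, s)  # equal: the inconclusive rate saturates the IDP bound
```

An eavesdropper or faulty device that made the inconclusive outcomes predictable would have to beat this bound, which is forbidden by quantum mechanics: that is the intuition behind extracting certified randomness from them.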

Daniel Cavalcanti (ICFO) presented his recent work on efficient entanglement witnesses for multi-qubit states. This is motivated by the complexity of current experiments, which feature states of a dozen or more qubits, whose characterization demands new theoretical tools. They characterized a set of correlations that includes the local set, and whose bounds can be found in a practical way using semi-definite programming.

One can choose how many bodies appear in the observables (even though this may be problematic, as in the case of GHZ states). As an important side-product, the dual SDP problem provides a Bell inequality which is practical for witnessing multi-body entanglement. Numerical tests indicate the witness is robust even with randomly chosen measurements, and can be used for up to about 30 qubits.
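As a much smaller cousin of these techniques (a textbook two-qubit example, not Cavalcanti's multi-qubit SDP method), here is how a witness can be extracted from the partial transpose: a negative eigenvalue of ρ^{T_B} directly yields an operator W with Tr(Wσ) ≥ 0 on all separable σ but Tr(Wρ) < 0.

```python
import numpy as np

# Two-qubit Bell state |Phi+> = (|00> + |11>)/sqrt(2)
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(phi, phi)

def partial_transpose(r):
    """Transpose the second qubit of a 4x4 two-qubit matrix."""
    r4 = r.reshape(2, 2, 2, 2)               # r4[i, k, j, l] = <ik| r |jl>
    return r4.transpose(0, 3, 2, 1).reshape(4, 4)  # swap k <-> l

# Separable states keep a PSD partial transpose; entangled states need not.
evals, evecs = np.linalg.eigh(partial_transpose(rho))
v = evecs[:, 0]                           # eigenvector of the negative eigenvalue

# W = (|v><v|)^{T_B}: Tr(W sigma) >= 0 for every separable sigma,
# because Tr(W sigma) = <v| sigma^{T_B} |v| and sigma^{T_B} is PSD.
W = partial_transpose(np.outer(v, v))
print(np.trace(W @ rho))  # -0.5: negative, so rho is certified entangled
```

The many-body setting is harder precisely because such brute-force spectral constructions scale exponentially, which is what motivates the SDP relaxations and few-body observables in the work described above.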

Jacques Pienaar (IIP, Natal) introduced the topic of his talk, which was causal inference. He started by explaining how intervention enables us to rule out possible causal explanations for a phenomenon (for example, to test whether watching TV causes both smoking in youth and cancer in old age, or whether smoking in youth is itself the factor that causes cancer later in life).

Different causal hypotheses may imply different conditional independence (CI) constraints. The difference introduced by quantum systems is that they satisfy fewer CI relations than classical systems with the same causal structure, due to their non-factorizability. We can force the quantum system to factorize by using projective measurements. If the quantum measurement is informationally complete, it is possible to find the function that updates our counterfactual distribution to the real distribution. Jacques showed some examples (simple causal structure, common cause), and how this works out for simple informationally complete measurements. This prompts a few open questions regarding the role of QBism in this task, arguably one possible practical application of these ideas. It also raises the question of what should count as an intervention in quantum mechanics, as most of the literature doesn't count quantum measurement as an intervention.

To close up the first day of the Workshop, Rafael Chaves (IIP, Natal) talked about quantum instrumental tests. (Rafael was kind enough to send me a link to a video of a talk of his on this subject.) His introduction mentioned that the theory of causal inference was developed only in the 1990s, despite Bell's pioneering results of the 1960s. In 2015, Ried, Spekkens et al. showed that in the quantum case, one can distinguish direct causation from common cause using observational data only (and no intervention). This is known to be impossible classically, so this is an example of the advantage provided by quantum systems in causal inference.

Then Rafael explained the instrumental scenario: does A cause B, or are all the correlations between them due to a common ancestor (which can be thought of as a hidden variable)? To find out, one would think we need intervention. However, due to a result by Balke and Pearl (JASA 1997), we can estimate the effect of interventions without actually intervening. The study of the quantum causal version of the instrumental scenario changes the interpretation of the violation of this instrumental inequality. Also, surprisingly, Rafael and collaborators showed that QM is not only incompatible with local hidden-variable theories, but also with non-local HVTs with measurement dependence, i.e. when Bob's measurement is determined by Alice's hidden variables. Rafael seemed genuinely excited about the new possibilities envisioned for the study of quantum causal models, and invited us all to visit and collaborate in Natal!
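For readers unfamiliar with the classical side of this story, Pearl's instrumental inequality is easy to state and check numerically. A small sketch (with hand-made illustrative distributions, not data from the talk): for binary instrument Z, treatment X and outcome Y, any classical instrumental model obeys max_x Σ_y max_z P(X=x, Y=y | Z=z) ≤ 1, so a larger value rules out the instrumental causal structure without any intervention.

```python
def pearl_instrumental_max(p):
    """Left-hand side of Pearl's instrumental inequality,
    max_x sum_y max_z P(x, y | z), with p[z][x][y] = P(X=x, Y=y | Z=z).
    Classical instrumental models satisfy LHS <= 1."""
    return max(
        sum(max(p[z][x][y] for z in range(2)) for y in range(2))
        for x in range(2)
    )

# Perfect compliance (X = Z) with Y copying X: a valid instrumental model.
ok = [[[0.0] * 2 for _ in range(2)] for _ in range(2)]
ok[0][0][0] = 1.0   # z=0: X=0, Y=0
ok[1][1][1] = 1.0   # z=1: X=1, Y=1

# Z flips Y directly while X stays 0: no instrumental model can produce this.
bad = [[[0.0] * 2 for _ in range(2)] for _ in range(2)]
bad[0][0][0] = 1.0  # z=0: X=0, Y=0
bad[1][0][1] = 1.0  # z=1: X=0, Y=1

print(pearl_instrumental_max(ok))   # 1.0: inequality satisfied
print(pearl_instrumental_max(bad))  # 2.0: inequality violated
```

The quantum version of this scenario, as described in the talk, changes what a violation of such inequalities can mean, which is exactly where the reinterpretation discussed by Rafael comes in.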