All posts by Peter Rohde

Quantum computer scientist, Mountaineer, Adventurer, Composer, Musician, Public Speaker, DJ

New paper: Scalable boson-sampling with time-bin encoding using a loop-based architecture

Full text here.

We present an architecture for arbitrarily scalable boson-sampling using two nested fiber loops. The architecture has fixed experimental complexity, irrespective of the size of the desired interferometer, whose scale is limited only by fiber and switch loss rates. The architecture employs time-bin encoding, whereby the incident photons form a pulse train, which enters the loops. Dynamically controlled loop coupling ratios allow the construction of the arbitrary linear optics interferometers required for boson-sampling. The architecture employs only a single point of interference and may thus be easier to stabilize than other approaches. The scheme has polynomial complexity and could be realized using demonstrated present-day technologies.
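
As a rough illustration of how a single loop with a dynamically tuned coupler builds up a large interferometer, here is a minimal Python sketch (not the construction from the paper): each pass of the pulse train through the loop applies a 2x2 coupler between adjacent time bins, and repeated passes with varying coupling ratios compose into an m-mode unitary. The mode count, number of passes and coupling angles below are arbitrary illustrative choices.

import numpy as np

# Illustrative sketch only (not the paper's exact construction): a single fiber
# loop with a dynamically tuned coupler acts as a 2x2 beamsplitter between
# adjacent time bins of the pulse train. Repeated passes with varying coupling
# ratios compose into an m-mode interferometer unitary.

m = 6            # number of time bins (optical modes) -- arbitrary
passes = m       # number of passes through the loop -- arbitrary
rng = np.random.default_rng(0)

def coupler(m, i, theta, phi):
    """2x2 coupler (beamsplitter plus phase) acting on time bins i and i+1."""
    T = np.eye(m, dtype=complex)
    T[i, i] = np.cos(theta)
    T[i, i + 1] = -np.exp(1j * phi) * np.sin(theta)
    T[i + 1, i] = np.exp(-1j * phi) * np.sin(theta)
    T[i + 1, i + 1] = np.cos(theta)
    return T

U = np.eye(m, dtype=complex)
for _ in range(passes):
    for i in range(m - 1):
        # dynamically controlled coupling ratio for each pair of pulses
        theta, phi = rng.uniform(0, np.pi / 2), rng.uniform(0, 2 * np.pi)
        U = coupler(m, i, theta, phi) @ U

# U is unitary: the linear-optics transfer matrix seen by the pulse train
print(np.allclose(U @ U.conj().T, np.eye(m)))   # True

In the paper's scheme the second, outer loop is what feeds the pulse train back for these repeated passes; the "passes" loop above plays that role only schematically, while the physical hardware stays fixed regardless of m.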

Comment: Will boson-sampling ever disprove the Extended Church-Turing thesis?

Full text here.

Boson-sampling is a highly simplified, but non-universal, approach to implementing optical quantum computation. It was shown by Aaronson & Arkhipov that this protocol cannot be efficiently classically simulated unless the polynomial hierarchy collapses, which would be a shocking result in computational complexity theory. Based on this, numerous authors have made the claim that experimental boson-sampling would provide evidence against, or disprove, the Extended Church-Turing thesis -- that any physically realisable system can be efficiently simulated on a Turing machine. We argue against this claim on the basis that, under a general, physically realistic independent error model, boson-sampling does not implement a provably hard computational problem in the asymptotic limit of large systems.
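
As a rough numerical illustration of the independent error argument (the error rate and photon numbers below are arbitrary, not figures from the paper): if each photon independently avoids error with probability 1 - eps, the probability that an entire n-photon run is error-free decays exponentially with n, so the ideal boson-sampling distribution is sampled only a vanishing fraction of the time in the asymptotic limit.

# Illustration only: probability that all n photons in a run are error-free
# under an independent per-photon error rate eps (values are arbitrary).
eps = 0.01
for n in [10, 100, 1000, 10000]:
    print(n, (1 - eps) ** n)
# prints roughly 0.90, 0.37, 4e-05, 2e-44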

New paper: Boson sampling with photon-added coherent states

Full paper here.

Boson sampling is a simple and experimentally viable model for non-universal linear optics quantum computing. Boson sampling has been shown to implement a classically hard algorithm when fed with single photons. This raises the question as to whether there are other quantum states of light that implement similarly computationally complex problems. We consider a class of continuous-variable states - photon-added coherent states - and demonstrate the computational complexity of sampling them when they are evolved through linear optical networks and measured via photodetection. We find that, provided the coherent state amplitudes are upper bounded by an inverse polynomial in the size of the system, the sampling problem remains computationally hard.
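
To give a feel for the states involved, the sketch below (illustrative only, not the paper's derivation) computes the Fock-basis amplitudes of a single-photon-added coherent state and shows that as the coherent amplitude shrinks the state approaches a single photon - the input for which boson-sampling hardness was originally established - which is the intuition behind bounding the amplitudes. The Fock cutoff and amplitude values are arbitrary.

import numpy as np
from math import factorial

# Illustrative sketch: Fock amplitudes of a single-photon-added coherent state,
# a^dag |alpha>, normalised by sqrt(1 + |alpha|^2). As alpha -> 0 it approaches
# the single-photon state |1>. Cutoff and alpha values are arbitrary.

def pacs_amplitudes(alpha, cutoff=20):
    amps = np.zeros(cutoff + 1, dtype=complex)
    for n in range(cutoff):              # a^dag maps |n> -> sqrt(n+1) |n+1>
        amps[n + 1] = (np.exp(-abs(alpha) ** 2 / 2) * alpha ** n
                       * np.sqrt(n + 1) / np.sqrt(factorial(n)))
    return amps / np.sqrt(1 + abs(alpha) ** 2)

for alpha in [1.0, 0.3, 0.1, 0.01]:
    amps = pacs_amplitudes(alpha)
    print(f"alpha = {alpha:5.2f}   overlap with |1> = {abs(amps[1]) ** 2:.4f}")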

New paper: Self-avoiding quantum walks

Full paper here.

Quantum walks exhibit many unique characteristics compared to classical random walks. In the classical setting, self-avoiding random walks have been studied as a variation on the usual classical random walk. Classical self-avoiding random walks have found numerous applications, most notably in the modeling of protein folding. We consider the analogous problem in the quantum setting. We complement a quantum walk with a memory register that records where the walker has previously resided. The walker is then able to avoid returning to previously visited sites. We parameterise the strength of the memory recording and the strength of the memory back-action on the walker's motion, and investigate their effect on the dynamics of the walk. We find that by manipulating these parameters the walk can be made to reproduce ideal quantum or classical random walk statistics, or a plethora of more elaborate diffusive phenomena. In some parameter regimes we observe a close correspondence between classical self-avoiding random walks and the quantum self-avoiding walk.
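
For readers unfamiliar with the classical counterpart mentioned above, here is a minimal classical self-avoiding random walk on a 2D lattice (the standard classical model, not the paper's quantum walker with its tunable memory); the step budget and random seed are arbitrary.

import random

# Minimal classical self-avoiding random walk on a 2D lattice. The walker never
# revisits a site and halts early if all neighbouring sites have been visited.

def self_avoiding_walk(max_steps=100, seed=None):
    rng = random.Random(seed)
    pos, visited, path = (0, 0), {(0, 0)}, [(0, 0)]
    for _ in range(max_steps):
        x, y = pos
        neighbours = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        free = [site for site in neighbours if site not in visited]
        if not free:                     # trapped: every neighbour already visited
            break
        pos = rng.choice(free)
        visited.add(pos)
        path.append(pos)
    return path

path = self_avoiding_walk(seed=42)
print(f"walk length: {len(path) - 1}, end point: {path[-1]}")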

New paper: Quantum random walks on congested lattices

Full paper here.

We consider quantum random walks on congested lattices and contrast them to classical random walks. Congestion is modelled with lattices that contain static defects which reverse the walker's direction. We implement a dephasing process after each step which allows us to smoothly interpolate between classical and quantum random walkers as well as study the effect of dephasing on the quantum walk. Our key results show that a quantum walker escapes a finite boundary dramatically faster than a classical walker and that this advantage remains in the presence of heavily congested lattices. We also observe that the quantum walker is extremely sensitive to our model of dephasing.
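
The following Python sketch (a simplified stand-in, not the exact model from the paper) conveys the ingredients: a discrete-time coined quantum walk on a line, "congestion" defects that flip the coin and so reverse the walker, and a per-step coin-dephasing channel whose strength p interpolates between fully coherent (p = 0) and classical-like behaviour. Lattice size, defect positions and p are arbitrary choices.

import numpy as np

# Simplified sketch of a coined quantum walk on a line with defects that flip
# the coin (reversing the walker) and per-step coin dephasing of strength p.
# Lattice size, defect positions and p are arbitrary.

N, steps, p = 41, 20, 0.2
centre = N // 2
defects = {centre + 5, centre - 7}      # hypothetical defect sites
dim = 2 * N                             # basis index = 2*x + c, coin c in {0: left, 1: right}

def idx(x, c):
    return 2 * x + c

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
coin = np.kron(np.eye(N), H)            # Hadamard coin at every site

defect_op = np.eye(dim)                 # X on the coin at defect sites
for x in defects:
    defect_op[2 * x:2 * x + 2, 2 * x:2 * x + 2] = X

S = np.zeros((dim, dim))                # shift: coin 0 left, coin 1 right (periodic)
for x in range(N):
    S[idx((x - 1) % N, 0), idx(x, 0)] = 1
    S[idx((x + 1) % N, 1), idx(x, 1)] = 1

U = S @ defect_op @ coin
Z = np.kron(np.eye(N), np.diag([1, -1]))   # coin dephasing: rho -> (1-p) rho + p Z rho Z

psi = np.zeros(dim, dtype=complex)         # start at the centre with a symmetric coin
psi[idx(centre, 0)], psi[idx(centre, 1)] = 1 / np.sqrt(2), 1j / np.sqrt(2)
rho = np.outer(psi, psi.conj())

for _ in range(steps):
    rho = U @ rho @ U.conj().T
    rho = (1 - p) * rho + p * (Z @ rho @ Z)

prob = np.array([rho[idx(x, 0), idx(x, 0)].real + rho[idx(x, 1), idx(x, 1)].real
                 for x in range(N)])
spread = np.sqrt(np.sum(prob * (np.arange(N) - centre) ** 2))
print(f"rms displacement after {steps} steps (p = {p}): {spread:.2f}")

Sweeping p from 0 upwards makes the spread fall from ballistic (growing linearly with the number of steps) towards diffusive (square-root) growth, which is the quantum-to-classical interpolation described above.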

New paper: Sampling generalized cat states with linear optics is probably hard

Full paper here.

Boson-sampling has been presented as a simplified model for linear optics quantum computing. In the boson-sampling model, Fock states are passed through a linear optics network and sampled via number-resolved photodetection. It has been shown that this sampling problem likely cannot be efficiently classically simulated. This raises the question as to whether there are other quantum states of light for which the equivalent sampling problem is also computationally hard. We present evidence that a very broad class of quantum states of light - arbitrary superpositions of two or more coherent states - when evolved via passive linear optics and sampled with number-resolved photodetection, likely implements a classically hard sampling problem.
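
As a concrete example from this class of states (illustrative only, not part of the paper's argument), the sketch below computes the photon-number distribution a number-resolving detector would record for the simplest generalized cat state, an equal superposition of two coherent states with opposite amplitudes; only even photon numbers appear. The amplitude and Fock cutoff are arbitrary.

import numpy as np
from math import factorial

# Illustrative sketch: photon-number distribution of the even cat state
# (|alpha> + |-alpha>) / norm, with norm^2 = 2 (1 + exp(-2 |alpha|^2)).
# Amplitude and cutoff are arbitrary.

alpha, cutoff = 1.5, 15
norm = np.sqrt(2 + 2 * np.exp(-2 * abs(alpha) ** 2))

amps = np.array([np.exp(-abs(alpha) ** 2 / 2) * (alpha ** n + (-alpha) ** n)
                 / np.sqrt(factorial(n)) for n in range(cutoff + 1)]) / norm
probs = np.abs(amps) ** 2

for n, pr in enumerate(probs):
    if pr > 1e-3:
        print(f"P(n = {n:2d}) = {pr:.3f}")
print("total captured probability:", round(float(probs.sum()), 4))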

New paper: Spontaneous parametric down-conversion photon sources are scalable in the asymptotic limit for boson-sampling

Full paper here.

Boson-sampling has emerged as a promising avenue towards post-classical optical quantum computation, and numerous elementary demonstrations have recently been performed. Spontaneous parametric down-conversion is the mainstay for single-photon state preparation, the technique employed in most optical quantum information processing implementations to date. Here we present a simple architecture for boson-sampling based on multiplexed parametric down-conversion and demonstrate that the architecture is limited only by the post-selected detection efficiency. That is, given that detection efficiencies are sufficiently high to enable post-selection, photon-number errors in the down-converters are sufficiently low as to guarantee correct boson-sampling most of the time. Thus, we show that parametric down-conversion sources will not present a bottleneck for future boson-sampling implementations. Rather, photodetection efficiency is the limiting factor and thus future implementations may continue to employ down-conversion sources.
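
As a back-of-the-envelope illustration of the trade-off involved (the numbers and the simple multiplexing formula below are assumptions for illustration, not the paper's architecture or bounds): an SPDC source emits photon pairs with a thermal distribution, so weak pumping suppresses multi-pair errors at the cost of a low heralding rate, and multiplexing several such sources recovers a usable rate.

import numpy as np

# Illustration only: a two-mode squeezed vacuum (SPDC) source emits n photon
# pairs with probability P(n) = (1 - lam) * lam**n, where lam = tanh(r)**2.
# The squeezing value and source counts below are arbitrary.

r = 0.15                                   # squeezing parameter
lam = np.tanh(r) ** 2

p_single = (1 - lam) * lam                 # exactly one pair emitted
p_multi = lam ** 2                         # two or more pairs
error_fraction = p_multi / lam             # multi-pair fraction among heralded events

print(f"P(exactly 1 pair)    = {p_single:.4f}")
print(f"multi-pair fraction  = {error_fraction:.4f}")

for n_sources in [1, 10, 50, 100]:
    p_herald = 1 - (1 - p_single) ** n_sources   # at least one source fires once
    print(f"{n_sources:3d} multiplexed sources: P(usable heralded photon) = {p_herald:.3f}")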