Linear optics is a promising candidate for implementing quantum information processing protocols, with single photons employed to represent qubits. In practice, single photons produced by different sources will not be perfectly matched in time and frequency, so understanding the effects of temporal and frequency mismatch is important for characterising the dynamics of such systems. In this paper we discuss the effects of temporal and frequency mismatch, how they differ, and how they affect a simple linear optics quantum gate. We show that the two kinds of mismatch have inherently different effects on the operation of the gate. We also consider the spectral characteristics of the photo-detectors, focusing on time-resolved detection, which we show has a strong impact on the operation of such protocols.
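As a rough illustration of why mode mismatch matters (this is a standard textbook toy model, not the specific model analysed in the paper), consider two single photons with Gaussian spectral wavepackets of equal width σ, offset in arrival time by τ and in centre frequency by Δ. Their mode overlap is |⟨ψ₁|ψ₂⟩|² = exp(−Δ²/2σ² − σ²τ²/2), and the Hong-Ou-Mandel coincidence probability behind a 50:50 beam splitter is (1 − |⟨ψ₁|ψ₂⟩|²)/2, interpolating from 0 for identical photons to 1/2 for fully distinguishable ones. Note that temporal and frequency offsets enter the overlap differently (τ scaled by σ, Δ by 1/σ), a simple instance of the distinct roles the two kinds of mismatch play. The symbols and function names below are purely illustrative:

```python
import math

def overlap_sq(tau, delta, sigma):
    """|<psi1|psi2>|^2 for two Gaussian wavepackets of equal spectral
    width sigma, arrival-time offset tau and centre-frequency detuning
    delta (all in mutually consistent units)."""
    return math.exp(-(delta ** 2) / (2 * sigma ** 2)
                    - (sigma ** 2 * tau ** 2) / 2)

def hom_coincidence(tau, delta, sigma):
    """Hong-Ou-Mandel coincidence probability behind a 50:50 beam
    splitter: 0 for identical photons, 1/2 for fully distinguishable."""
    return 0.5 * (1.0 - overlap_sq(tau, delta, sigma))

sigma = 1.0  # spectral width (arbitrary units)
print(hom_coincidence(0.0, 0.0, sigma))         # identical photons -> 0.0
print(hom_coincidence(10.0 / sigma, 0.0, sigma))  # large delay -> ~0.5
print(hom_coincidence(0.0, 10.0 * sigma, sigma))  # large detuning -> ~0.5
```

Sweeping τ at fixed Δ (or vice versa) with this sketch reproduces the familiar HOM dip and shows how either kind of mismatch alone is enough to degrade two-photon interference.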
Full paper here.